Jan 31 01:25:50 np0005603622 kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Jan 31 01:25:50 np0005603622 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 31 01:25:50 np0005603622 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 01:25:50 np0005603622 kernel: BIOS-provided physical RAM map:
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 31 01:25:50 np0005603622 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 31 01:25:50 np0005603622 kernel: NX (Execute Disable) protection: active
Jan 31 01:25:50 np0005603622 kernel: APIC: Static calls initialized
Jan 31 01:25:50 np0005603622 kernel: SMBIOS 2.8 present.
Jan 31 01:25:50 np0005603622 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 31 01:25:50 np0005603622 kernel: Hypervisor detected: KVM
Jan 31 01:25:50 np0005603622 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 31 01:25:50 np0005603622 kernel: kvm-clock: using sched offset of 8447228359 cycles
Jan 31 01:25:50 np0005603622 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 31 01:25:50 np0005603622 kernel: tsc: Detected 2800.000 MHz processor
Jan 31 01:25:50 np0005603622 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 31 01:25:50 np0005603622 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 31 01:25:50 np0005603622 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 31 01:25:50 np0005603622 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 31 01:25:50 np0005603622 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 31 01:25:50 np0005603622 kernel: Using GB pages for direct mapping
Jan 31 01:25:50 np0005603622 kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Jan 31 01:25:50 np0005603622 kernel: ACPI: Early table checksum verification disabled
Jan 31 01:25:50 np0005603622 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 31 01:25:50 np0005603622 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:25:50 np0005603622 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:25:50 np0005603622 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:25:50 np0005603622 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 31 01:25:50 np0005603622 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:25:50 np0005603622 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 31 01:25:50 np0005603622 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 31 01:25:50 np0005603622 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 31 01:25:50 np0005603622 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 31 01:25:50 np0005603622 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 31 01:25:50 np0005603622 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 31 01:25:50 np0005603622 kernel: No NUMA configuration found
Jan 31 01:25:50 np0005603622 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 31 01:25:50 np0005603622 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 31 01:25:50 np0005603622 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 31 01:25:50 np0005603622 kernel: Zone ranges:
Jan 31 01:25:50 np0005603622 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 31 01:25:50 np0005603622 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 31 01:25:50 np0005603622 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 31 01:25:50 np0005603622 kernel:  Device   empty
Jan 31 01:25:50 np0005603622 kernel: Movable zone start for each node
Jan 31 01:25:50 np0005603622 kernel: Early memory node ranges
Jan 31 01:25:50 np0005603622 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 31 01:25:50 np0005603622 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 31 01:25:50 np0005603622 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 31 01:25:50 np0005603622 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 31 01:25:50 np0005603622 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 31 01:25:50 np0005603622 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 31 01:25:50 np0005603622 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 31 01:25:50 np0005603622 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 31 01:25:50 np0005603622 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 31 01:25:50 np0005603622 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 31 01:25:50 np0005603622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 31 01:25:50 np0005603622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 31 01:25:50 np0005603622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 31 01:25:50 np0005603622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 31 01:25:50 np0005603622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 31 01:25:50 np0005603622 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 31 01:25:50 np0005603622 kernel: TSC deadline timer available
Jan 31 01:25:50 np0005603622 kernel: CPU topo: Max. logical packages:   8
Jan 31 01:25:50 np0005603622 kernel: CPU topo: Max. logical dies:       8
Jan 31 01:25:50 np0005603622 kernel: CPU topo: Max. dies per package:   1
Jan 31 01:25:50 np0005603622 kernel: CPU topo: Max. threads per core:   1
Jan 31 01:25:50 np0005603622 kernel: CPU topo: Num. cores per package:     1
Jan 31 01:25:50 np0005603622 kernel: CPU topo: Num. threads per package:   1
Jan 31 01:25:50 np0005603622 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 31 01:25:50 np0005603622 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 31 01:25:50 np0005603622 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 31 01:25:50 np0005603622 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 31 01:25:50 np0005603622 kernel: Booting paravirtualized kernel on KVM
Jan 31 01:25:50 np0005603622 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 31 01:25:50 np0005603622 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 31 01:25:50 np0005603622 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 31 01:25:50 np0005603622 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 31 01:25:50 np0005603622 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 01:25:50 np0005603622 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Jan 31 01:25:50 np0005603622 kernel: random: crng init done
Jan 31 01:25:50 np0005603622 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: Fallback order for Node 0: 0 
Jan 31 01:25:50 np0005603622 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 31 01:25:50 np0005603622 kernel: Policy zone: Normal
Jan 31 01:25:50 np0005603622 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 31 01:25:50 np0005603622 kernel: software IO TLB: area num 8.
Jan 31 01:25:50 np0005603622 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 31 01:25:50 np0005603622 kernel: ftrace: allocating 49438 entries in 194 pages
Jan 31 01:25:50 np0005603622 kernel: ftrace: allocated 194 pages with 3 groups
Jan 31 01:25:50 np0005603622 kernel: Dynamic Preempt: voluntary
Jan 31 01:25:50 np0005603622 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 31 01:25:50 np0005603622 kernel: rcu: 	RCU event tracing is enabled.
Jan 31 01:25:50 np0005603622 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 31 01:25:50 np0005603622 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 31 01:25:50 np0005603622 kernel: 	Rude variant of Tasks RCU enabled.
Jan 31 01:25:50 np0005603622 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 31 01:25:50 np0005603622 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 31 01:25:50 np0005603622 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 31 01:25:50 np0005603622 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 01:25:50 np0005603622 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 01:25:50 np0005603622 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 31 01:25:50 np0005603622 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 31 01:25:50 np0005603622 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 31 01:25:50 np0005603622 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 31 01:25:50 np0005603622 kernel: Console: colour VGA+ 80x25
Jan 31 01:25:50 np0005603622 kernel: printk: console [ttyS0] enabled
Jan 31 01:25:50 np0005603622 kernel: ACPI: Core revision 20230331
Jan 31 01:25:50 np0005603622 kernel: APIC: Switch to symmetric I/O mode setup
Jan 31 01:25:50 np0005603622 kernel: x2apic enabled
Jan 31 01:25:50 np0005603622 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 31 01:25:50 np0005603622 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 31 01:25:50 np0005603622 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 31 01:25:50 np0005603622 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 31 01:25:50 np0005603622 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 31 01:25:50 np0005603622 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 31 01:25:50 np0005603622 kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Jan 31 01:25:50 np0005603622 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 31 01:25:50 np0005603622 kernel: Spectre V2 : Mitigation: Retpolines
Jan 31 01:25:50 np0005603622 kernel: RETBleed: Mitigation: untrained return thunk
Jan 31 01:25:50 np0005603622 kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Jan 31 01:25:50 np0005603622 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 31 01:25:50 np0005603622 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 31 01:25:50 np0005603622 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 31 01:25:50 np0005603622 kernel: active return thunk: retbleed_return_thunk
Jan 31 01:25:50 np0005603622 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 31 01:25:50 np0005603622 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 31 01:25:50 np0005603622 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 31 01:25:50 np0005603622 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 31 01:25:50 np0005603622 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 31 01:25:50 np0005603622 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 31 01:25:50 np0005603622 kernel: Freeing SMP alternatives memory: 40K
Jan 31 01:25:50 np0005603622 kernel: pid_max: default: 32768 minimum: 301
Jan 31 01:25:50 np0005603622 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 31 01:25:50 np0005603622 kernel: landlock: Up and running.
Jan 31 01:25:50 np0005603622 kernel: Yama: becoming mindful.
Jan 31 01:25:50 np0005603622 kernel: SELinux:  Initializing.
Jan 31 01:25:50 np0005603622 kernel: LSM support for eBPF active
Jan 31 01:25:50 np0005603622 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 31 01:25:50 np0005603622 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 31 01:25:50 np0005603622 kernel: ... version:                0
Jan 31 01:25:50 np0005603622 kernel: ... bit width:              48
Jan 31 01:25:50 np0005603622 kernel: ... generic registers:      6
Jan 31 01:25:50 np0005603622 kernel: ... value mask:             0000ffffffffffff
Jan 31 01:25:50 np0005603622 kernel: ... max period:             00007fffffffffff
Jan 31 01:25:50 np0005603622 kernel: ... fixed-purpose events:   0
Jan 31 01:25:50 np0005603622 kernel: ... event mask:             000000000000003f
Jan 31 01:25:50 np0005603622 kernel: signal: max sigframe size: 1776
Jan 31 01:25:50 np0005603622 kernel: rcu: Hierarchical SRCU implementation.
Jan 31 01:25:50 np0005603622 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 31 01:25:50 np0005603622 kernel: smp: Bringing up secondary CPUs ...
Jan 31 01:25:50 np0005603622 kernel: smpboot: x86: Booting SMP configuration:
Jan 31 01:25:50 np0005603622 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 31 01:25:50 np0005603622 kernel: smp: Brought up 1 node, 8 CPUs
Jan 31 01:25:50 np0005603622 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 31 01:25:50 np0005603622 kernel: node 0 deferred pages initialised in 17ms
Jan 31 01:25:50 np0005603622 kernel: Memory: 7763776K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618400K reserved, 0K cma-reserved)
Jan 31 01:25:50 np0005603622 kernel: devtmpfs: initialized
Jan 31 01:25:50 np0005603622 kernel: x86/mm: Memory block size: 128MB
Jan 31 01:25:50 np0005603622 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 31 01:25:50 np0005603622 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 31 01:25:50 np0005603622 kernel: pinctrl core: initialized pinctrl subsystem
Jan 31 01:25:50 np0005603622 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 31 01:25:50 np0005603622 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 31 01:25:50 np0005603622 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 31 01:25:50 np0005603622 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 31 01:25:50 np0005603622 kernel: audit: initializing netlink subsys (disabled)
Jan 31 01:25:50 np0005603622 kernel: audit: type=2000 audit(1769840749.192:1): state=initialized audit_enabled=0 res=1
Jan 31 01:25:50 np0005603622 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 31 01:25:50 np0005603622 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 31 01:25:50 np0005603622 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 31 01:25:50 np0005603622 kernel: cpuidle: using governor menu
Jan 31 01:25:50 np0005603622 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 31 01:25:50 np0005603622 kernel: PCI: Using configuration type 1 for base access
Jan 31 01:25:50 np0005603622 kernel: PCI: Using configuration type 1 for extended access
Jan 31 01:25:50 np0005603622 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 31 01:25:50 np0005603622 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 31 01:25:50 np0005603622 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 31 01:25:50 np0005603622 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 31 01:25:50 np0005603622 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 31 01:25:50 np0005603622 kernel: Demotion targets for Node 0: null
Jan 31 01:25:50 np0005603622 kernel: cryptd: max_cpu_qlen set to 1000
Jan 31 01:25:50 np0005603622 kernel: ACPI: Added _OSI(Module Device)
Jan 31 01:25:50 np0005603622 kernel: ACPI: Added _OSI(Processor Device)
Jan 31 01:25:50 np0005603622 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 31 01:25:50 np0005603622 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 31 01:25:50 np0005603622 kernel: ACPI: Interpreter enabled
Jan 31 01:25:50 np0005603622 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 31 01:25:50 np0005603622 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 31 01:25:50 np0005603622 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 31 01:25:50 np0005603622 kernel: PCI: Using E820 reservations for host bridge windows
Jan 31 01:25:50 np0005603622 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 31 01:25:50 np0005603622 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 31 01:25:50 np0005603622 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [3] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [4] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [5] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [6] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [7] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [8] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [9] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [10] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [11] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [12] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [13] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [14] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [15] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [16] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [17] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [18] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [19] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [20] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [21] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [22] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [23] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [24] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [25] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [26] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [27] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [28] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [29] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [30] registered
Jan 31 01:25:50 np0005603622 kernel: acpiphp: Slot [31] registered
Jan 31 01:25:50 np0005603622 kernel: PCI host bridge to bus 0000:00
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 31 01:25:50 np0005603622 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 31 01:25:50 np0005603622 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 31 01:25:50 np0005603622 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 31 01:25:50 np0005603622 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 31 01:25:50 np0005603622 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 31 01:25:50 np0005603622 kernel: iommu: Default domain type: Translated
Jan 31 01:25:50 np0005603622 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 31 01:25:50 np0005603622 kernel: SCSI subsystem initialized
Jan 31 01:25:50 np0005603622 kernel: ACPI: bus type USB registered
Jan 31 01:25:50 np0005603622 kernel: usbcore: registered new interface driver usbfs
Jan 31 01:25:50 np0005603622 kernel: usbcore: registered new interface driver hub
Jan 31 01:25:50 np0005603622 kernel: usbcore: registered new device driver usb
Jan 31 01:25:50 np0005603622 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 31 01:25:50 np0005603622 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 31 01:25:50 np0005603622 kernel: PTP clock support registered
Jan 31 01:25:50 np0005603622 kernel: EDAC MC: Ver: 3.0.0
Jan 31 01:25:50 np0005603622 kernel: NetLabel: Initializing
Jan 31 01:25:50 np0005603622 kernel: NetLabel:  domain hash size = 128
Jan 31 01:25:50 np0005603622 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 31 01:25:50 np0005603622 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 31 01:25:50 np0005603622 kernel: PCI: Using ACPI for IRQ routing
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 31 01:25:50 np0005603622 kernel: vgaarb: loaded
Jan 31 01:25:50 np0005603622 kernel: clocksource: Switched to clocksource kvm-clock
Jan 31 01:25:50 np0005603622 kernel: VFS: Disk quotas dquot_6.6.0
Jan 31 01:25:50 np0005603622 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 31 01:25:50 np0005603622 kernel: pnp: PnP ACPI init
Jan 31 01:25:50 np0005603622 kernel: pnp: PnP ACPI: found 5 devices
Jan 31 01:25:50 np0005603622 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 31 01:25:50 np0005603622 kernel: NET: Registered PF_INET protocol family
Jan 31 01:25:50 np0005603622 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 31 01:25:50 np0005603622 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 31 01:25:50 np0005603622 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 31 01:25:50 np0005603622 kernel: NET: Registered PF_XDP protocol family
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 31 01:25:50 np0005603622 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 31 01:25:50 np0005603622 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 31 01:25:50 np0005603622 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 23625 usecs
Jan 31 01:25:50 np0005603622 kernel: PCI: CLS 0 bytes, default 64
Jan 31 01:25:50 np0005603622 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 31 01:25:50 np0005603622 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 31 01:25:50 np0005603622 kernel: Trying to unpack rootfs image as initramfs...
Jan 31 01:25:50 np0005603622 kernel: ACPI: bus type thunderbolt registered
Jan 31 01:25:50 np0005603622 kernel: Initialise system trusted keyrings
Jan 31 01:25:50 np0005603622 kernel: Key type blacklist registered
Jan 31 01:25:50 np0005603622 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 31 01:25:50 np0005603622 kernel: zbud: loaded
Jan 31 01:25:50 np0005603622 kernel: integrity: Platform Keyring initialized
Jan 31 01:25:50 np0005603622 kernel: integrity: Machine keyring initialized
Jan 31 01:25:50 np0005603622 kernel: Freeing initrd memory: 88000K
Jan 31 01:25:50 np0005603622 kernel: NET: Registered PF_ALG protocol family
Jan 31 01:25:50 np0005603622 kernel: xor: automatically using best checksumming function   avx       
Jan 31 01:25:50 np0005603622 kernel: Key type asymmetric registered
Jan 31 01:25:50 np0005603622 kernel: Asymmetric key parser 'x509' registered
Jan 31 01:25:50 np0005603622 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 31 01:25:50 np0005603622 kernel: io scheduler mq-deadline registered
Jan 31 01:25:50 np0005603622 kernel: io scheduler kyber registered
Jan 31 01:25:50 np0005603622 kernel: io scheduler bfq registered
Jan 31 01:25:50 np0005603622 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 31 01:25:50 np0005603622 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 31 01:25:50 np0005603622 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 31 01:25:50 np0005603622 kernel: ACPI: button: Power Button [PWRF]
Jan 31 01:25:50 np0005603622 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 31 01:25:50 np0005603622 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 31 01:25:50 np0005603622 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 31 01:25:50 np0005603622 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 31 01:25:50 np0005603622 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 31 01:25:50 np0005603622 kernel: Non-volatile memory driver v1.3
Jan 31 01:25:50 np0005603622 kernel: rdac: device handler registered
Jan 31 01:25:50 np0005603622 kernel: hp_sw: device handler registered
Jan 31 01:25:50 np0005603622 kernel: emc: device handler registered
Jan 31 01:25:50 np0005603622 kernel: alua: device handler registered
Jan 31 01:25:50 np0005603622 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 31 01:25:50 np0005603622 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 31 01:25:50 np0005603622 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 31 01:25:50 np0005603622 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 31 01:25:50 np0005603622 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 31 01:25:50 np0005603622 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 31 01:25:50 np0005603622 kernel: usb usb1: Product: UHCI Host Controller
Jan 31 01:25:50 np0005603622 kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Jan 31 01:25:50 np0005603622 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 31 01:25:50 np0005603622 kernel: hub 1-0:1.0: USB hub found
Jan 31 01:25:50 np0005603622 kernel: hub 1-0:1.0: 2 ports detected
Jan 31 01:25:50 np0005603622 kernel: usbcore: registered new interface driver usbserial_generic
Jan 31 01:25:50 np0005603622 kernel: usbserial: USB Serial support registered for generic
Jan 31 01:25:50 np0005603622 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 31 01:25:50 np0005603622 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 31 01:25:50 np0005603622 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 31 01:25:50 np0005603622 kernel: mousedev: PS/2 mouse device common for all mice
Jan 31 01:25:50 np0005603622 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 31 01:25:50 np0005603622 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 31 01:25:50 np0005603622 kernel: rtc_cmos 00:04: registered as rtc0
Jan 31 01:25:50 np0005603622 kernel: rtc_cmos 00:04: setting system clock to 2026-01-31T06:25:49 UTC (1769840749)
Jan 31 01:25:50 np0005603622 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 31 01:25:50 np0005603622 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 31 01:25:50 np0005603622 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 31 01:25:50 np0005603622 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 31 01:25:50 np0005603622 kernel: usbcore: registered new interface driver usbhid
Jan 31 01:25:50 np0005603622 kernel: usbhid: USB HID core driver
Jan 31 01:25:50 np0005603622 kernel: drop_monitor: Initializing network drop monitor service
Jan 31 01:25:50 np0005603622 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 31 01:25:50 np0005603622 kernel: Initializing XFRM netlink socket
Jan 31 01:25:50 np0005603622 kernel: NET: Registered PF_INET6 protocol family
Jan 31 01:25:50 np0005603622 kernel: Segment Routing with IPv6
Jan 31 01:25:50 np0005603622 kernel: NET: Registered PF_PACKET protocol family
Jan 31 01:25:50 np0005603622 kernel: mpls_gso: MPLS GSO support
Jan 31 01:25:50 np0005603622 kernel: IPI shorthand broadcast: enabled
Jan 31 01:25:50 np0005603622 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 31 01:25:50 np0005603622 kernel: AES CTR mode by8 optimization enabled
Jan 31 01:25:50 np0005603622 kernel: sched_clock: Marking stable (928006250, 145891640)->(1142409890, -68512000)
Jan 31 01:25:50 np0005603622 kernel: registered taskstats version 1
Jan 31 01:25:50 np0005603622 kernel: Loading compiled-in X.509 certificates
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 31 01:25:50 np0005603622 kernel: Demotion targets for Node 0: null
Jan 31 01:25:50 np0005603622 kernel: page_owner is disabled
Jan 31 01:25:50 np0005603622 kernel: Key type .fscrypt registered
Jan 31 01:25:50 np0005603622 kernel: Key type fscrypt-provisioning registered
Jan 31 01:25:50 np0005603622 kernel: Key type big_key registered
Jan 31 01:25:50 np0005603622 kernel: Key type encrypted registered
Jan 31 01:25:50 np0005603622 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 31 01:25:50 np0005603622 kernel: Loading compiled-in module X.509 certificates
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 31 01:25:50 np0005603622 kernel: ima: Allocated hash algorithm: sha256
Jan 31 01:25:50 np0005603622 kernel: ima: No architecture policies found
Jan 31 01:25:50 np0005603622 kernel: evm: Initialising EVM extended attributes:
Jan 31 01:25:50 np0005603622 kernel: evm: security.selinux
Jan 31 01:25:50 np0005603622 kernel: evm: security.SMACK64 (disabled)
Jan 31 01:25:50 np0005603622 kernel: evm: security.SMACK64EXEC (disabled)
Jan 31 01:25:50 np0005603622 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 31 01:25:50 np0005603622 kernel: evm: security.SMACK64MMAP (disabled)
Jan 31 01:25:50 np0005603622 kernel: evm: security.apparmor (disabled)
Jan 31 01:25:50 np0005603622 kernel: evm: security.ima
Jan 31 01:25:50 np0005603622 kernel: evm: security.capability
Jan 31 01:25:50 np0005603622 kernel: evm: HMAC attrs: 0x1
Jan 31 01:25:50 np0005603622 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 31 01:25:50 np0005603622 kernel: Running certificate verification RSA selftest
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 31 01:25:50 np0005603622 kernel: Running certificate verification ECDSA selftest
Jan 31 01:25:50 np0005603622 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 31 01:25:50 np0005603622 kernel: clk: Disabling unused clocks
Jan 31 01:25:50 np0005603622 kernel: Freeing unused decrypted memory: 2028K
Jan 31 01:25:50 np0005603622 kernel: Freeing unused kernel image (initmem) memory: 4196K
Jan 31 01:25:50 np0005603622 kernel: Write protecting the kernel read-only data: 30720k
Jan 31 01:25:50 np0005603622 kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Jan 31 01:25:50 np0005603622 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 31 01:25:50 np0005603622 kernel: Run /init as init process
Jan 31 01:25:50 np0005603622 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 31 01:25:50 np0005603622 systemd: Detected virtualization kvm.
Jan 31 01:25:50 np0005603622 systemd: Detected architecture x86-64.
Jan 31 01:25:50 np0005603622 systemd: Running in initrd.
Jan 31 01:25:50 np0005603622 systemd: No hostname configured, using default hostname.
Jan 31 01:25:50 np0005603622 systemd: Hostname set to <localhost>.
Jan 31 01:25:50 np0005603622 systemd: Initializing machine ID from VM UUID.
Jan 31 01:25:50 np0005603622 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 31 01:25:50 np0005603622 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 31 01:25:50 np0005603622 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 31 01:25:50 np0005603622 kernel: usb 1-1: Manufacturer: QEMU
Jan 31 01:25:50 np0005603622 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 31 01:25:50 np0005603622 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 31 01:25:50 np0005603622 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 31 01:25:50 np0005603622 systemd: Queued start job for default target Initrd Default Target.
Jan 31 01:25:50 np0005603622 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 31 01:25:50 np0005603622 systemd: Reached target Local Encrypted Volumes.
Jan 31 01:25:50 np0005603622 systemd: Reached target Initrd /usr File System.
Jan 31 01:25:50 np0005603622 systemd: Reached target Local File Systems.
Jan 31 01:25:50 np0005603622 systemd: Reached target Path Units.
Jan 31 01:25:50 np0005603622 systemd: Reached target Slice Units.
Jan 31 01:25:50 np0005603622 systemd: Reached target Swaps.
Jan 31 01:25:50 np0005603622 systemd: Reached target Timer Units.
Jan 31 01:25:50 np0005603622 systemd: Listening on D-Bus System Message Bus Socket.
Jan 31 01:25:50 np0005603622 systemd: Listening on Journal Socket (/dev/log).
Jan 31 01:25:50 np0005603622 systemd: Listening on Journal Socket.
Jan 31 01:25:50 np0005603622 systemd: Listening on udev Control Socket.
Jan 31 01:25:50 np0005603622 systemd: Listening on udev Kernel Socket.
Jan 31 01:25:50 np0005603622 systemd: Reached target Socket Units.
Jan 31 01:25:50 np0005603622 systemd: Starting Create List of Static Device Nodes...
Jan 31 01:25:50 np0005603622 systemd: Starting Journal Service...
Jan 31 01:25:50 np0005603622 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 31 01:25:50 np0005603622 systemd: Starting Apply Kernel Variables...
Jan 31 01:25:50 np0005603622 systemd: Starting Create System Users...
Jan 31 01:25:50 np0005603622 systemd: Starting Setup Virtual Console...
Jan 31 01:25:50 np0005603622 systemd: Finished Create List of Static Device Nodes.
Jan 31 01:25:50 np0005603622 systemd: Finished Apply Kernel Variables.
Jan 31 01:25:50 np0005603622 systemd: Finished Create System Users.
Jan 31 01:25:50 np0005603622 systemd-journald[304]: Journal started
Jan 31 01:25:50 np0005603622 systemd-journald[304]: Runtime Journal (/run/log/journal/ff96e0bf56de4f3eba9602bb86eece1f) is 8.0M, max 153.6M, 145.6M free.
Jan 31 01:25:50 np0005603622 systemd-sysusers[308]: Creating group 'users' with GID 100.
Jan 31 01:25:50 np0005603622 systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Jan 31 01:25:50 np0005603622 systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 31 01:25:50 np0005603622 systemd: Started Journal Service.
Jan 31 01:25:50 np0005603622 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 31 01:25:50 np0005603622 systemd[1]: Starting Create Volatile Files and Directories...
Jan 31 01:25:50 np0005603622 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 31 01:25:50 np0005603622 systemd[1]: Finished Create Volatile Files and Directories.
Jan 31 01:25:50 np0005603622 systemd[1]: Finished Setup Virtual Console.
Jan 31 01:25:50 np0005603622 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 31 01:25:50 np0005603622 systemd[1]: Starting dracut cmdline hook...
Jan 31 01:25:50 np0005603622 dracut-cmdline[322]: dracut-9 dracut-057-102.git20250818.el9
Jan 31 01:25:50 np0005603622 dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 31 01:25:50 np0005603622 systemd[1]: Finished dracut cmdline hook.
Jan 31 01:25:50 np0005603622 systemd[1]: Starting dracut pre-udev hook...
Jan 31 01:25:50 np0005603622 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 31 01:25:50 np0005603622 kernel: device-mapper: uevent: version 1.0.3
Jan 31 01:25:50 np0005603622 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 31 01:25:50 np0005603622 kernel: RPC: Registered named UNIX socket transport module.
Jan 31 01:25:50 np0005603622 kernel: RPC: Registered udp transport module.
Jan 31 01:25:50 np0005603622 kernel: RPC: Registered tcp transport module.
Jan 31 01:25:50 np0005603622 kernel: RPC: Registered tcp-with-tls transport module.
Jan 31 01:25:50 np0005603622 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 31 01:25:50 np0005603622 rpc.statd[439]: Version 2.5.4 starting
Jan 31 01:25:50 np0005603622 rpc.statd[439]: Initializing NSM state
Jan 31 01:25:50 np0005603622 rpc.idmapd[444]: Setting log level to 0
Jan 31 01:25:50 np0005603622 systemd[1]: Finished dracut pre-udev hook.
Jan 31 01:25:50 np0005603622 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 31 01:25:50 np0005603622 systemd-udevd[457]: Using default interface naming scheme 'rhel-9.0'.
Jan 31 01:25:50 np0005603622 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 31 01:25:50 np0005603622 systemd[1]: Starting dracut pre-trigger hook...
Jan 31 01:25:50 np0005603622 systemd[1]: Finished dracut pre-trigger hook.
Jan 31 01:25:50 np0005603622 systemd[1]: Starting Coldplug All udev Devices...
Jan 31 01:25:50 np0005603622 systemd[1]: Created slice Slice /system/modprobe.
Jan 31 01:25:50 np0005603622 systemd[1]: Starting Load Kernel Module configfs...
Jan 31 01:25:50 np0005603622 systemd[1]: Finished Coldplug All udev Devices.
Jan 31 01:25:50 np0005603622 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 31 01:25:50 np0005603622 systemd[1]: Reached target Network.
Jan 31 01:25:50 np0005603622 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 31 01:25:50 np0005603622 systemd[1]: Starting dracut initqueue hook...
Jan 31 01:25:50 np0005603622 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 31 01:25:50 np0005603622 systemd[1]: Finished Load Kernel Module configfs.
Jan 31 01:25:50 np0005603622 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 31 01:25:50 np0005603622 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 31 01:25:50 np0005603622 kernel: vda: vda1
Jan 31 01:25:50 np0005603622 systemd-udevd[494]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:25:50 np0005603622 kernel: scsi host0: ata_piix
Jan 31 01:25:50 np0005603622 kernel: scsi host1: ata_piix
Jan 31 01:25:50 np0005603622 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 31 01:25:50 np0005603622 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 31 01:25:51 np0005603622 systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 31 01:25:51 np0005603622 systemd[1]: Reached target Initrd Root Device.
Jan 31 01:25:51 np0005603622 systemd[1]: Mounting Kernel Configuration File System...
Jan 31 01:25:51 np0005603622 kernel: ata1: found unknown device (class 0)
Jan 31 01:25:51 np0005603622 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 31 01:25:51 np0005603622 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 31 01:25:51 np0005603622 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 31 01:25:51 np0005603622 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 31 01:25:51 np0005603622 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 31 01:25:51 np0005603622 systemd[1]: Mounted Kernel Configuration File System.
Jan 31 01:25:51 np0005603622 systemd[1]: Reached target System Initialization.
Jan 31 01:25:51 np0005603622 systemd[1]: Reached target Basic System.
Jan 31 01:25:51 np0005603622 systemd[1]: Finished dracut initqueue hook.
Jan 31 01:25:51 np0005603622 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 31 01:25:51 np0005603622 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 31 01:25:51 np0005603622 systemd[1]: Reached target Remote File Systems.
Jan 31 01:25:51 np0005603622 systemd[1]: Starting dracut pre-mount hook...
Jan 31 01:25:51 np0005603622 systemd[1]: Finished dracut pre-mount hook.
Jan 31 01:25:51 np0005603622 systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Jan 31 01:25:51 np0005603622 systemd-fsck[552]: /usr/sbin/fsck.xfs: XFS file system.
Jan 31 01:25:51 np0005603622 systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 31 01:25:51 np0005603622 systemd[1]: Mounting /sysroot...
Jan 31 01:25:51 np0005603622 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 31 01:25:51 np0005603622 kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Jan 31 01:25:52 np0005603622 kernel: XFS (vda1): Ending clean mount
Jan 31 01:25:52 np0005603622 systemd[1]: Mounted /sysroot.
Jan 31 01:25:52 np0005603622 systemd[1]: Reached target Initrd Root File System.
Jan 31 01:25:52 np0005603622 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 31 01:25:52 np0005603622 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 31 01:25:52 np0005603622 systemd[1]: Reached target Initrd File Systems.
Jan 31 01:25:52 np0005603622 systemd[1]: Reached target Initrd Default Target.
Jan 31 01:25:52 np0005603622 systemd[1]: Starting dracut mount hook...
Jan 31 01:25:52 np0005603622 systemd[1]: Finished dracut mount hook.
Jan 31 01:25:52 np0005603622 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 31 01:25:52 np0005603622 rpc.idmapd[444]: exiting on signal 15
Jan 31 01:25:52 np0005603622 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 31 01:25:52 np0005603622 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Network.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Timer Units.
Jan 31 01:25:52 np0005603622 systemd[1]: dbus.socket: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 31 01:25:52 np0005603622 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Initrd Default Target.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Basic System.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Initrd Root Device.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Initrd /usr File System.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Path Units.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Remote File Systems.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Slice Units.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Socket Units.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target System Initialization.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Local File Systems.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Swaps.
Jan 31 01:25:52 np0005603622 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped dracut mount hook.
Jan 31 01:25:52 np0005603622 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped dracut pre-mount hook.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 31 01:25:52 np0005603622 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped dracut initqueue hook.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Apply Kernel Variables.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Coldplug All udev Devices.
Jan 31 01:25:52 np0005603622 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped dracut pre-trigger hook.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Setup Virtual Console.
Jan 31 01:25:52 np0005603622 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 31 01:25:52 np0005603622 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Closed udev Control Socket.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Closed udev Kernel Socket.
Jan 31 01:25:52 np0005603622 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped dracut pre-udev hook.
Jan 31 01:25:52 np0005603622 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped dracut cmdline hook.
Jan 31 01:25:52 np0005603622 systemd[1]: Starting Cleanup udev Database...
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 31 01:25:52 np0005603622 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 31 01:25:52 np0005603622 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Stopped Create System Users.
Jan 31 01:25:52 np0005603622 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 31 01:25:52 np0005603622 systemd[1]: Finished Cleanup udev Database.
Jan 31 01:25:52 np0005603622 systemd[1]: Reached target Switch Root.
Jan 31 01:25:52 np0005603622 systemd[1]: Starting Switch Root...
Jan 31 01:25:52 np0005603622 systemd[1]: Switching root.
Jan 31 01:25:52 np0005603622 systemd-journald[304]: Journal stopped
Jan 31 01:25:55 np0005603622 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 31 01:25:55 np0005603622 kernel: audit: type=1404 audit(1769840753.459:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 31 01:25:55 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:25:55 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:25:55 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:25:55 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:25:55 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:25:55 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:25:55 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:25:55 np0005603622 kernel: audit: type=1403 audit(1769840753.724:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 31 01:25:55 np0005603622 systemd: Successfully loaded SELinux policy in 280.399ms.
Jan 31 01:25:55 np0005603622 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 58.475ms.
Jan 31 01:25:55 np0005603622 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 31 01:25:55 np0005603622 systemd: Detected virtualization kvm.
Jan 31 01:25:55 np0005603622 systemd: Detected architecture x86-64.
Jan 31 01:25:55 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:25:55 np0005603622 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 31 01:25:55 np0005603622 systemd: Stopped Switch Root.
Jan 31 01:25:55 np0005603622 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 31 01:25:55 np0005603622 systemd: Created slice Slice /system/getty.
Jan 31 01:25:55 np0005603622 systemd: Created slice Slice /system/serial-getty.
Jan 31 01:25:55 np0005603622 systemd: Created slice Slice /system/sshd-keygen.
Jan 31 01:25:55 np0005603622 systemd: Created slice User and Session Slice.
Jan 31 01:25:55 np0005603622 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 31 01:25:55 np0005603622 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 31 01:25:55 np0005603622 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 31 01:25:55 np0005603622 systemd: Reached target Local Encrypted Volumes.
Jan 31 01:25:55 np0005603622 systemd: Stopped target Switch Root.
Jan 31 01:25:55 np0005603622 systemd: Stopped target Initrd File Systems.
Jan 31 01:25:55 np0005603622 systemd: Stopped target Initrd Root File System.
Jan 31 01:25:55 np0005603622 systemd: Reached target Local Integrity Protected Volumes.
Jan 31 01:25:55 np0005603622 systemd: Reached target Path Units.
Jan 31 01:25:55 np0005603622 systemd: Reached target rpc_pipefs.target.
Jan 31 01:25:55 np0005603622 systemd: Reached target Slice Units.
Jan 31 01:25:55 np0005603622 systemd: Reached target Swaps.
Jan 31 01:25:55 np0005603622 systemd: Reached target Local Verity Protected Volumes.
Jan 31 01:25:55 np0005603622 systemd: Listening on RPCbind Server Activation Socket.
Jan 31 01:25:55 np0005603622 systemd: Reached target RPC Port Mapper.
Jan 31 01:25:55 np0005603622 systemd: Listening on Process Core Dump Socket.
Jan 31 01:25:55 np0005603622 systemd: Listening on initctl Compatibility Named Pipe.
Jan 31 01:25:55 np0005603622 systemd: Listening on udev Control Socket.
Jan 31 01:25:55 np0005603622 systemd: Listening on udev Kernel Socket.
Jan 31 01:25:55 np0005603622 systemd: Mounting Huge Pages File System...
Jan 31 01:25:55 np0005603622 systemd: Mounting POSIX Message Queue File System...
Jan 31 01:25:55 np0005603622 systemd: Mounting Kernel Debug File System...
Jan 31 01:25:55 np0005603622 systemd: Mounting Kernel Trace File System...
Jan 31 01:25:55 np0005603622 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 31 01:25:55 np0005603622 systemd: Starting Create List of Static Device Nodes...
Jan 31 01:25:55 np0005603622 systemd: Starting Load Kernel Module configfs...
Jan 31 01:25:55 np0005603622 systemd: Starting Load Kernel Module drm...
Jan 31 01:25:55 np0005603622 systemd: Starting Load Kernel Module efi_pstore...
Jan 31 01:25:55 np0005603622 systemd: Starting Load Kernel Module fuse...
Jan 31 01:25:55 np0005603622 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 31 01:25:55 np0005603622 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 31 01:25:55 np0005603622 systemd: Stopped File System Check on Root Device.
Jan 31 01:25:55 np0005603622 systemd: Stopped Journal Service.
Jan 31 01:25:55 np0005603622 systemd: Starting Journal Service...
Jan 31 01:25:55 np0005603622 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 31 01:25:55 np0005603622 systemd: Starting Generate network units from Kernel command line...
Jan 31 01:25:55 np0005603622 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 01:25:55 np0005603622 systemd: Starting Remount Root and Kernel File Systems...
Jan 31 01:25:55 np0005603622 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 31 01:25:55 np0005603622 systemd: Starting Apply Kernel Variables...
Jan 31 01:25:55 np0005603622 kernel: fuse: init (API version 7.37)
Jan 31 01:25:55 np0005603622 systemd: Starting Coldplug All udev Devices...
Jan 31 01:25:55 np0005603622 systemd: Mounted Huge Pages File System.
Jan 31 01:25:55 np0005603622 systemd: Mounted POSIX Message Queue File System.
Jan 31 01:25:55 np0005603622 systemd: Mounted Kernel Debug File System.
Jan 31 01:25:55 np0005603622 systemd: Mounted Kernel Trace File System.
Jan 31 01:25:55 np0005603622 systemd: Finished Create List of Static Device Nodes.
Jan 31 01:25:55 np0005603622 systemd: modprobe@configfs.service: Deactivated successfully.
Jan 31 01:25:55 np0005603622 systemd: Finished Load Kernel Module configfs.
Jan 31 01:25:55 np0005603622 systemd: modprobe@efi_pstore.service: Deactivated successfully.
Jan 31 01:25:55 np0005603622 systemd: Finished Load Kernel Module efi_pstore.
Jan 31 01:25:55 np0005603622 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 31 01:25:55 np0005603622 systemd: modprobe@fuse.service: Deactivated successfully.
Jan 31 01:25:55 np0005603622 systemd: Finished Load Kernel Module fuse.
Jan 31 01:25:55 np0005603622 systemd-journald[676]: Journal started
Jan 31 01:25:55 np0005603622 systemd-journald[676]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 31 01:25:55 np0005603622 kernel: ACPI: bus type drm_connector registered
Jan 31 01:25:55 np0005603622 systemd[1]: Queued start job for default target Multi-User System.
Jan 31 01:25:55 np0005603622 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 31 01:25:55 np0005603622 systemd: Started Journal Service.
Jan 31 01:25:55 np0005603622 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Load Kernel Module drm.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Generate network units from Kernel command line.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Apply Kernel Variables.
Jan 31 01:25:55 np0005603622 systemd[1]: Mounting FUSE Control File System...
Jan 31 01:25:55 np0005603622 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Rebuild Hardware Database...
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 31 01:25:55 np0005603622 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Load/Save OS Random Seed...
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Create System Users...
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Coldplug All udev Devices.
Jan 31 01:25:55 np0005603622 systemd[1]: Mounted FUSE Control File System.
Jan 31 01:25:55 np0005603622 systemd-journald[676]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 31 01:25:55 np0005603622 systemd-journald[676]: Received client request to flush runtime journal.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Create System Users.
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Load/Save OS Random Seed.
Jan 31 01:25:55 np0005603622 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 31 01:25:55 np0005603622 systemd[1]: Reached target Preparation for Local File Systems.
Jan 31 01:25:55 np0005603622 systemd[1]: Reached target Local File Systems.
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 31 01:25:55 np0005603622 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 31 01:25:55 np0005603622 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 31 01:25:55 np0005603622 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Automatic Boot Loader Update...
Jan 31 01:25:55 np0005603622 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Create Volatile Files and Directories...
Jan 31 01:25:55 np0005603622 bootctl[693]: Couldn't find EFI system partition, skipping.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Automatic Boot Loader Update.
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Create Volatile Files and Directories.
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Security Auditing Service...
Jan 31 01:25:55 np0005603622 systemd[1]: Starting RPC Bind...
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Rebuild Journal Catalog...
Jan 31 01:25:55 np0005603622 auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 31 01:25:55 np0005603622 auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Rebuild Journal Catalog.
Jan 31 01:25:55 np0005603622 systemd[1]: Started RPC Bind.
Jan 31 01:25:55 np0005603622 augenrules[704]: /sbin/augenrules: No change
Jan 31 01:25:55 np0005603622 augenrules[719]: No rules
Jan 31 01:25:55 np0005603622 augenrules[719]: enabled 1
Jan 31 01:25:55 np0005603622 augenrules[719]: failure 1
Jan 31 01:25:55 np0005603622 augenrules[719]: pid 699
Jan 31 01:25:55 np0005603622 augenrules[719]: rate_limit 0
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_limit 8192
Jan 31 01:25:55 np0005603622 augenrules[719]: lost 0
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog 3
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_wait_time 60000
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_wait_time_actual 0
Jan 31 01:25:55 np0005603622 augenrules[719]: enabled 1
Jan 31 01:25:55 np0005603622 augenrules[719]: failure 1
Jan 31 01:25:55 np0005603622 augenrules[719]: pid 699
Jan 31 01:25:55 np0005603622 augenrules[719]: rate_limit 0
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_limit 8192
Jan 31 01:25:55 np0005603622 augenrules[719]: lost 0
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog 4
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_wait_time 60000
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_wait_time_actual 0
Jan 31 01:25:55 np0005603622 augenrules[719]: enabled 1
Jan 31 01:25:55 np0005603622 augenrules[719]: failure 1
Jan 31 01:25:55 np0005603622 augenrules[719]: pid 699
Jan 31 01:25:55 np0005603622 augenrules[719]: rate_limit 0
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_limit 8192
Jan 31 01:25:55 np0005603622 augenrules[719]: lost 0
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog 4
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_wait_time 60000
Jan 31 01:25:55 np0005603622 augenrules[719]: backlog_wait_time_actual 0
Jan 31 01:25:55 np0005603622 systemd[1]: Started Security Auditing Service.
Jan 31 01:25:55 np0005603622 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 31 01:25:55 np0005603622 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 31 01:25:56 np0005603622 systemd[1]: Finished Rebuild Hardware Database.
Jan 31 01:25:56 np0005603622 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 31 01:25:56 np0005603622 systemd-udevd[727]: Using default interface naming scheme 'rhel-9.0'.
Jan 31 01:25:56 np0005603622 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 31 01:25:56 np0005603622 systemd[1]: Starting Load Kernel Module configfs...
Jan 31 01:25:56 np0005603622 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 31 01:25:56 np0005603622 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 31 01:25:56 np0005603622 systemd[1]: Finished Load Kernel Module configfs.
Jan 31 01:25:56 np0005603622 systemd-udevd[746]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:25:56 np0005603622 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 31 01:25:56 np0005603622 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 31 01:25:56 np0005603622 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 31 01:25:56 np0005603622 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 31 01:25:56 np0005603622 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 31 01:25:56 np0005603622 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 31 01:25:56 np0005603622 kernel: Console: switching to colour dummy device 80x25
Jan 31 01:25:56 np0005603622 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 31 01:25:56 np0005603622 kernel: [drm] features: -context_init
Jan 31 01:25:56 np0005603622 kernel: [drm] number of scanouts: 1
Jan 31 01:25:56 np0005603622 kernel: [drm] number of cap sets: 0
Jan 31 01:25:56 np0005603622 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 31 01:25:56 np0005603622 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 31 01:25:56 np0005603622 kernel: Console: switching to colour frame buffer device 128x48
Jan 31 01:25:56 np0005603622 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 31 01:25:56 np0005603622 kernel: kvm_amd: TSC scaling supported
Jan 31 01:25:56 np0005603622 kernel: kvm_amd: Nested Virtualization enabled
Jan 31 01:25:56 np0005603622 kernel: kvm_amd: Nested Paging enabled
Jan 31 01:25:56 np0005603622 kernel: kvm_amd: LBR virtualization supported
Jan 31 01:25:57 np0005603622 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 31 01:25:57 np0005603622 systemd[1]: Starting Update is Completed...
Jan 31 01:25:57 np0005603622 systemd[1]: Finished Update is Completed.
Jan 31 01:25:57 np0005603622 systemd[1]: Reached target System Initialization.
Jan 31 01:25:57 np0005603622 systemd[1]: Started dnf makecache --timer.
Jan 31 01:25:57 np0005603622 systemd[1]: Started Daily rotation of log files.
Jan 31 01:25:57 np0005603622 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 31 01:25:57 np0005603622 systemd[1]: Reached target Timer Units.
Jan 31 01:25:57 np0005603622 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 31 01:25:57 np0005603622 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 31 01:25:57 np0005603622 systemd[1]: Reached target Socket Units.
Jan 31 01:25:57 np0005603622 systemd[1]: Starting D-Bus System Message Bus...
Jan 31 01:25:57 np0005603622 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 01:25:57 np0005603622 systemd[1]: Started D-Bus System Message Bus.
Jan 31 01:25:57 np0005603622 systemd[1]: Reached target Basic System.
Jan 31 01:25:57 np0005603622 dbus-broker-lau[807]: Ready
Jan 31 01:25:57 np0005603622 systemd[1]: Starting NTP client/server...
Jan 31 01:25:57 np0005603622 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 31 01:25:57 np0005603622 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 31 01:25:57 np0005603622 systemd[1]: Starting IPv4 firewall with iptables...
Jan 31 01:25:57 np0005603622 systemd[1]: Started irqbalance daemon.
Jan 31 01:25:57 np0005603622 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 31 01:25:57 np0005603622 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:25:57 np0005603622 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:25:57 np0005603622 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 01:25:57 np0005603622 systemd[1]: Reached target sshd-keygen.target.
Jan 31 01:25:57 np0005603622 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 31 01:25:57 np0005603622 systemd[1]: Reached target User and Group Name Lookups.
Jan 31 01:25:57 np0005603622 systemd[1]: Starting User Login Management...
Jan 31 01:25:57 np0005603622 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 31 01:25:57 np0005603622 systemd-logind[817]: New seat seat0.
Jan 31 01:25:57 np0005603622 systemd-logind[817]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 31 01:25:57 np0005603622 systemd-logind[817]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 31 01:25:57 np0005603622 systemd[1]: Started User Login Management.
Jan 31 01:25:57 np0005603622 chronyd[826]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 31 01:25:57 np0005603622 chronyd[826]: Loaded 0 symmetric keys
Jan 31 01:25:57 np0005603622 chronyd[826]: Using right/UTC timezone to obtain leap second data
Jan 31 01:25:57 np0005603622 chronyd[826]: Loaded seccomp filter (level 2)
Jan 31 01:25:57 np0005603622 systemd[1]: Started NTP client/server.
Jan 31 01:25:57 np0005603622 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 31 01:25:57 np0005603622 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 31 01:25:57 np0005603622 iptables.init[812]: iptables: Applying firewall rules: [  OK  ]
Jan 31 01:25:57 np0005603622 systemd[1]: Finished IPv4 firewall with iptables.
Jan 31 01:25:59 np0005603622 cloud-init[835]: Cloud-init v. 24.4-8.el9 running 'init-local' at Sat, 31 Jan 2026 06:25:59 +0000. Up 10.78 seconds.
Jan 31 01:25:59 np0005603622 systemd[1]: run-cloud\x2dinit-tmp-tmp5cz774t5.mount: Deactivated successfully.
Jan 31 01:25:59 np0005603622 systemd[1]: Starting Hostname Service...
Jan 31 01:25:59 np0005603622 systemd[1]: Started Hostname Service.
Jan 31 01:25:59 np0005603622 systemd-hostnamed[851]: Hostname set to <np0005603622.novalocal> (static)
Jan 31 01:26:00 np0005603622 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 31 01:26:00 np0005603622 systemd[1]: Reached target Preparation for Network.
Jan 31 01:26:00 np0005603622 systemd[1]: Starting Network Manager...
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.6464] NetworkManager (version 1.54.3-2.el9) is starting... (boot:4be470f6-96ef-472f-95f9-87bd5086f491)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.6469] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.6870] manager[0x555c493dc000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.6918] hostname: hostname: using hostnamed
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.6919] hostname: static hostname changed from (none) to "np0005603622.novalocal"
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.6924] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7093] manager[0x555c493dc000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7095] manager[0x555c493dc000]: rfkill: WWAN hardware radio set enabled
Jan 31 01:26:00 np0005603622 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7548] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7549] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7550] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7551] manager: Networking is enabled by state file
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7554] settings: Loaded settings plugin: keyfile (internal)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7646] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7691] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7713] dhcp: init: Using DHCP client 'internal'
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7719] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 01:26:00 np0005603622 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7750] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7816] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7848] device (lo): Activation: starting connection 'lo' (77165707-765c-458b-b9f6-329fd98abdae)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7858] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7861] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:26:00 np0005603622 systemd[1]: Started Network Manager.
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7886] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7892] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7895] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7897] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7899] device (eth0): carrier: link connected
Jan 31 01:26:00 np0005603622 systemd[1]: Reached target Network.
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7903] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7912] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7922] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7927] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7928] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7932] manager: NetworkManager state is now CONNECTING
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7934] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:26:00 np0005603622 systemd[1]: Starting Network Manager Wait Online...
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7944] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7948] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:26:00 np0005603622 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7987] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Jan 31 01:26:00 np0005603622 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.7998] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8025] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8213] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8215] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8216] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8221] device (lo): Activation: successful, device activated.
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8228] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8232] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8235] device (eth0): Activation: successful, device activated.
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8240] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 01:26:00 np0005603622 NetworkManager[855]: <info>  [1769840760.8244] manager: startup complete
Jan 31 01:26:00 np0005603622 systemd[1]: Finished Network Manager Wait Online.
Jan 31 01:26:00 np0005603622 systemd[1]: Starting Cloud-init: Network Stage...
Jan 31 01:26:00 np0005603622 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 31 01:26:00 np0005603622 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 31 01:26:00 np0005603622 systemd[1]: Reached target NFS client services.
Jan 31 01:26:00 np0005603622 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 31 01:26:00 np0005603622 systemd[1]: Reached target Remote File Systems.
Jan 31 01:26:00 np0005603622 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 31 01:26:01 np0005603622 cloud-init[920]: Cloud-init v. 24.4-8.el9 running 'init' at Sat, 31 Jan 2026 06:26:01 +0000. Up 12.49 seconds.
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: | Device |  Up  |           Address           |      Mask     | Scope  |     Hw-Address    |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |  eth0  | True |         38.102.83.45        | 255.255.255.0 | global | fa:16:3e:19:08:b9 |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe19:8b9/64 |       .       |  link  | fa:16:3e:19:08:b9 |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1          |   255.0.0.0   |  host  |         .         |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |   lo   | True |           ::1/128           |       .       |  host  |         .         |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 31 01:26:01 np0005603622 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 31 01:26:05 np0005603622 chronyd[826]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 31 01:26:05 np0005603622 chronyd[826]: System clock TAI offset set to 37 seconds
Jan 31 01:26:07 np0005603622 chronyd[826]: Selected source 23.159.16.194 (2.centos.pool.ntp.org)
Jan 31 01:26:07 np0005603622 irqbalance[813]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 31 01:26:07 np0005603622 irqbalance[813]: IRQ 25 affinity is now unmanaged
Jan 31 01:26:07 np0005603622 irqbalance[813]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 31 01:26:07 np0005603622 irqbalance[813]: IRQ 31 affinity is now unmanaged
Jan 31 01:26:07 np0005603622 irqbalance[813]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 31 01:26:07 np0005603622 irqbalance[813]: IRQ 28 affinity is now unmanaged
Jan 31 01:26:07 np0005603622 irqbalance[813]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 31 01:26:07 np0005603622 irqbalance[813]: IRQ 32 affinity is now unmanaged
Jan 31 01:26:07 np0005603622 irqbalance[813]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 31 01:26:07 np0005603622 irqbalance[813]: IRQ 30 affinity is now unmanaged
Jan 31 01:26:07 np0005603622 irqbalance[813]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 31 01:26:07 np0005603622 irqbalance[813]: IRQ 29 affinity is now unmanaged
Jan 31 01:26:10 np0005603622 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:26:13 np0005603622 cloud-init[920]: Generating public/private rsa key pair.
Jan 31 01:26:13 np0005603622 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 31 01:26:13 np0005603622 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 31 01:26:13 np0005603622 cloud-init[920]: The key fingerprint is:
Jan 31 01:26:13 np0005603622 cloud-init[920]: SHA256:8UbmWMPTX9w3R+d2fQWg+/uMpjeX3Q49oBkhcsj4vAc root@np0005603622.novalocal
Jan 31 01:26:13 np0005603622 cloud-init[920]: The key's randomart image is:
Jan 31 01:26:13 np0005603622 cloud-init[920]: +---[RSA 3072]----+
Jan 31 01:26:13 np0005603622 cloud-init[920]: |            ....+|
Jan 31 01:26:13 np0005603622 cloud-init[920]: |      o .. o   +=|
Jan 31 01:26:13 np0005603622 cloud-init[920]: |     . +.oO..  .%|
Jan 31 01:26:13 np0005603622 cloud-init[920]: |      o oO.+.. o*|
Jan 31 01:26:13 np0005603622 cloud-init[920]: |       ES =. ..  |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |        o. .+ .. |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |       . . o. .+o|
Jan 31 01:26:13 np0005603622 cloud-init[920]: |        .   ++o.+|
Jan 31 01:26:13 np0005603622 cloud-init[920]: |          .+o+o..|
Jan 31 01:26:13 np0005603622 cloud-init[920]: +----[SHA256]-----+
Jan 31 01:26:13 np0005603622 cloud-init[920]: Generating public/private ecdsa key pair.
Jan 31 01:26:13 np0005603622 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 31 01:26:13 np0005603622 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 31 01:26:13 np0005603622 cloud-init[920]: The key fingerprint is:
Jan 31 01:26:13 np0005603622 cloud-init[920]: SHA256:jI9gHprIozzIdBeAzoFuqm6qISd+eKU6XES19w2+auc root@np0005603622.novalocal
Jan 31 01:26:13 np0005603622 cloud-init[920]: The key's randomart image is:
Jan 31 01:26:13 np0005603622 cloud-init[920]: +---[ECDSA 256]---+
Jan 31 01:26:13 np0005603622 cloud-init[920]: |. . ..           |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |.o o  .          |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |+ o .. . .       |
Jan 31 01:26:13 np0005603622 cloud-init[920]: | = . ..oo o      |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |o . + o So .     |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |oo B = o  .      |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |@oB = . ..       |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |B@.+   ...       |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |O=*   ..oE       |
Jan 31 01:26:13 np0005603622 cloud-init[920]: +----[SHA256]-----+
Jan 31 01:26:13 np0005603622 cloud-init[920]: Generating public/private ed25519 key pair.
Jan 31 01:26:13 np0005603622 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 31 01:26:13 np0005603622 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 31 01:26:13 np0005603622 cloud-init[920]: The key fingerprint is:
Jan 31 01:26:13 np0005603622 cloud-init[920]: SHA256:j+aWkT2RSftgCc4DS8KvZGUHOcJ1uGCTe4wbAYVYpLk root@np0005603622.novalocal
Jan 31 01:26:13 np0005603622 cloud-init[920]: The key's randomart image is:
Jan 31 01:26:13 np0005603622 cloud-init[920]: +--[ED25519 256]--+
Jan 31 01:26:13 np0005603622 cloud-init[920]: | ++*.oo+.        |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |.o. @ O.o .      |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |o  . # O o =     |
Jan 31 01:26:13 np0005603622 cloud-init[920]: | .  * * + O      |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |E  o =  S= +     |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |    o   ooo .    |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |        oo..     |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |       oo        |
Jan 31 01:26:13 np0005603622 cloud-init[920]: |       ..        |
Jan 31 01:26:13 np0005603622 cloud-init[920]: +----[SHA256]-----+
Jan 31 01:26:13 np0005603622 systemd[1]: Finished Cloud-init: Network Stage.
Jan 31 01:26:13 np0005603622 systemd[1]: Reached target Cloud-config availability.
Jan 31 01:26:13 np0005603622 systemd[1]: Reached target Network is Online.
Jan 31 01:26:13 np0005603622 systemd[1]: Starting Cloud-init: Config Stage...
Jan 31 01:26:13 np0005603622 systemd[1]: Starting Crash recovery kernel arming...
Jan 31 01:26:13 np0005603622 systemd[1]: Starting Notify NFS peers of a restart...
Jan 31 01:26:13 np0005603622 systemd[1]: Starting System Logging Service...
Jan 31 01:26:13 np0005603622 systemd[1]: Starting OpenSSH server daemon...
Jan 31 01:26:13 np0005603622 sm-notify[1002]: Version 2.5.4 starting
Jan 31 01:26:13 np0005603622 systemd[1]: Starting Permit User Sessions...
Jan 31 01:26:13 np0005603622 systemd[1]: Started Notify NFS peers of a restart.
Jan 31 01:26:13 np0005603622 systemd[1]: Finished Permit User Sessions.
Jan 31 01:26:13 np0005603622 systemd[1]: Started OpenSSH server daemon.
Jan 31 01:26:13 np0005603622 systemd[1]: Started Command Scheduler.
Jan 31 01:26:13 np0005603622 systemd[1]: Started Getty on tty1.
Jan 31 01:26:13 np0005603622 systemd[1]: Started Serial Getty on ttyS0.
Jan 31 01:26:13 np0005603622 systemd[1]: Reached target Login Prompts.
Jan 31 01:26:13 np0005603622 rsyslogd[1003]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1003" x-info="https://www.rsyslog.com"] start
Jan 31 01:26:13 np0005603622 rsyslogd[1003]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 31 01:26:13 np0005603622 systemd[1]: Started System Logging Service.
Jan 31 01:26:13 np0005603622 systemd[1]: Reached target Multi-User System.
Jan 31 01:26:13 np0005603622 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 31 01:26:13 np0005603622 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 31 01:26:13 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 01:26:13 np0005603622 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 31 01:26:13 np0005603622 cloud-init[1063]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Sat, 31 Jan 2026 06:26:13 +0000. Up 25.08 seconds.
Jan 31 01:26:13 np0005603622 kdumpctl[1013]: kdump: No kdump initial ramdisk found.
Jan 31 01:26:13 np0005603622 kdumpctl[1013]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Jan 31 01:26:13 np0005603622 systemd[1]: Finished Cloud-init: Config Stage.
Jan 31 01:26:13 np0005603622 systemd[1]: Starting Cloud-init: Final Stage...
Jan 31 01:26:14 np0005603622 cloud-init[1229]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Sat, 31 Jan 2026 06:26:14 +0000. Up 25.42 seconds.
Jan 31 01:26:14 np0005603622 cloud-init[1273]: #############################################################
Jan 31 01:26:14 np0005603622 cloud-init[1277]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 31 01:26:14 np0005603622 cloud-init[1280]: 256 SHA256:jI9gHprIozzIdBeAzoFuqm6qISd+eKU6XES19w2+auc root@np0005603622.novalocal (ECDSA)
Jan 31 01:26:14 np0005603622 cloud-init[1286]: 256 SHA256:j+aWkT2RSftgCc4DS8KvZGUHOcJ1uGCTe4wbAYVYpLk root@np0005603622.novalocal (ED25519)
Jan 31 01:26:14 np0005603622 cloud-init[1291]: 3072 SHA256:8UbmWMPTX9w3R+d2fQWg+/uMpjeX3Q49oBkhcsj4vAc root@np0005603622.novalocal (RSA)
Jan 31 01:26:14 np0005603622 cloud-init[1292]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 31 01:26:14 np0005603622 dracut[1289]: dracut-057-102.git20250818.el9
Jan 31 01:26:14 np0005603622 cloud-init[1294]: #############################################################
Jan 31 01:26:14 np0005603622 cloud-init[1229]: Cloud-init v. 24.4-8.el9 finished at Sat, 31 Jan 2026 06:26:14 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 25.64 seconds
Jan 31 01:26:14 np0005603622 dracut[1299]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Jan 31 01:26:14 np0005603622 systemd[1]: Finished Cloud-init: Final Stage.
Jan 31 01:26:14 np0005603622 systemd[1]: Reached target Cloud-init target.
Jan 31 01:26:14 np0005603622 dracut[1299]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: memstrack is not available
Jan 31 01:26:15 np0005603622 dracut[1299]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 31 01:26:15 np0005603622 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 31 01:26:16 np0005603622 dracut[1299]: memstrack is not available
Jan 31 01:26:16 np0005603622 dracut[1299]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 31 01:26:16 np0005603622 dracut[1299]: *** Including module: systemd ***
Jan 31 01:26:16 np0005603622 dracut[1299]: *** Including module: fips ***
Jan 31 01:26:16 np0005603622 dracut[1299]: *** Including module: systemd-initrd ***
Jan 31 01:26:16 np0005603622 dracut[1299]: *** Including module: i18n ***
Jan 31 01:26:17 np0005603622 dracut[1299]: *** Including module: drm ***
Jan 31 01:26:17 np0005603622 dracut[1299]: *** Including module: prefixdevname ***
Jan 31 01:26:17 np0005603622 dracut[1299]: *** Including module: kernel-modules ***
Jan 31 01:26:17 np0005603622 kernel: block vda: the capability attribute has been deprecated.
Jan 31 01:26:17 np0005603622 dracut[1299]: *** Including module: kernel-modules-extra ***
Jan 31 01:26:17 np0005603622 dracut[1299]: *** Including module: qemu ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: fstab-sys ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: rootfs-block ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: terminfo ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: udev-rules ***
Jan 31 01:26:18 np0005603622 dracut[1299]: Skipping udev rule: 91-permissions.rules
Jan 31 01:26:18 np0005603622 dracut[1299]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: virtiofs ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: dracut-systemd ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: usrmount ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: base ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: fs-lib ***
Jan 31 01:26:18 np0005603622 dracut[1299]: *** Including module: kdumpbase ***
Jan 31 01:26:19 np0005603622 dracut[1299]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 31 01:26:19 np0005603622 dracut[1299]:  microcode_ctl module: mangling fw_dir
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 31 01:26:19 np0005603622 dracut[1299]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 31 01:26:19 np0005603622 dracut[1299]: *** Including module: openssl ***
Jan 31 01:26:19 np0005603622 dracut[1299]: *** Including module: shutdown ***
Jan 31 01:26:19 np0005603622 dracut[1299]: *** Including module: squash ***
Jan 31 01:26:19 np0005603622 dracut[1299]: *** Including modules done ***
Jan 31 01:26:19 np0005603622 dracut[1299]: *** Installing kernel module dependencies ***
Jan 31 01:26:20 np0005603622 dracut[1299]: *** Installing kernel module dependencies done ***
Jan 31 01:26:20 np0005603622 dracut[1299]: *** Resolving executable dependencies ***
Jan 31 01:26:22 np0005603622 dracut[1299]: *** Resolving executable dependencies done ***
Jan 31 01:26:22 np0005603622 dracut[1299]: *** Generating early-microcode cpio image ***
Jan 31 01:26:22 np0005603622 dracut[1299]: *** Store current command line parameters ***
Jan 31 01:26:22 np0005603622 dracut[1299]: Stored kernel commandline:
Jan 31 01:26:22 np0005603622 dracut[1299]: No dracut internal kernel commandline stored in the initramfs
Jan 31 01:26:23 np0005603622 dracut[1299]: *** Install squash loader ***
Jan 31 01:26:24 np0005603622 dracut[1299]: *** Squashing the files inside the initramfs ***
Jan 31 01:26:25 np0005603622 dracut[1299]: *** Squashing the files inside the initramfs done ***
Jan 31 01:26:25 np0005603622 dracut[1299]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Jan 31 01:26:25 np0005603622 dracut[1299]: *** Hardlinking files ***
Jan 31 01:26:25 np0005603622 dracut[1299]: *** Hardlinking files done ***
Jan 31 01:26:25 np0005603622 dracut[1299]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Jan 31 01:26:26 np0005603622 kdumpctl[1013]: kdump: kexec: loaded kdump kernel
Jan 31 01:26:26 np0005603622 kdumpctl[1013]: kdump: Starting kdump: [OK]
Jan 31 01:26:26 np0005603622 systemd[1]: Finished Crash recovery kernel arming.
Jan 31 01:26:26 np0005603622 systemd[1]: Startup finished in 1.234s (kernel) + 3.580s (initrd) + 32.943s (userspace) = 37.759s.
Jan 31 01:26:30 np0005603622 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:29:22 np0005603622 chronyd[826]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 31 01:31:01 np0005603622 systemd[1]: Created slice User Slice of UID 1000.
Jan 31 01:31:01 np0005603622 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 31 01:31:01 np0005603622 systemd-logind[817]: New session 1 of user zuul.
Jan 31 01:31:01 np0005603622 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 31 01:31:01 np0005603622 systemd[1]: Starting User Manager for UID 1000...
Jan 31 01:31:01 np0005603622 systemd[4310]: Queued start job for default target Main User Target.
Jan 31 01:31:01 np0005603622 systemd[4310]: Created slice User Application Slice.
Jan 31 01:31:01 np0005603622 systemd[4310]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 01:31:01 np0005603622 systemd[4310]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 01:31:01 np0005603622 systemd[4310]: Reached target Paths.
Jan 31 01:31:01 np0005603622 systemd[4310]: Reached target Timers.
Jan 31 01:31:01 np0005603622 systemd[4310]: Starting D-Bus User Message Bus Socket...
Jan 31 01:31:01 np0005603622 systemd[4310]: Starting Create User's Volatile Files and Directories...
Jan 31 01:31:01 np0005603622 systemd[4310]: Finished Create User's Volatile Files and Directories.
Jan 31 01:31:01 np0005603622 systemd[4310]: Listening on D-Bus User Message Bus Socket.
Jan 31 01:31:01 np0005603622 systemd[4310]: Reached target Sockets.
Jan 31 01:31:01 np0005603622 systemd[4310]: Reached target Basic System.
Jan 31 01:31:01 np0005603622 systemd[4310]: Reached target Main User Target.
Jan 31 01:31:01 np0005603622 systemd[4310]: Startup finished in 171ms.
Jan 31 01:31:01 np0005603622 systemd[1]: Started User Manager for UID 1000.
Jan 31 01:31:01 np0005603622 systemd[1]: Started Session 1 of User zuul.
Jan 31 01:31:02 np0005603622 python3[4392]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:35 np0005603622 python3[4421]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:42 np0005603622 python3[4479]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:43 np0005603622 python3[4519]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 31 01:31:45 np0005603622 python3[4545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0NROJyvixYj14yxc9a1mzd1FlH8bHxigBuCuSXZp+XBwK5CWQYNe1kWs8LnwK1EvgGycvb2uWsCqXynoIDepSR4X45xPMVj2xEV2M0gYJN2FioWZRuHFYKZQNIY+ZFpOMgic6vKkz3uR6hw5OogchCCdEPofRUiDvA6imrPii/QP8S3YnwQCYwkeq72uqj4sslD467c/NglKPLZEKdfcnC4ZLM8nrcRiZwRfWls2oF0OWdbFwIn6RiwJGvZAk12ezTFzNyNkHfkadH1PD5F7tLZVrxU1P73llDzfyU8ppwjlIEtvATWFb1y5VF8VkOvjaen+/DMoFYiLvR6MUyI4JAZ7JmXxmvhLxQHPwYFTbzdjdZRYeQvPWAwtH9LWW2cdBkvLA/vGY+PSqXhb3aAM/O6R0lcyTmGVNEMRpwYZYmdoB8Cr9m2jOxZ7Ffwbs94foCUrIVlc3dkcMCTaUTrXBAqnbUteQ/Ctgp2pFJOSEse2AQi52Xm8A87QOl1wYmN8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:31:46 np0005603622 python3[4569]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:46 np0005603622 python3[4668]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:31:47 np0005603622 python3[4739]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769841106.5479498-252-103583942976853/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=24510bce553d47bd8880c3ff7a9c0ec0_id_rsa follow=False checksum=3ef597b6e54d9d641aaa8554b0f170ef780d386a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:47 np0005603622 python3[4862]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:31:48 np0005603622 python3[4933]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769841107.5314217-307-211841121254986/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=24510bce553d47bd8880c3ff7a9c0ec0_id_rsa.pub follow=False checksum=9e6accf0af0859cd95b66274f88d88182d33bf59 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:49 np0005603622 python3[4981]: ansible-ping Invoked with data=pong
Jan 31 01:31:50 np0005603622 python3[5005]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:31:52 np0005603622 python3[5063]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 31 01:31:53 np0005603622 python3[5095]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:53 np0005603622 python3[5119]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:53 np0005603622 python3[5143]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:54 np0005603622 python3[5167]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:54 np0005603622 python3[5191]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:54 np0005603622 python3[5215]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:56 np0005603622 python3[5241]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:57 np0005603622 python3[5319]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:31:58 np0005603622 python3[5392]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769841117.1505828-32-54995870748610/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:31:59 np0005603622 python3[5440]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:31:59 np0005603622 python3[5464]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:31:59 np0005603622 python3[5488]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:00 np0005603622 python3[5512]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:00 np0005603622 python3[5536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:00 np0005603622 python3[5560]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:00 np0005603622 python3[5584]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603622 python3[5608]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603622 python3[5632]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603622 python3[5656]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:01 np0005603622 python3[5680]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:02 np0005603622 python3[5704]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:02 np0005603622 python3[5728]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:02 np0005603622 python3[5752]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603622 python3[5776]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603622 python3[5800]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603622 python3[5824]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:03 np0005603622 python3[5848]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:04 np0005603622 python3[5872]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:04 np0005603622 python3[5896]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:05 np0005603622 python3[5920]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:05 np0005603622 python3[5944]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:05 np0005603622 python3[5968]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:05 np0005603622 python3[5992]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:06 np0005603622 python3[6016]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:06 np0005603622 python3[6040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:32:09 np0005603622 python3[6066]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 01:32:09 np0005603622 systemd[1]: Starting Time & Date Service...
Jan 31 01:32:09 np0005603622 systemd[1]: Started Time & Date Service.
Jan 31 01:32:09 np0005603622 systemd-timedated[6068]: Changed time zone to 'UTC' (UTC).
Jan 31 01:32:10 np0005603622 python3[6097]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:10 np0005603622 python3[6173]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:11 np0005603622 python3[6244]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769841130.66757-253-199057460921585/source _original_basename=tmpbcr5mzcz follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:11 np0005603622 python3[6344]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:12 np0005603622 python3[6415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769841131.493616-303-180648750627037/source _original_basename=tmpe_b2zw7o follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:12 np0005603622 python3[6517]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:13 np0005603622 python3[6590]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769841132.5980392-382-182910407930573/source _original_basename=tmp1qbyfdsk follow=False checksum=46661bebe1ecf41d3f6ce259e9385cc50a3d3082 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:13 np0005603622 python3[6638]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:32:14 np0005603622 python3[6664]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:32:14 np0005603622 python3[6744]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:32:15 np0005603622 python3[6817]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769841134.677128-453-267014245546386/source _original_basename=tmpirzms4in follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:16 np0005603622 python3[6868]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-b97d-7d19-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:32:16 np0005603622 python3[6896]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-b97d-7d19-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 31 01:32:18 np0005603622 python3[6925]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:32:39 np0005603622 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 01:32:52 np0005603622 python3[6953]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:33:25 np0005603622 systemd[4310]: Starting Mark boot as successful...
Jan 31 01:33:25 np0005603622 systemd[4310]: Finished Mark boot as successful.
Jan 31 01:33:52 np0005603622 systemd-logind[817]: Session 1 logged out. Waiting for processes to exit.
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 31 01:34:01 np0005603622 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 31 01:34:01 np0005603622 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.0753] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 01:34:01 np0005603622 systemd-udevd[6956]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.0977] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.0997] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.0999] device (eth1): carrier: link connected
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.1001] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.1005] policy: auto-activating connection 'Wired connection 1' (41135365-bff3-3539-a318-7de98caf3b09)
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.1007] device (eth1): Activation: starting connection 'Wired connection 1' (41135365-bff3-3539-a318-7de98caf3b09)
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.1008] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.1010] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.1014] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:34:01 np0005603622 NetworkManager[855]: <info>  [1769841241.1016] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:02 np0005603622 systemd-logind[817]: New session 3 of user zuul.
Jan 31 01:34:02 np0005603622 systemd[1]: Started Session 3 of User zuul.
Jan 31 01:34:02 np0005603622 python3[6987]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-922d-b924-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:34:12 np0005603622 python3[7067]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:34:12 np0005603622 python3[7140]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769841252.2026215-155-158780070002481/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=67466266fff3d7932a2742890766f878958367c0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:34:13 np0005603622 python3[7190]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 01:34:13 np0005603622 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 31 01:34:13 np0005603622 systemd[1]: Stopped Network Manager Wait Online.
Jan 31 01:34:13 np0005603622 systemd[1]: Stopping Network Manager Wait Online...
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.3983] caught SIGTERM, shutting down normally.
Jan 31 01:34:13 np0005603622 systemd[1]: Stopping Network Manager...
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.3989] dhcp4 (eth0): canceled DHCP transaction
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.3989] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.3989] dhcp4 (eth0): state changed no lease
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.3991] manager: NetworkManager state is now CONNECTING
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.4143] dhcp4 (eth1): canceled DHCP transaction
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.4143] dhcp4 (eth1): state changed no lease
Jan 31 01:34:13 np0005603622 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:34:13 np0005603622 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:34:13 np0005603622 NetworkManager[855]: <info>  [1769841253.6387] exiting (success)
Jan 31 01:34:13 np0005603622 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 31 01:34:13 np0005603622 systemd[1]: Stopped Network Manager.
Jan 31 01:34:13 np0005603622 systemd[1]: NetworkManager.service: Consumed 4.182s CPU time, 10.1M memory peak.
Jan 31 01:34:13 np0005603622 systemd[1]: Starting Network Manager...
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.6805] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:4be470f6-96ef-472f-95f9-87bd5086f491)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.6807] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.6876] manager[0x55ee8144f000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 01:34:13 np0005603622 systemd[1]: Starting Hostname Service...
Jan 31 01:34:13 np0005603622 systemd[1]: Started Hostname Service.
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7453] hostname: hostname: using hostnamed
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7453] hostname: static hostname changed from (none) to "np0005603622.novalocal"
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7457] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7462] manager[0x55ee8144f000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7462] manager[0x55ee8144f000]: rfkill: WWAN hardware radio set enabled
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7488] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7489] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7489] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7490] manager: Networking is enabled by state file
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7492] settings: Loaded settings plugin: keyfile (internal)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7496] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7519] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7527] dhcp: init: Using DHCP client 'internal'
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7530] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7535] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7540] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7547] device (lo): Activation: starting connection 'lo' (77165707-765c-458b-b9f6-329fd98abdae)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7553] device (eth0): carrier: link connected
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7557] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7561] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7562] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7568] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7574] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7580] device (eth1): carrier: link connected
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7584] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7588] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (41135365-bff3-3539-a318-7de98caf3b09) (indicated)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7589] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7593] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7600] device (eth1): Activation: starting connection 'Wired connection 1' (41135365-bff3-3539-a318-7de98caf3b09)
Jan 31 01:34:13 np0005603622 systemd[1]: Started Network Manager.
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7606] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7610] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7612] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7613] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7615] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7620] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7621] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7625] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7627] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7634] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7637] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7644] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7646] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7657] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7663] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7667] device (lo): Activation: successful, device activated.
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7689] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Jan 31 01:34:13 np0005603622 NetworkManager[7207]: <info>  [1769841253.7695] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 01:34:13 np0005603622 systemd[1]: Starting Network Manager Wait Online...
Jan 31 01:34:14 np0005603622 NetworkManager[7207]: <info>  [1769841254.0398] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 01:34:14 np0005603622 NetworkManager[7207]: <info>  [1769841254.0436] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 01:34:14 np0005603622 NetworkManager[7207]: <info>  [1769841254.0438] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 01:34:14 np0005603622 NetworkManager[7207]: <info>  [1769841254.0440] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 01:34:14 np0005603622 NetworkManager[7207]: <info>  [1769841254.0442] device (eth0): Activation: successful, device activated.
Jan 31 01:34:14 np0005603622 NetworkManager[7207]: <info>  [1769841254.0446] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 01:34:14 np0005603622 python3[7255]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-922d-b924-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:34:24 np0005603622 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:34:43 np0005603622 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6070] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 01:34:58 np0005603622 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:34:58 np0005603622 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6321] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6325] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6335] device (eth1): Activation: successful, device activated.
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6342] manager: startup complete
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6344] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <warn>  [1769841298.6352] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6361] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 31 01:34:58 np0005603622 systemd[1]: Finished Network Manager Wait Online.
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6454] dhcp4 (eth1): canceled DHCP transaction
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6454] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6454] dhcp4 (eth1): state changed no lease
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6474] policy: auto-activating connection 'ci-private-network' (1904bc9a-b607-5e45-8cb7-a48d9ada9b24)
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6479] device (eth1): Activation: starting connection 'ci-private-network' (1904bc9a-b607-5e45-8cb7-a48d9ada9b24)
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6480] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6486] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6496] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.6505] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.7706] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.7711] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 01:34:58 np0005603622 NetworkManager[7207]: <info>  [1769841298.7723] device (eth1): Activation: successful, device activated.
Jan 31 01:35:08 np0005603622 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:35:14 np0005603622 systemd[1]: session-3.scope: Deactivated successfully.
Jan 31 01:35:14 np0005603622 systemd[1]: session-3.scope: Consumed 1.413s CPU time.
Jan 31 01:35:14 np0005603622 systemd-logind[817]: Session 3 logged out. Waiting for processes to exit.
Jan 31 01:35:14 np0005603622 systemd-logind[817]: Removed session 3.
Jan 31 01:35:56 np0005603622 systemd-logind[817]: New session 4 of user zuul.
Jan 31 01:35:56 np0005603622 systemd[1]: Started Session 4 of User zuul.
Jan 31 01:35:56 np0005603622 python3[7384]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:35:57 np0005603622 python3[7457]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769841356.5598662-373-39486765312454/source _original_basename=tmpml1sessq follow=False checksum=be2f7c16edb43e88d00ebc0882f9b60a044e6a9d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:36:00 np0005603622 systemd[1]: session-4.scope: Deactivated successfully.
Jan 31 01:36:00 np0005603622 systemd-logind[817]: Session 4 logged out. Waiting for processes to exit.
Jan 31 01:36:00 np0005603622 systemd-logind[817]: Removed session 4.
Jan 31 01:36:25 np0005603622 systemd[4310]: Created slice User Background Tasks Slice.
Jan 31 01:36:25 np0005603622 systemd[4310]: Starting Cleanup of User's Temporary Files and Directories...
Jan 31 01:36:25 np0005603622 systemd[4310]: Finished Cleanup of User's Temporary Files and Directories.
Jan 31 01:41:25 np0005603622 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 31 01:41:25 np0005603622 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 31 01:41:25 np0005603622 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 31 01:41:25 np0005603622 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 31 01:46:42 np0005603622 systemd-logind[817]: New session 5 of user zuul.
Jan 31 01:46:42 np0005603622 systemd[1]: Started Session 5 of User zuul.
Jan 31 01:46:42 np0005603622 python3[7520]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-bf7f-c771-000000000cb6-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:43 np0005603622 python3[7549]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:43 np0005603622 python3[7575]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:44 np0005603622 python3[7601]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:44 np0005603622 python3[7627]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:44 np0005603622 python3[7653]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:45 np0005603622 python3[7731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:46:45 np0005603622 python3[7804]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769842004.9304903-377-128141994122550/source _original_basename=tmp1x4xfz6d follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:46:46 np0005603622 python3[7854]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 01:46:46 np0005603622 systemd[1]: Reloading.
Jan 31 01:46:46 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:46:48 np0005603622 python3[7910]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 31 01:46:48 np0005603622 python3[7936]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:49 np0005603622 python3[7964]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:49 np0005603622 python3[7992]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:49 np0005603622 python3[8020]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:50 np0005603622 python3[8047]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-bf7f-c771-000000000cbd-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:46:50 np0005603622 python3[8077]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 31 01:46:53 np0005603622 systemd[1]: session-5.scope: Deactivated successfully.
Jan 31 01:46:53 np0005603622 systemd[1]: session-5.scope: Consumed 3.595s CPU time.
Jan 31 01:46:53 np0005603622 systemd-logind[817]: Session 5 logged out. Waiting for processes to exit.
Jan 31 01:46:53 np0005603622 systemd-logind[817]: Removed session 5.
Jan 31 01:46:55 np0005603622 systemd-logind[817]: New session 6 of user zuul.
Jan 31 01:46:55 np0005603622 systemd[1]: Started Session 6 of User zuul.
Jan 31 01:46:56 np0005603622 python3[8113]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 31 01:46:57 np0005603622 irqbalance[813]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 31 01:46:57 np0005603622 irqbalance[813]: IRQ 27 affinity is now unmanaged
Jan 31 01:47:20 np0005603622 setsebool[8156]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 31 01:47:20 np0005603622 setsebool[8156]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 31 01:47:32 np0005603622 kernel: SELinux:  Converting 385 SID table entries...
Jan 31 01:47:32 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:47:32 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:47:32 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:47:32 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:47:32 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:47:32 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:47:32 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:47:42 np0005603622 kernel: SELinux:  Converting 388 SID table entries...
Jan 31 01:47:42 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 01:47:42 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 01:47:42 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 01:47:42 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 01:47:42 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 01:47:42 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 01:47:42 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 01:48:02 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 31 01:48:02 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 01:48:02 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 01:48:02 np0005603622 systemd[1]: Reloading.
Jan 31 01:48:02 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 01:48:02 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 01:48:06 np0005603622 python3[11642]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-c34d-28f4-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:48:06 np0005603622 kernel: evm: overlay not supported
Jan 31 01:48:06 np0005603622 systemd[4310]: Starting D-Bus User Message Bus...
Jan 31 01:48:06 np0005603622 dbus-broker-launch[12636]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 31 01:48:06 np0005603622 dbus-broker-launch[12636]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 31 01:48:06 np0005603622 systemd[4310]: Started D-Bus User Message Bus.
Jan 31 01:48:06 np0005603622 dbus-broker-lau[12636]: Ready
Jan 31 01:48:06 np0005603622 systemd[4310]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 31 01:48:06 np0005603622 systemd[4310]: Created slice Slice /user.
Jan 31 01:48:06 np0005603622 systemd[4310]: podman-12520.scope: unit configures an IP firewall, but not running as root.
Jan 31 01:48:06 np0005603622 systemd[4310]: (This warning is only shown for the first unit using IP firewalling.)
Jan 31 01:48:06 np0005603622 systemd[4310]: Started podman-12520.scope.
Jan 31 01:48:07 np0005603622 systemd[4310]: Started podman-pause-b9a60a0b.scope.
Jan 31 01:48:07 np0005603622 python3[13884]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.2:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.2:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:48:07 np0005603622 python3[13884]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 31 01:48:08 np0005603622 systemd-logind[817]: Session 6 logged out. Waiting for processes to exit.
Jan 31 01:48:08 np0005603622 systemd[1]: session-6.scope: Deactivated successfully.
Jan 31 01:48:08 np0005603622 systemd[1]: session-6.scope: Consumed 42.379s CPU time.
Jan 31 01:48:08 np0005603622 systemd-logind[817]: Removed session 6.
Jan 31 01:48:33 np0005603622 systemd-logind[817]: New session 7 of user zuul.
Jan 31 01:48:33 np0005603622 systemd[1]: Started Session 7 of User zuul.
Jan 31 01:48:33 np0005603622 python3[28248]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAYzMDYbycOT72ga9wDhD1NtUc7onT0cFXCjwfAnzaB2tvlINsgaQbDQ5ZwqYE9Er0Wi02qKQ4UqK2RbEye6MZA= zuul@np0005603620.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:48:34 np0005603622 python3[28808]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAYzMDYbycOT72ga9wDhD1NtUc7onT0cFXCjwfAnzaB2tvlINsgaQbDQ5ZwqYE9Er0Wi02qKQ4UqK2RbEye6MZA= zuul@np0005603620.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:48:35 np0005603622 python3[29418]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005603622.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 31 01:48:35 np0005603622 python3[29637]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAYzMDYbycOT72ga9wDhD1NtUc7onT0cFXCjwfAnzaB2tvlINsgaQbDQ5ZwqYE9Er0Wi02qKQ4UqK2RbEye6MZA= zuul@np0005603620.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 31 01:48:35 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 01:48:35 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 01:48:35 np0005603622 systemd[1]: man-db-cache-update.service: Consumed 33.644s CPU time.
Jan 31 01:48:35 np0005603622 systemd[1]: run-rb3fad3a484ff4702b663cc6ceca41bce.service: Deactivated successfully.
Jan 31 01:48:36 np0005603622 python3[29859]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:48:36 np0005603622 python3[29932]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769842116.1016507-168-208122143221694/source _original_basename=tmpcl9fxfhh follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:48:37 np0005603622 python3[29982]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 31 01:48:37 np0005603622 systemd[1]: Starting Hostname Service...
Jan 31 01:48:37 np0005603622 systemd[1]: Started Hostname Service.
Jan 31 01:48:37 np0005603622 systemd-hostnamed[29986]: Changed pretty hostname to 'compute-1'
Jan 31 01:48:37 np0005603622 systemd-hostnamed[29986]: Hostname set to <compute-1> (static)
Jan 31 01:48:37 np0005603622 NetworkManager[7207]: <info>  [1769842117.7918] hostname: static hostname changed from "np0005603622.novalocal" to "compute-1"
Jan 31 01:48:37 np0005603622 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 01:48:37 np0005603622 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 01:48:38 np0005603622 systemd[1]: session-7.scope: Deactivated successfully.
Jan 31 01:48:38 np0005603622 systemd[1]: session-7.scope: Consumed 1.835s CPU time.
Jan 31 01:48:38 np0005603622 systemd-logind[817]: Session 7 logged out. Waiting for processes to exit.
Jan 31 01:48:38 np0005603622 systemd-logind[817]: Removed session 7.
Jan 31 01:48:47 np0005603622 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 01:49:07 np0005603622 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 01:53:20 np0005603622 systemd-logind[817]: New session 8 of user zuul.
Jan 31 01:53:20 np0005603622 systemd[1]: Started Session 8 of User zuul.
Jan 31 01:53:21 np0005603622 python3[30081]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 01:53:22 np0005603622 python3[30197]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:23 np0005603622 python3[30270]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5780797-34074-36286406191479/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:23 np0005603622 python3[30296]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:23 np0005603622 python3[30369]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5780797-34074-36286406191479/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:23 np0005603622 python3[30395]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:24 np0005603622 python3[30468]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5780797-34074-36286406191479/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:24 np0005603622 python3[30494]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:24 np0005603622 python3[30567]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5780797-34074-36286406191479/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:25 np0005603622 python3[30593]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:25 np0005603622 python3[30666]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5780797-34074-36286406191479/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:25 np0005603622 python3[30692]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:25 np0005603622 systemd[1]: Starting dnf makecache...
Jan 31 01:53:25 np0005603622 python3[30766]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5780797-34074-36286406191479/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:25 np0005603622 dnf[30765]: Failed determining last makecache time.
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-barbican-42b4c41831408a8e323 387 kB/s |  13 kB     00:00
Jan 31 01:53:26 np0005603622 python3[30792]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-python-glean-642fffe0203a8ffcc2443db52 2.4 MB/s |  65 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-python-stevedore-c4acc5639fd2329372142 4.8 MB/s | 131 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-python-cloudkitty-tests-tempest-783703 1.3 MB/s |  32 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-diskimage-builder-61b717cc45660834fe9a  10 MB/s | 349 kB     00:00
Jan 31 01:53:26 np0005603622 python3[30877]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769842402.5780797-34074-36286406191479/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-nova-eaa65f0b85123a4ee343246 1.7 MB/s |  42 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-python-designate-tests-tempest-347fdbc 552 kB/s |  18 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-glance-1fd12c29b339f30fe823e 833 kB/s |  18 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.4 MB/s |  29 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-manila-d783d10e75495b73866db 1.3 MB/s |  25 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-neutron-95cadbd379667c8520c8 6.9 MB/s | 154 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-octavia-5975097dd4b021385178 1.1 MB/s |  26 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-watcher-c014f81a8647287f6dcc 674 kB/s |  16 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-python-tcib-78032d201b02cee27e8e644c61 374 kB/s | 7.4 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.4 MB/s | 144 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-swift-dc98a8463506ac520c469a 633 kB/s |  14 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-python-tempestconf-8515371b7cceebd4282 2.3 MB/s |  53 kB     00:00
Jan 31 01:53:26 np0005603622 dnf[30765]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.4 MB/s |  96 kB     00:00
Jan 31 01:53:27 np0005603622 dnf[30765]: CentOS Stream 9 - BaseOS                         63 kB/s | 6.1 kB     00:00
Jan 31 01:53:27 np0005603622 dnf[30765]: CentOS Stream 9 - AppStream                      69 kB/s | 6.5 kB     00:00
Jan 31 01:53:27 np0005603622 dnf[30765]: CentOS Stream 9 - CRB                            63 kB/s | 6.0 kB     00:00
Jan 31 01:53:27 np0005603622 dnf[30765]: CentOS Stream 9 - Extras packages                73 kB/s | 7.3 kB     00:00
Jan 31 01:53:27 np0005603622 dnf[30765]: dlrn-antelope-testing                            28 MB/s | 1.1 MB     00:00
Jan 31 01:53:28 np0005603622 dnf[30765]: dlrn-antelope-build-deps                         14 MB/s | 461 kB     00:00
Jan 31 01:53:28 np0005603622 dnf[30765]: centos9-rabbitmq                                7.5 MB/s | 123 kB     00:00
Jan 31 01:53:28 np0005603622 dnf[30765]: centos9-storage                                  20 MB/s | 415 kB     00:00
Jan 31 01:53:28 np0005603622 dnf[30765]: centos9-opstools                                4.9 MB/s |  51 kB     00:00
Jan 31 01:53:28 np0005603622 dnf[30765]: NFV SIG OpenvSwitch                              19 MB/s | 461 kB     00:00
Jan 31 01:53:29 np0005603622 dnf[30765]: repo-setup-centos-appstream                      84 MB/s |  26 MB     00:00
Jan 31 01:53:34 np0005603622 dnf[30765]: repo-setup-centos-baseos                         74 MB/s | 8.9 MB     00:00
Jan 31 01:53:36 np0005603622 dnf[30765]: repo-setup-centos-highavailability               15 MB/s | 744 kB     00:00
Jan 31 01:53:36 np0005603622 dnf[30765]: repo-setup-centos-powertools                     70 MB/s | 7.6 MB     00:00
Jan 31 01:53:38 np0005603622 python3[31008]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 01:53:39 np0005603622 dnf[30765]: Extra Packages for Enterprise Linux 9 - x86_64   14 MB/s |  20 MB     00:01
Jan 31 01:53:52 np0005603622 dnf[30765]: Metadata cache created.
Jan 31 01:53:52 np0005603622 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 31 01:53:52 np0005603622 systemd[1]: Finished dnf makecache.
Jan 31 01:53:52 np0005603622 systemd[1]: dnf-makecache.service: Consumed 24.295s CPU time.
Jan 31 01:58:37 np0005603622 systemd[1]: session-8.scope: Deactivated successfully.
Jan 31 01:58:37 np0005603622 systemd[1]: session-8.scope: Consumed 3.981s CPU time.
Jan 31 01:58:37 np0005603622 systemd-logind[817]: Session 8 logged out. Waiting for processes to exit.
Jan 31 01:58:37 np0005603622 systemd-logind[817]: Removed session 8.
Jan 31 02:10:44 np0005603622 systemd-logind[817]: New session 9 of user zuul.
Jan 31 02:10:44 np0005603622 systemd[1]: Started Session 9 of User zuul.
Jan 31 02:10:45 np0005603622 python3.9[31193]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:10:46 np0005603622 python3.9[31374]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:10:55 np0005603622 systemd[1]: session-9.scope: Deactivated successfully.
Jan 31 02:10:55 np0005603622 systemd[1]: session-9.scope: Consumed 7.225s CPU time.
Jan 31 02:10:55 np0005603622 systemd-logind[817]: Session 9 logged out. Waiting for processes to exit.
Jan 31 02:10:55 np0005603622 systemd-logind[817]: Removed session 9.
Jan 31 02:11:15 np0005603622 systemd-logind[817]: New session 10 of user zuul.
Jan 31 02:11:15 np0005603622 systemd[1]: Started Session 10 of User zuul.
Jan 31 02:11:15 np0005603622 python3.9[31586]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 31 02:11:17 np0005603622 python3.9[31760]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:18 np0005603622 python3.9[31912]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:11:20 np0005603622 python3.9[32065]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:11:21 np0005603622 python3.9[32217]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:11:21 np0005603622 python3.9[32369]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:11:22 np0005603622 python3.9[32492]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843481.3093476-178-90296418526869/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:11:23 np0005603622 python3.9[32644]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:23 np0005603622 python3.9[32800]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:11:24 np0005603622 python3.9[32952]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:11:25 np0005603622 python3.9[33102]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:11:29 np0005603622 python3.9[33355]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:11:30 np0005603622 python3.9[33505]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:31 np0005603622 python3.9[33659]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:11:32 np0005603622 python3.9[33817]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:11:33 np0005603622 python3.9[33901]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:11:47 np0005603622 irqbalance[813]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 31 02:11:47 np0005603622 irqbalance[813]: IRQ 26 affinity is now unmanaged
Jan 31 02:12:18 np0005603622 systemd[1]: Reloading.
Jan 31 02:12:18 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:12:18 np0005603622 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 31 02:12:19 np0005603622 systemd[1]: Reloading.
Jan 31 02:12:20 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:12:20 np0005603622 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 31 02:12:20 np0005603622 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 31 02:12:20 np0005603622 systemd[1]: Reloading.
Jan 31 02:12:20 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:12:20 np0005603622 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 31 02:12:20 np0005603622 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Jan 31 02:12:20 np0005603622 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Jan 31 02:12:20 np0005603622 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Jan 31 02:13:40 np0005603622 kernel: SELinux:  Converting 2729 SID table entries...
Jan 31 02:13:40 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:13:40 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:13:40 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:13:40 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:13:40 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:13:40 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:13:40 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:13:40 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 31 02:13:41 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:13:41 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:13:41 np0005603622 systemd[1]: Reloading.
Jan 31 02:13:41 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:13:41 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:13:41 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:13:41 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:13:41 np0005603622 systemd[1]: run-r7bb47e7c45b247b19205f1fef142c702.service: Deactivated successfully.
Jan 31 02:13:42 np0005603622 python3.9[35412]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:13:45 np0005603622 python3.9[35693]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 31 02:13:46 np0005603622 python3.9[35845]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 31 02:13:49 np0005603622 python3.9[35999]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:13:50 np0005603622 python3.9[36151]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 31 02:13:52 np0005603622 python3.9[36303]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:13:52 np0005603622 python3.9[36455]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:13:53 np0005603622 python3.9[36578]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843632.2956913-667-226101247393780/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:14:00 np0005603622 python3.9[36731]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:14:01 np0005603622 python3.9[36883]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:02 np0005603622 python3.9[37036]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:14:03 np0005603622 python3.9[37188]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 31 02:14:03 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:14:05 np0005603622 python3.9[37342]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:14:06 np0005603622 python3.9[37500]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:14:06 np0005603622 python3.9[37660]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 31 02:14:07 np0005603622 python3.9[37813]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:14:08 np0005603622 python3.9[37971]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 31 02:14:09 np0005603622 python3.9[38123]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:14:13 np0005603622 python3.9[38276]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:14:14 np0005603622 python3.9[38428]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:14:14 np0005603622 python3.9[38551]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843653.6451302-1024-12309842997585/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:14:15 np0005603622 python3.9[38703]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:14:15 np0005603622 systemd[1]: Starting Load Kernel Modules...
Jan 31 02:14:15 np0005603622 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 31 02:14:15 np0005603622 kernel: Bridge firewalling registered
Jan 31 02:14:15 np0005603622 systemd-modules-load[38707]: Inserted module 'br_netfilter'
Jan 31 02:14:15 np0005603622 systemd[1]: Finished Load Kernel Modules.
Jan 31 02:14:16 np0005603622 python3.9[38863]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:14:17 np0005603622 python3.9[38986]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843656.1094897-1093-129215209233478/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:14:18 np0005603622 python3.9[39138]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:14:20 np0005603622 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Jan 31 02:14:21 np0005603622 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Jan 31 02:14:21 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:14:21 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:14:21 np0005603622 systemd[1]: Reloading.
Jan 31 02:14:21 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:14:21 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:14:24 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:14:24 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:14:24 np0005603622 systemd[1]: man-db-cache-update.service: Consumed 3.427s CPU time.
Jan 31 02:14:24 np0005603622 systemd[1]: run-r25a43154e3344d79bd55c11ff67902c1.service: Deactivated successfully.
Jan 31 02:14:25 np0005603622 python3.9[42889]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:14:26 np0005603622 python3.9[43041]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 31 02:14:26 np0005603622 python3.9[43191]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:14:27 np0005603622 python3.9[43343]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:27 np0005603622 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 02:14:28 np0005603622 systemd[1]: Starting Authorization Manager...
Jan 31 02:14:28 np0005603622 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 02:14:28 np0005603622 polkitd[43560]: Started polkitd version 0.117
Jan 31 02:14:28 np0005603622 systemd[1]: Started Authorization Manager.
Jan 31 02:14:29 np0005603622 python3.9[43730]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:14:29 np0005603622 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 31 02:14:29 np0005603622 systemd[1]: tuned.service: Deactivated successfully.
Jan 31 02:14:29 np0005603622 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 31 02:14:29 np0005603622 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 02:14:29 np0005603622 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 02:14:30 np0005603622 python3.9[43891]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 31 02:14:34 np0005603622 python3.9[44043]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:14:34 np0005603622 systemd[1]: Reloading.
Jan 31 02:14:34 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:14:35 np0005603622 python3.9[44231]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:14:35 np0005603622 systemd[1]: Reloading.
Jan 31 02:14:35 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:14:36 np0005603622 python3.9[44421]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:37 np0005603622 python3.9[44574]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:37 np0005603622 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 31 02:14:37 np0005603622 python3.9[44727]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:39 np0005603622 python3.9[44889]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:14:40 np0005603622 python3.9[45042]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:14:40 np0005603622 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 31 02:14:40 np0005603622 systemd[1]: Stopped Apply Kernel Variables.
Jan 31 02:14:40 np0005603622 systemd[1]: Stopping Apply Kernel Variables...
Jan 31 02:14:40 np0005603622 systemd[1]: Starting Apply Kernel Variables...
Jan 31 02:14:40 np0005603622 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 31 02:14:40 np0005603622 systemd[1]: Finished Apply Kernel Variables.
Jan 31 02:14:41 np0005603622 systemd[1]: session-10.scope: Deactivated successfully.
Jan 31 02:14:41 np0005603622 systemd[1]: session-10.scope: Consumed 2min 6.547s CPU time.
Jan 31 02:14:41 np0005603622 systemd-logind[817]: Session 10 logged out. Waiting for processes to exit.
Jan 31 02:14:41 np0005603622 systemd-logind[817]: Removed session 10.
Jan 31 02:14:46 np0005603622 systemd-logind[817]: New session 11 of user zuul.
Jan 31 02:14:46 np0005603622 systemd[1]: Started Session 11 of User zuul.
Jan 31 02:14:47 np0005603622 python3.9[45225]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:14:48 np0005603622 python3.9[45381]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 31 02:14:49 np0005603622 python3.9[45534]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:14:50 np0005603622 python3.9[45692]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:14:51 np0005603622 python3.9[45852]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:14:52 np0005603622 python3.9[45936]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:14:56 np0005603622 python3.9[46100]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:11 np0005603622 kernel: SELinux:  Converting 2741 SID table entries...
Jan 31 02:15:11 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:15:11 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:15:11 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:15:11 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:15:11 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:15:11 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:15:11 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:15:11 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 31 02:15:11 np0005603622 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 31 02:15:13 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:15:13 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:15:13 np0005603622 systemd[1]: Reloading.
Jan 31 02:15:13 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:13 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:13 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:15:14 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:15:14 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:15:14 np0005603622 systemd[1]: run-rcd5d38fdd3fe4edbabeb40e19aa46649.service: Deactivated successfully.
Jan 31 02:15:15 np0005603622 python3.9[47200]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:15:15 np0005603622 systemd[1]: Reloading.
Jan 31 02:15:15 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:15 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:15 np0005603622 systemd[1]: Starting Open vSwitch Database Unit...
Jan 31 02:15:15 np0005603622 chown[47242]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 31 02:15:15 np0005603622 ovs-ctl[47247]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 31 02:15:15 np0005603622 ovs-ctl[47247]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 31 02:15:15 np0005603622 ovs-ctl[47247]: Starting ovsdb-server [  OK  ]
Jan 31 02:15:15 np0005603622 ovs-vsctl[47296]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 31 02:15:16 np0005603622 ovs-vsctl[47316]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"bd097fed-e54b-4ed7-90f0-078b39b8b13a\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 31 02:15:16 np0005603622 ovs-ctl[47247]: Configuring Open vSwitch system IDs [  OK  ]
Jan 31 02:15:16 np0005603622 ovs-ctl[47247]: Enabling remote OVSDB managers [  OK  ]
Jan 31 02:15:16 np0005603622 systemd[1]: Started Open vSwitch Database Unit.
Jan 31 02:15:16 np0005603622 ovs-vsctl[47322]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 31 02:15:16 np0005603622 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 31 02:15:16 np0005603622 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 31 02:15:16 np0005603622 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 31 02:15:16 np0005603622 kernel: openvswitch: Open vSwitch switching datapath
Jan 31 02:15:16 np0005603622 ovs-ctl[47366]: Inserting openvswitch module [  OK  ]
Jan 31 02:15:16 np0005603622 ovs-ctl[47335]: Starting ovs-vswitchd [  OK  ]
Jan 31 02:15:16 np0005603622 ovs-vsctl[47387]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 31 02:15:16 np0005603622 ovs-ctl[47335]: Enabling remote OVSDB managers [  OK  ]
Jan 31 02:15:16 np0005603622 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 31 02:15:16 np0005603622 systemd[1]: Starting Open vSwitch...
Jan 31 02:15:16 np0005603622 systemd[1]: Finished Open vSwitch.
Jan 31 02:15:17 np0005603622 python3.9[47538]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:15:18 np0005603622 python3.9[47690]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 31 02:15:19 np0005603622 kernel: SELinux:  Converting 2755 SID table entries...
Jan 31 02:15:19 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:15:19 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:15:19 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:15:19 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:15:19 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:15:19 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:15:19 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:15:22 np0005603622 python3.9[47845]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:15:23 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 31 02:15:23 np0005603622 python3.9[48003]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:25 np0005603622 python3.9[48156]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:15:27 np0005603622 python3.9[48443]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 31 02:15:28 np0005603622 python3.9[48593]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:15:29 np0005603622 python3.9[48747]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:31 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:15:31 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:15:31 np0005603622 systemd[1]: Reloading.
Jan 31 02:15:31 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:31 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:31 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:15:31 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:15:31 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:15:31 np0005603622 systemd[1]: run-rbefaabee43344873b326f6c5aba11bb1.service: Deactivated successfully.
Jan 31 02:15:32 np0005603622 python3.9[49064]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:15:32 np0005603622 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 31 02:15:32 np0005603622 systemd[1]: Stopped Network Manager Wait Online.
Jan 31 02:15:32 np0005603622 systemd[1]: Stopping Network Manager Wait Online...
Jan 31 02:15:32 np0005603622 systemd[1]: Stopping Network Manager...
Jan 31 02:15:32 np0005603622 NetworkManager[7207]: <info>  [1769843732.9396] caught SIGTERM, shutting down normally.
Jan 31 02:15:32 np0005603622 NetworkManager[7207]: <info>  [1769843732.9406] dhcp4 (eth0): canceled DHCP transaction
Jan 31 02:15:32 np0005603622 NetworkManager[7207]: <info>  [1769843732.9407] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 02:15:32 np0005603622 NetworkManager[7207]: <info>  [1769843732.9407] dhcp4 (eth0): state changed no lease
Jan 31 02:15:32 np0005603622 NetworkManager[7207]: <info>  [1769843732.9409] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 02:15:32 np0005603622 NetworkManager[7207]: <info>  [1769843732.9455] exiting (success)
Jan 31 02:15:32 np0005603622 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 02:15:32 np0005603622 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 02:15:32 np0005603622 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 31 02:15:32 np0005603622 systemd[1]: Stopped Network Manager.
Jan 31 02:15:32 np0005603622 systemd[1]: NetworkManager.service: Consumed 20.132s CPU time, 4.1M memory peak, read 0B from disk, written 31.0K to disk.
Jan 31 02:15:32 np0005603622 systemd[1]: Starting Network Manager...
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0088] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:4be470f6-96ef-472f-95f9-87bd5086f491)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0090] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0133] manager[0x55e933ccb000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 31 02:15:33 np0005603622 systemd[1]: Starting Hostname Service...
Jan 31 02:15:33 np0005603622 systemd[1]: Started Hostname Service.
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0811] hostname: hostname: using hostnamed
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0811] hostname: static hostname changed from (none) to "compute-1"
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0816] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0819] manager[0x55e933ccb000]: rfkill: Wi-Fi hardware radio set enabled
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0819] manager[0x55e933ccb000]: rfkill: WWAN hardware radio set enabled
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0839] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0847] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0848] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0848] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0848] manager: Networking is enabled by state file
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0850] settings: Loaded settings plugin: keyfile (internal)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0854] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0879] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0887] dhcp: init: Using DHCP client 'internal'
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0889] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0893] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0897] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0904] device (lo): Activation: starting connection 'lo' (77165707-765c-458b-b9f6-329fd98abdae)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0910] device (eth0): carrier: link connected
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0913] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0917] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0917] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0921] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0933] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0937] device (eth1): carrier: link connected
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0940] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0944] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (1904bc9a-b607-5e45-8cb7-a48d9ada9b24) (indicated)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0945] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0949] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0955] device (eth1): Activation: starting connection 'ci-private-network' (1904bc9a-b607-5e45-8cb7-a48d9ada9b24)
Jan 31 02:15:33 np0005603622 systemd[1]: Started Network Manager.
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0965] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0972] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0974] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0977] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0979] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0983] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0985] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0987] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0990] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0996] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.0998] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1006] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1016] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1033] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1035] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1039] device (lo): Activation: successful, device activated.
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1047] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1049] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1053] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1057] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1059] device (eth1): Activation: successful, device activated.
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1074] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 31 02:15:33 np0005603622 systemd[1]: Starting Network Manager Wait Online...
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1226] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1378] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1380] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1384] manager: NetworkManager state is now CONNECTED_SITE
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1387] device (eth0): Activation: successful, device activated.
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1391] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 31 02:15:33 np0005603622 NetworkManager[49080]: <info>  [1769843733.1394] manager: startup complete
Jan 31 02:15:33 np0005603622 systemd[1]: Finished Network Manager Wait Online.
Jan 31 02:15:33 np0005603622 python3.9[49290]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:15:40 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:15:40 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:15:40 np0005603622 systemd[1]: Reloading.
Jan 31 02:15:40 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:15:40 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:15:40 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:15:42 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:15:42 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:15:42 np0005603622 systemd[1]: run-rfd1b121dee3846edb7ebf6b99810acf2.service: Deactivated successfully.
Jan 31 02:15:43 np0005603622 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 02:15:43 np0005603622 python3.9[49749]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:15:44 np0005603622 python3.9[49901]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:45 np0005603622 python3.9[50055]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:46 np0005603622 python3.9[50207]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:46 np0005603622 python3.9[50359]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:47 np0005603622 python3.9[50511]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:48 np0005603622 python3.9[50663]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:15:49 np0005603622 python3.9[50786]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843747.810063-648-134813257005640/.source _original_basename=.3l8qm8d2 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:50 np0005603622 python3.9[50938]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:51 np0005603622 python3.9[51090]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 31 02:15:52 np0005603622 python3.9[51242]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:15:55 np0005603622 python3.9[51669]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 31 02:15:56 np0005603622 ansible-async_wrapper.py[51844]: Invoked with j477126242870 300 /home/zuul/.ansible/tmp/ansible-tmp-1769843755.718501-846-240511731939920/AnsiballZ_edpm_os_net_config.py _
Jan 31 02:15:56 np0005603622 ansible-async_wrapper.py[51847]: Starting module and watcher
Jan 31 02:15:56 np0005603622 ansible-async_wrapper.py[51847]: Start watching 51848 (300)
Jan 31 02:15:56 np0005603622 ansible-async_wrapper.py[51848]: Start module (51848)
Jan 31 02:15:56 np0005603622 ansible-async_wrapper.py[51844]: Return async_wrapper task started.
Jan 31 02:15:57 np0005603622 python3.9[51849]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 31 02:15:57 np0005603622 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 31 02:15:57 np0005603622 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 31 02:15:57 np0005603622 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 31 02:15:57 np0005603622 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 31 02:15:57 np0005603622 kernel: cfg80211: failed to load regulatory.db
Jan 31 02:15:58 np0005603622 NetworkManager[49080]: <info>  [1769843758.9523] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51850 uid=0 result="success"
Jan 31 02:15:58 np0005603622 NetworkManager[49080]: <info>  [1769843758.9536] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0021] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0024] audit: op="connection-add" uuid="bff5575e-f1ee-4858-84f8-82f66b68dc94" name="br-ex-br" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0038] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0040] audit: op="connection-add" uuid="dd54f91e-687b-4aae-a6a9-9419c0d561ab" name="br-ex-port" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0051] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0054] audit: op="connection-add" uuid="cb472d1d-44d0-4cfa-887f-ced0a7aa5d5b" name="eth1-port" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0064] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0066] audit: op="connection-add" uuid="22a9e37a-527f-469e-bb53-c047bb05bc77" name="vlan20-port" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0078] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0080] audit: op="connection-add" uuid="504bf66b-0ad0-4025-af11-19ab351e8390" name="vlan21-port" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0092] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0094] audit: op="connection-add" uuid="0917909f-fbfc-4362-91bd-7b3ce35bdb4a" name="vlan22-port" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0104] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0106] audit: op="connection-add" uuid="33031495-5a49-4184-b04b-95ef96071c9d" name="vlan23-port" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0126] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.timestamp,connection.autoconnect-priority" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0140] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0143] audit: op="connection-add" uuid="fcc990ab-6473-4961-be59-40601203705d" name="br-ex-if" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0540] audit: op="connection-update" uuid="1904bc9a-b607-5e45-8cb7-a48d9ada9b24" name="ci-private-network" args="ipv6.routes,ipv6.method,ipv6.dns,ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.addresses,ovs-external-ids.data,ipv4.routing-rules,ipv4.method,ipv4.dns,ipv4.never-default,ipv4.routes,ipv4.addresses,connection.slave-type,connection.timestamp,connection.controller,connection.master,connection.port-type,ovs-interface.type" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0562] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0565] audit: op="connection-add" uuid="3df10793-7d8a-41de-a038-3f40493c21dc" name="vlan20-if" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0583] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0585] audit: op="connection-add" uuid="419d44b5-fb39-4d43-b2db-491bb0bbacd6" name="vlan21-if" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0600] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0603] audit: op="connection-add" uuid="0e840d95-fb95-4441-b66d-25cbabdce5a8" name="vlan22-if" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0618] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0621] audit: op="connection-add" uuid="08620803-c9b1-4939-b30c-46d4f03234b5" name="vlan23-if" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0632] audit: op="connection-delete" uuid="41135365-bff3-3539-a318-7de98caf3b09" name="Wired connection 1" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0643] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0647] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0656] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0662] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (bff5575e-f1ee-4858-84f8-82f66b68dc94)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0664] audit: op="connection-activate" uuid="bff5575e-f1ee-4858-84f8-82f66b68dc94" name="br-ex-br" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0667] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0668] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0674] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0678] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (dd54f91e-687b-4aae-a6a9-9419c0d561ab)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0679] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0680] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0684] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0688] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (cb472d1d-44d0-4cfa-887f-ced0a7aa5d5b)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0690] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0691] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0695] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0699] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (22a9e37a-527f-469e-bb53-c047bb05bc77)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0700] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0701] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0706] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0709] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (504bf66b-0ad0-4025-af11-19ab351e8390)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0711] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0711] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0716] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0720] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (0917909f-fbfc-4362-91bd-7b3ce35bdb4a)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0721] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0730] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0736] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0740] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (33031495-5a49-4184-b04b-95ef96071c9d)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0741] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0744] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0746] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0752] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0754] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0757] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0761] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (fcc990ab-6473-4961-be59-40601203705d)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0762] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0765] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0767] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0769] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0770] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0780] device (eth1): disconnecting for new activation request.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0780] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0783] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0785] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0787] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0789] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0790] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0793] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0796] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (3df10793-7d8a-41de-a038-3f40493c21dc)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0797] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0800] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0801] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0803] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0805] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0805] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0808] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0811] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (419d44b5-fb39-4d43-b2db-491bb0bbacd6)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0812] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0814] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0816] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0817] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0819] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0820] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0823] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0826] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (0e840d95-fb95-4441-b66d-25cbabdce5a8)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0827] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0829] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0831] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0832] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0834] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <warn>  [1769843759.0836] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0839] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0850] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (08620803-c9b1-4939-b30c-46d4f03234b5)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0851] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0855] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0857] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0858] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0860] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0872] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0875] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0878] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0879] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0887] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0891] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0895] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0898] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0899] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0904] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0908] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0912] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0914] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 kernel: ovs-system: entered promiscuous mode
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0929] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0933] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0937] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0939] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 kernel: Timeout policy base is empty
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0943] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0947] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0949] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0950] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0954] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0957] dhcp4 (eth0): canceled DHCP transaction
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0957] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0957] dhcp4 (eth0): state changed no lease
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.0959] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 31 02:15:59 np0005603622 systemd-udevd[51854]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1002] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1006] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51850 uid=0 result="fail" reason="Device is not activated"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1015] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1239] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1243] dhcp4 (eth0): state changed new lease, address=38.102.83.45
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1252] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1610] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 31 02:15:59 np0005603622 kernel: br-ex: entered promiscuous mode
Jan 31 02:15:59 np0005603622 kernel: vlan22: entered promiscuous mode
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1705] device (eth1): Activation: starting connection 'ci-private-network' (1904bc9a-b607-5e45-8cb7-a48d9ada9b24)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1710] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1712] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1713] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1714] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1716] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1717] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1718] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 systemd-udevd[51856]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1720] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1733] device (eth1): disconnecting for new activation request.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1734] audit: op="connection-activate" uuid="1904bc9a-b607-5e45-8cb7-a48d9ada9b24" name="ci-private-network" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1738] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1743] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1748] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1754] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 kernel: vlan21: entered promiscuous mode
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1777] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1780] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1784] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1788] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1793] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1797] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1801] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1805] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1809] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1814] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 kernel: vlan23: entered promiscuous mode
Jan 31 02:15:59 np0005603622 systemd-udevd[51855]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1838] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1851] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603622 kernel: vlan20: entered promiscuous mode
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1881] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1892] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1892] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51850 uid=0 result="success"
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1893] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1900] device (eth1): Activation: starting connection 'ci-private-network' (1904bc9a-b607-5e45-8cb7-a48d9ada9b24)
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1916] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1920] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1935] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1943] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1957] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1958] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1961] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1967] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1978] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1991] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.1993] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2008] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2227] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2231] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2236] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2240] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2243] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2248] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2258] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2264] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2271] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2278] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2285] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2292] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2298] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2305] device (eth1): Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2313] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2321] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2330] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 31 02:15:59 np0005603622 NetworkManager[49080]: <info>  [1769843759.2338] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 31 02:16:00 np0005603622 NetworkManager[49080]: <info>  [1769843760.3798] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51850 uid=0 result="success"
Jan 31 02:16:00 np0005603622 NetworkManager[49080]: <info>  [1769843760.5370] checkpoint[0x55e933c9f950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 31 02:16:00 np0005603622 NetworkManager[49080]: <info>  [1769843760.5373] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51850 uid=0 result="success"
Jan 31 02:16:01 np0005603622 ansible-async_wrapper.py[51847]: 51848 still running (300)
Jan 31 02:16:01 np0005603622 NetworkManager[49080]: <info>  [1769843761.9368] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51850 uid=0 result="success"
Jan 31 02:16:01 np0005603622 NetworkManager[49080]: <info>  [1769843761.9384] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51850 uid=0 result="success"
Jan 31 02:16:02 np0005603622 python3.9[52216]: ansible-ansible.legacy.async_status Invoked with jid=j477126242870.51844 mode=status _async_dir=/root/.ansible_async
Jan 31 02:16:02 np0005603622 NetworkManager[49080]: <info>  [1769843762.1633] audit: op="networking-control" arg="global-dns-configuration" pid=51850 uid=0 result="success"
Jan 31 02:16:02 np0005603622 NetworkManager[49080]: <info>  [1769843762.1693] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 31 02:16:02 np0005603622 NetworkManager[49080]: <info>  [1769843762.1749] audit: op="networking-control" arg="global-dns-configuration" pid=51850 uid=0 result="success"
Jan 31 02:16:02 np0005603622 NetworkManager[49080]: <info>  [1769843762.1783] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51850 uid=0 result="success"
Jan 31 02:16:02 np0005603622 NetworkManager[49080]: <info>  [1769843762.3293] checkpoint[0x55e933c9fa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 31 02:16:02 np0005603622 NetworkManager[49080]: <info>  [1769843762.3296] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51850 uid=0 result="success"
Jan 31 02:16:02 np0005603622 ansible-async_wrapper.py[51848]: Module complete (51848)
Jan 31 02:16:03 np0005603622 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 02:16:05 np0005603622 python3.9[52321]: ansible-ansible.legacy.async_status Invoked with jid=j477126242870.51844 mode=status _async_dir=/root/.ansible_async
Jan 31 02:16:06 np0005603622 python3.9[52420]: ansible-ansible.legacy.async_status Invoked with jid=j477126242870.51844 mode=cleanup _async_dir=/root/.ansible_async
Jan 31 02:16:06 np0005603622 python3.9[52572]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:06 np0005603622 ansible-async_wrapper.py[51847]: Done in kid B.
Jan 31 02:16:07 np0005603622 python3.9[52695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843766.2704155-927-160753537637824/.source.returncode _original_basename=._h2p1_75 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:10 np0005603622 python3.9[52848]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:11 np0005603622 python3.9[52972]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843770.275928-975-197459935589013/.source.cfg _original_basename=.frngs4hb follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:12 np0005603622 python3.9[53124]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:16:12 np0005603622 systemd[1]: Reloading Network Manager...
Jan 31 02:16:12 np0005603622 NetworkManager[49080]: <info>  [1769843772.1001] audit: op="reload" arg="0" pid=53128 uid=0 result="success"
Jan 31 02:16:12 np0005603622 NetworkManager[49080]: <info>  [1769843772.1008] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 31 02:16:12 np0005603622 systemd[1]: Reloaded Network Manager.
Jan 31 02:16:12 np0005603622 systemd[1]: session-11.scope: Deactivated successfully.
Jan 31 02:16:12 np0005603622 systemd[1]: session-11.scope: Consumed 50.039s CPU time.
Jan 31 02:16:12 np0005603622 systemd-logind[817]: Session 11 logged out. Waiting for processes to exit.
Jan 31 02:16:12 np0005603622 systemd-logind[817]: Removed session 11.
Jan 31 02:16:18 np0005603622 systemd-logind[817]: New session 12 of user zuul.
Jan 31 02:16:18 np0005603622 systemd[1]: Started Session 12 of User zuul.
Jan 31 02:16:19 np0005603622 python3.9[53312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:20 np0005603622 python3.9[53466]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:16:21 np0005603622 python3.9[53660]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:16:21 np0005603622 systemd[1]: session-12.scope: Deactivated successfully.
Jan 31 02:16:21 np0005603622 systemd[1]: session-12.scope: Consumed 1.859s CPU time.
Jan 31 02:16:21 np0005603622 systemd-logind[817]: Session 12 logged out. Waiting for processes to exit.
Jan 31 02:16:21 np0005603622 systemd-logind[817]: Removed session 12.
Jan 31 02:16:22 np0005603622 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 31 02:16:28 np0005603622 systemd-logind[817]: New session 13 of user zuul.
Jan 31 02:16:28 np0005603622 systemd[1]: Started Session 13 of User zuul.
Jan 31 02:16:29 np0005603622 python3.9[53843]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:30 np0005603622 python3.9[53998]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:31 np0005603622 python3.9[54154]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:16:32 np0005603622 python3.9[54238]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:16:34 np0005603622 python3.9[54391]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:16:35 np0005603622 python3.9[54586]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:36 np0005603622 python3.9[54738]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:16:36 np0005603622 systemd[1]: var-lib-containers-storage-overlay-compat4070485719-merged.mount: Deactivated successfully.
Jan 31 02:16:37 np0005603622 podman[54739]: 2026-01-31 07:16:37.231255753 +0000 UTC m=+0.436393728 system refresh
Jan 31 02:16:37 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:16:38 np0005603622 python3.9[54902]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:38 np0005603622 python3.9[55025]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843797.4190443-197-57201267715157/.source.json follow=False _original_basename=podman_network_config.j2 checksum=7eabfd3cd218f2880e2af029833c019b38a90213 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:16:39 np0005603622 python3.9[55177]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:16:39 np0005603622 python3.9[55300]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843798.913114-242-30798465880856/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51dca2f6e7d675b0597f23a4e044edd3f4faff03 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:40 np0005603622 python3.9[55452]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:41 np0005603622 python3.9[55604]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:42 np0005603622 python3.9[55756]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:42 np0005603622 python3.9[55908]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:16:43 np0005603622 python3.9[56060]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:16:46 np0005603622 python3.9[56213]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:16:47 np0005603622 python3.9[56367]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:16:48 np0005603622 python3.9[56519]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:16:48 np0005603622 python3.9[56671]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:16:49 np0005603622 python3.9[56824]: ansible-service_facts Invoked
Jan 31 02:16:49 np0005603622 network[56841]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:16:49 np0005603622 network[56842]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:16:49 np0005603622 network[56843]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:16:58 np0005603622 python3.9[57295]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:17:02 np0005603622 python3.9[57448]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 31 02:17:03 np0005603622 python3.9[57600]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:04 np0005603622 python3.9[57725]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843823.28594-675-63672346946727/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:05 np0005603622 python3.9[57879]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:05 np0005603622 python3.9[58004]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843824.5912378-720-225165551175766/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:07 np0005603622 python3.9[58158]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:08 np0005603622 python3.9[58312]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:17:09 np0005603622 python3.9[58396]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:12 np0005603622 python3.9[58550]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:17:13 np0005603622 python3.9[58634]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:17:13 np0005603622 chronyd[826]: chronyd exiting
Jan 31 02:17:13 np0005603622 systemd[1]: Stopping NTP client/server...
Jan 31 02:17:13 np0005603622 systemd[1]: chronyd.service: Deactivated successfully.
Jan 31 02:17:13 np0005603622 systemd[1]: Stopped NTP client/server.
Jan 31 02:17:13 np0005603622 systemd[1]: Starting NTP client/server...
Jan 31 02:17:13 np0005603622 chronyd[58642]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 31 02:17:13 np0005603622 chronyd[58642]: Frequency -27.123 +/- 0.128 ppm read from /var/lib/chrony/drift
Jan 31 02:17:13 np0005603622 chronyd[58642]: Loaded seccomp filter (level 2)
Jan 31 02:17:13 np0005603622 systemd[1]: Started NTP client/server.
Jan 31 02:17:13 np0005603622 systemd[1]: session-13.scope: Deactivated successfully.
Jan 31 02:17:13 np0005603622 systemd[1]: session-13.scope: Consumed 22.839s CPU time.
Jan 31 02:17:13 np0005603622 systemd-logind[817]: Session 13 logged out. Waiting for processes to exit.
Jan 31 02:17:13 np0005603622 systemd-logind[817]: Removed session 13.
Jan 31 02:17:19 np0005603622 systemd-logind[817]: New session 14 of user zuul.
Jan 31 02:17:19 np0005603622 systemd[1]: Started Session 14 of User zuul.
Jan 31 02:17:20 np0005603622 python3.9[58823]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:21 np0005603622 python3.9[58975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:21 np0005603622 python3.9[59098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843840.4532287-63-233628859016956/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:22 np0005603622 systemd[1]: session-14.scope: Deactivated successfully.
Jan 31 02:17:22 np0005603622 systemd[1]: session-14.scope: Consumed 1.408s CPU time.
Jan 31 02:17:22 np0005603622 systemd-logind[817]: Session 14 logged out. Waiting for processes to exit.
Jan 31 02:17:22 np0005603622 systemd-logind[817]: Removed session 14.
Jan 31 02:17:27 np0005603622 systemd-logind[817]: New session 15 of user zuul.
Jan 31 02:17:27 np0005603622 systemd[1]: Started Session 15 of User zuul.
Jan 31 02:17:28 np0005603622 python3.9[59276]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:17:29 np0005603622 python3.9[59432]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:30 np0005603622 python3.9[59607]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:30 np0005603622 python3.9[59730]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769843849.6345437-84-264646401290399/.source.json _original_basename=.d1zrtv6k follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:31 np0005603622 python3.9[59882]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:32 np0005603622 python3.9[60005]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843851.4099557-153-63464893908863/.source _original_basename=.2f4cx6yb follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:33 np0005603622 python3.9[60157]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:17:33 np0005603622 python3.9[60309]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:34 np0005603622 python3.9[60432]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843853.28819-225-238035520681549/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:17:34 np0005603622 python3.9[60584]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:35 np0005603622 python3.9[60707]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769843854.3363965-225-192034410459873/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:17:36 np0005603622 python3.9[60859]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:36 np0005603622 python3.9[61011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:37 np0005603622 python3.9[61134]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843856.5375383-336-242830445188861/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:38 np0005603622 python3.9[61286]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:38 np0005603622 python3.9[61409]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843857.7388496-381-222000904018487/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:39 np0005603622 python3.9[61561]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:39 np0005603622 systemd[1]: Reloading.
Jan 31 02:17:40 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:40 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:40 np0005603622 systemd[1]: Reloading.
Jan 31 02:17:40 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:40 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:40 np0005603622 systemd[1]: Starting EDPM Container Shutdown...
Jan 31 02:17:40 np0005603622 systemd[1]: Finished EDPM Container Shutdown.
Jan 31 02:17:41 np0005603622 python3.9[61788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:41 np0005603622 python3.9[61911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843860.8442755-450-272050496767597/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:42 np0005603622 python3.9[62063]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:42 np0005603622 python3.9[62186]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843861.961687-495-153034157759500/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:43 np0005603622 python3.9[62338]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:43 np0005603622 systemd[1]: Reloading.
Jan 31 02:17:43 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:43 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:43 np0005603622 systemd[1]: Reloading.
Jan 31 02:17:43 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:43 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:44 np0005603622 systemd[1]: Starting Create netns directory...
Jan 31 02:17:44 np0005603622 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:17:44 np0005603622 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:17:44 np0005603622 systemd[1]: Finished Create netns directory.
Jan 31 02:17:45 np0005603622 python3.9[62564]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:17:45 np0005603622 network[62581]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:17:45 np0005603622 network[62582]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:17:45 np0005603622 network[62583]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:17:49 np0005603622 python3.9[62845]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:49 np0005603622 systemd[1]: Reloading.
Jan 31 02:17:49 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:49 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:49 np0005603622 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 31 02:17:49 np0005603622 iptables.init[62885]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 31 02:17:49 np0005603622 iptables.init[62885]: iptables: Flushing firewall rules: [  OK  ]
Jan 31 02:17:49 np0005603622 systemd[1]: iptables.service: Deactivated successfully.
Jan 31 02:17:49 np0005603622 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 31 02:17:50 np0005603622 python3.9[63082]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:51 np0005603622 python3.9[63236]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:17:51 np0005603622 systemd[1]: Reloading.
Jan 31 02:17:51 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:17:51 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:17:51 np0005603622 systemd[1]: Starting Netfilter Tables...
Jan 31 02:17:51 np0005603622 systemd[1]: Finished Netfilter Tables.
Jan 31 02:17:53 np0005603622 python3.9[63427]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:17:54 np0005603622 python3.9[63580]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:54 np0005603622 python3.9[63705]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843873.9069574-702-117073869448196/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:55 np0005603622 python3.9[63858]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:17:55 np0005603622 systemd[1]: Reloading OpenSSH server daemon...
Jan 31 02:17:55 np0005603622 systemd[1]: Reloaded OpenSSH server daemon.
Jan 31 02:17:56 np0005603622 python3.9[64014]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:57 np0005603622 python3.9[64166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:17:57 np0005603622 python3.9[64289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843876.7654257-795-187004051898590/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:17:58 np0005603622 python3.9[64441]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 02:17:58 np0005603622 systemd[1]: Starting Time & Date Service...
Jan 31 02:17:58 np0005603622 systemd[1]: Started Time & Date Service.
Jan 31 02:17:59 np0005603622 python3.9[64597]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:01 np0005603622 python3.9[64749]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:01 np0005603622 python3.9[64872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843880.2797408-900-237999816491301/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:02 np0005603622 python3.9[65024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:02 np0005603622 python3.9[65147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769843881.950086-945-53651084212763/.source.yaml _original_basename=.io1_10g6 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:03 np0005603622 python3.9[65299]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:03 np0005603622 python3.9[65422]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843883.0482476-990-225935906334797/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:04 np0005603622 python3.9[65574]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:05 np0005603622 python3.9[65727]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:06 np0005603622 python3[65880]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:18:06 np0005603622 python3.9[66032]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:07 np0005603622 python3.9[66155]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843886.3384438-1107-58301602533743/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:08 np0005603622 python3.9[66307]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:08 np0005603622 python3.9[66430]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843888.0352447-1152-229558428260384/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:10 np0005603622 python3.9[66582]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:10 np0005603622 python3.9[66705]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843889.6515267-1197-131371590514512/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:11 np0005603622 python3.9[66857]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:11 np0005603622 python3.9[66980]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843890.8534756-1242-139692389634144/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:12 np0005603622 python3.9[67132]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:18:13 np0005603622 python3.9[67255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769843892.1201196-1287-13294502824509/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:14 np0005603622 python3.9[67407]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:14 np0005603622 python3.9[67559]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:15 np0005603622 python3.9[67718]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
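[editor's note: rendered with the journal's `#012` escapes as newlines, the blockinfile task above (marker `# {mark} ANSIBLE MANAGED BLOCK`, marker_begin=BEGIN, marker_end=END) appends the following managed block to /etc/sysconfig/nftables.conf, validated with `nft -c -f %s` before it is written:]

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```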
Jan 31 02:18:16 np0005603622 python3.9[67871]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:17 np0005603622 python3.9[68023]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:18 np0005603622 python3.9[68175]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 02:18:18 np0005603622 python3.9[68328]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
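[editor's note: the two ansible.posix.mount tasks above run with boot=True, dump=0, passno=0, so they both mount the hugetlbfs filesystems and persist them. The resulting /etc/fstab entries would look approximately like this (exact whitespace is the module's; treat as a sketch):]

```
none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
none /dev/hugepages2M hugetlbfs pagesize=2M 0 0
```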
Jan 31 02:18:19 np0005603622 systemd[1]: session-15.scope: Deactivated successfully.
Jan 31 02:18:19 np0005603622 systemd[1]: session-15.scope: Consumed 28.885s CPU time.
Jan 31 02:18:19 np0005603622 systemd-logind[817]: Session 15 logged out. Waiting for processes to exit.
Jan 31 02:18:19 np0005603622 systemd-logind[817]: Removed session 15.
Jan 31 02:18:24 np0005603622 systemd-logind[817]: New session 16 of user zuul.
Jan 31 02:18:24 np0005603622 systemd[1]: Started Session 16 of User zuul.
Jan 31 02:18:25 np0005603622 python3.9[68509]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 31 02:18:26 np0005603622 python3.9[68661]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:27 np0005603622 python3.9[68813]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:18:28 np0005603622 python3.9[68965]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCSEo2WrFN8DnR2/d+p3YtsWos96nHz1MZInXN3md5cJXE0icMDwEWJuGIDUd5e0SA6Q7i33i/WIEmt/wGMoNhoTI+f3plB2NyAn5vyVQGTZv7m+tOLQI3/k50Kxnpu0c5gO509yln6RcLe4MutF0imS/fINCM+Nznh7oKbn6hELTDlxDz0JH8dNsZGmtVmgnhwIrglpxAg/WpeOWkCmuuXmysx1JcAhIK5016MzaM9cOtHAGzj5s0GE7nQoH4yG0Ak3zMU/DPKr91Xq/m9PCnGKautoHmHgrEG6u+1WubtakbBxlfmroKbvrIFL6KKQzY0SiTrBsH3nZRaFGCqE0ZEyHvJz8AO3quWg2oaXRJWN98f7k3l5dtVJIuwyJxVnv6fUGuLbGxOp4T6UDPqC7b2Eg17EtpUjy77F/+8yrX6NH+hXwcWBwHelRCDSiceGQTm1uexb8Xo1R1Wt9h24H2yRKPFrqzf1R9J2vipDouDo7RLefAiCXEJDdlewdKUM5c=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILwGOCpzCDE8uIHb4RBldbKfEvxhUdsBT4K7sPU4vZLU#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLS8teLqq0Lmt8g22OKhtEhLCXd5cBLM6W2oDJcWxQl8DloBMMFjgDlHt0rzjMKEL0SpxkPbH7sPV1zbWKKJI9M=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAgshePGnD7oc3Zg8kfD9lUGSfPfE1OzPUGBHE12jLoyHnXwKTxYFYSMTWRcYgdFu4HaP0ShO1gEQF+1nDXxrozH/m2qxK/YPC5cVYCPvscwRdlyUNPOV0rpiruVZptTQ1iibsmRwMbxliXD2t13CtsrNjy9iuLgtvvnkfUh0wZKcZ8Jglg6E4vRTBPgXo3fJCfPF9Iz7GE50DpWAU8OnoLNlOf54/tcd8CyOrmLF9RwHTgNtN9FXscdQ3/A8avCF0WPWNUmfLFc20yOtfrq/xxjJMLn4KOZu1D1yjK5BSJu2pv/j0NPrTFKgPKYWjiXPdttcyubkXNZP96jkK9dgTgsEGRKuM83QpDIu7823wv4/GtEi+IsJeyqCN+3VAJo9hDB9eES8qlX4jAg6Kxen1oNkL9M+tz7N0BSdnxbS3skWEw6MsHlsBLOw7KMYe8gq8JoqHLBKBFQZZbjwaK5kNTeu6l5zAYERpt8uAEZkplq2vV5+4EOh7RPncmKuH0Xs=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF//s6MNfOt3MK/jBcrJ5VkyeSY5eg1jUHN32BLTGZtT#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEtIHGkRVmmqcsRoXLuIEWyuaX3BoKld3DircbfvRpdFLzOwbxRaZ6uUN5f7sBun3oAcQLdnixnG3R/YK8L7HpM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCXc4C5rCDfOfKEuMVHI9SatZ+NRO9lp335K0yZ19CDCOGSUNO2lblpRlgxO3tw3S+UGGiC/7/HHeZBA2Zd+SUVMb7ytbl5c3+XuZIIQF6DyIIDSELf0FoE0NhuSjKFilPsxyxxGYgH+gVaTZkuGhDoljaywQBSPGZdDwejVKWPVuui5xe0X4T0WVfT5avLSpIL3WjJ9hmzEaR0dUqrbKvPUAXJPDqQOZbQZbpXDIi48NPUDFwByej1xHWHRQaPJ/M6AsyrZKP/hiF2xt0mCIk1FANldusq4OUs9r/0KTVrPRCpSrsSimKBtEMJVdxqxAasE7H07sSdwFcWNC21LtsH8+/LM0oofIZ3D0Lom0NoLaC+Ocy2vqbIhOPYJ6c7Q8J/p4NFiA/lD+bgyjOOnm3Ls4VaaHXUyknu259henkVzJ+iZuRNY8ki345nrzPLoLYyxVwRkSuONyYlRp36jjp0QIL9kXLFlJ2OTHvb9FUhlG7RnxzPeHZhsihSHJv1rgU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAT2MDVMbPz3xtbIO31qZj2gzOQiz4a8pTNWAmd0+CUW#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFU8ym/rLGJxMpEsk09j3JHOh1hW4Vrm23tIOjn4/YJIrK1UFRFiQLDm+yZuj1NhWfbg71SK8ZuZ2miEJ20BHno=#012 create=True mode=0644 path=/tmp/ansible.gd_aep1g state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:28 np0005603622 python3.9[69117]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.gd_aep1g' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:29 np0005603622 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 02:18:29 np0005603622 python3.9[69273]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.gd_aep1g state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:30 np0005603622 systemd[1]: session-16.scope: Deactivated successfully.
Jan 31 02:18:30 np0005603622 systemd[1]: session-16.scope: Consumed 2.771s CPU time.
Jan 31 02:18:30 np0005603622 systemd-logind[817]: Session 16 logged out. Waiting for processes to exit.
Jan 31 02:18:30 np0005603622 systemd-logind[817]: Removed session 16.
Jan 31 02:18:35 np0005603622 systemd-logind[817]: New session 17 of user zuul.
Jan 31 02:18:35 np0005603622 systemd[1]: Started Session 17 of User zuul.
Jan 31 02:18:36 np0005603622 python3.9[69451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:18:37 np0005603622 python3.9[69607]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 02:18:38 np0005603622 python3.9[69761]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:18:38 np0005603622 python3.9[69914]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:39 np0005603622 python3.9[70067]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:40 np0005603622 python3.9[70221]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:40 np0005603622 python3.9[70376]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:18:41 np0005603622 systemd[1]: session-17.scope: Deactivated successfully.
Jan 31 02:18:41 np0005603622 systemd[1]: session-17.scope: Consumed 3.547s CPU time.
Jan 31 02:18:41 np0005603622 systemd-logind[817]: Session 17 logged out. Waiting for processes to exit.
Jan 31 02:18:41 np0005603622 systemd-logind[817]: Removed session 17.
Jan 31 02:18:47 np0005603622 systemd-logind[817]: New session 18 of user zuul.
Jan 31 02:18:47 np0005603622 systemd[1]: Started Session 18 of User zuul.
Jan 31 02:18:48 np0005603622 python3.9[70556]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:18:49 np0005603622 python3.9[70712]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:18:50 np0005603622 python3.9[70796]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:18:52 np0005603622 python3.9[70947]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:18:53 np0005603622 python3.9[71098]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:18:54 np0005603622 python3.9[71248]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:54 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:18:55 np0005603622 python3.9[71399]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:18:55 np0005603622 systemd[1]: session-18.scope: Deactivated successfully.
Jan 31 02:18:55 np0005603622 systemd[1]: session-18.scope: Consumed 5.425s CPU time.
Jan 31 02:18:55 np0005603622 systemd-logind[817]: Session 18 logged out. Waiting for processes to exit.
Jan 31 02:18:55 np0005603622 systemd-logind[817]: Removed session 18.
Jan 31 02:19:03 np0005603622 systemd-logind[817]: New session 19 of user zuul.
Jan 31 02:19:03 np0005603622 systemd[1]: Started Session 19 of User zuul.
Jan 31 02:19:09 np0005603622 python3[72165]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:19:11 np0005603622 python3[72260]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 31 02:19:12 np0005603622 python3[72287]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 31 02:19:13 np0005603622 python3[72313]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:19:13 np0005603622 kernel: loop: module loaded
Jan 31 02:19:13 np0005603622 kernel: loop3: detected capacity change from 0 to 14680064
Jan 31 02:19:13 np0005603622 python3[72348]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:19:13 np0005603622 lvm[72351]: PV /dev/loop3 not used.
Jan 31 02:19:13 np0005603622 lvm[72360]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:19:13 np0005603622 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 31 02:19:13 np0005603622 lvm[72362]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 31 02:19:13 np0005603622 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 31 02:19:14 np0005603622 python3[72440]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 31 02:19:14 np0005603622 python3[72513]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769843953.864937-37124-17631210811438/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:19:15 np0005603622 python3[72563]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:19:15 np0005603622 systemd[1]: Reloading.
Jan 31 02:19:15 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:19:15 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:19:15 np0005603622 systemd[1]: Starting Ceph OSD losetup...
Jan 31 02:19:15 np0005603622 bash[72602]: /dev/loop3: [64513]:4355756 (/var/lib/ceph-osd-0.img)
Jan 31 02:19:15 np0005603622 systemd[1]: Finished Ceph OSD losetup.
Jan 31 02:19:15 np0005603622 lvm[72603]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:19:15 np0005603622 lvm[72603]: VG ceph_vg0 finished
Jan 31 02:19:17 np0005603622 python3[72627]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:19:23 np0005603622 chronyd[58642]: Selected source 23.133.168.245 (pool.ntp.org)
Jan 31 02:21:19 np0005603622 systemd[1]: Created slice User Slice of UID 42477.
Jan 31 02:21:19 np0005603622 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 31 02:21:19 np0005603622 systemd-logind[817]: New session 20 of user ceph-admin.
Jan 31 02:21:19 np0005603622 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 31 02:21:19 np0005603622 systemd[1]: Starting User Manager for UID 42477...
Jan 31 02:21:20 np0005603622 systemd[72676]: Queued start job for default target Main User Target.
Jan 31 02:21:20 np0005603622 systemd[72676]: Created slice User Application Slice.
Jan 31 02:21:20 np0005603622 systemd[72676]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 02:21:20 np0005603622 systemd[72676]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 02:21:20 np0005603622 systemd[72676]: Reached target Paths.
Jan 31 02:21:20 np0005603622 systemd[72676]: Reached target Timers.
Jan 31 02:21:20 np0005603622 systemd[72676]: Starting D-Bus User Message Bus Socket...
Jan 31 02:21:20 np0005603622 systemd[72676]: Starting Create User's Volatile Files and Directories...
Jan 31 02:21:20 np0005603622 systemd[72676]: Listening on D-Bus User Message Bus Socket.
Jan 31 02:21:20 np0005603622 systemd[72676]: Reached target Sockets.
Jan 31 02:21:20 np0005603622 systemd-logind[817]: New session 22 of user ceph-admin.
Jan 31 02:21:20 np0005603622 systemd[72676]: Finished Create User's Volatile Files and Directories.
Jan 31 02:21:20 np0005603622 systemd[72676]: Reached target Basic System.
Jan 31 02:21:20 np0005603622 systemd[72676]: Reached target Main User Target.
Jan 31 02:21:20 np0005603622 systemd[72676]: Startup finished in 107ms.
Jan 31 02:21:20 np0005603622 systemd[1]: Started User Manager for UID 42477.
Jan 31 02:21:20 np0005603622 systemd[1]: Started Session 20 of User ceph-admin.
Jan 31 02:21:20 np0005603622 systemd[1]: Started Session 22 of User ceph-admin.
Jan 31 02:21:20 np0005603622 systemd-logind[817]: New session 23 of user ceph-admin.
Jan 31 02:21:20 np0005603622 systemd[1]: Started Session 23 of User ceph-admin.
Jan 31 02:21:20 np0005603622 systemd-logind[817]: New session 24 of user ceph-admin.
Jan 31 02:21:20 np0005603622 systemd[1]: Started Session 24 of User ceph-admin.
Jan 31 02:21:21 np0005603622 systemd-logind[817]: New session 25 of user ceph-admin.
Jan 31 02:21:21 np0005603622 systemd[1]: Started Session 25 of User ceph-admin.
Jan 31 02:21:21 np0005603622 systemd-logind[817]: New session 26 of user ceph-admin.
Jan 31 02:21:21 np0005603622 systemd[1]: Started Session 26 of User ceph-admin.
Jan 31 02:21:21 np0005603622 systemd-logind[817]: New session 27 of user ceph-admin.
Jan 31 02:21:21 np0005603622 systemd[1]: Started Session 27 of User ceph-admin.
Jan 31 02:21:22 np0005603622 systemd-logind[817]: New session 28 of user ceph-admin.
Jan 31 02:21:22 np0005603622 systemd[1]: Started Session 28 of User ceph-admin.
Jan 31 02:21:22 np0005603622 systemd-logind[817]: New session 29 of user ceph-admin.
Jan 31 02:21:22 np0005603622 systemd[1]: Started Session 29 of User ceph-admin.
Jan 31 02:21:23 np0005603622 systemd-logind[817]: New session 30 of user ceph-admin.
Jan 31 02:21:23 np0005603622 systemd[1]: Started Session 30 of User ceph-admin.
Jan 31 02:21:23 np0005603622 systemd-logind[817]: New session 31 of user ceph-admin.
Jan 31 02:21:23 np0005603622 systemd[1]: Started Session 31 of User ceph-admin.
Jan 31 02:21:23 np0005603622 systemd-logind[817]: New session 32 of user ceph-admin.
Jan 31 02:21:23 np0005603622 systemd[1]: Started Session 32 of User ceph-admin.
Jan 31 02:21:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:25 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:25 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:25 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:25 np0005603622 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73644 (sysctl)
Jan 31 02:21:26 np0005603622 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 31 02:21:26 np0005603622 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 31 02:21:29 np0005603622 systemd[1]: var-lib-containers-storage-overlay-compat2002430661-lower\x2dmapped.mount: Deactivated successfully.
Jan 31 02:21:41 np0005603622 podman[73922]: 2026-01-31 07:21:41.76486487 +0000 UTC m=+14.559969927 container create b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_lederberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 02:21:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck1871797866-merged.mount: Deactivated successfully.
Jan 31 02:21:41 np0005603622 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 31 02:21:41 np0005603622 systemd[1]: Started libpod-conmon-b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585.scope.
Jan 31 02:21:41 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:41 np0005603622 podman[73922]: 2026-01-31 07:21:41.837773108 +0000 UTC m=+14.632878205 container init b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_lederberg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:21:41 np0005603622 podman[73922]: 2026-01-31 07:21:41.752519686 +0000 UTC m=+14.547624763 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:41 np0005603622 podman[73922]: 2026-01-31 07:21:41.843856273 +0000 UTC m=+14.638961330 container start b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_lederberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:21:41 np0005603622 podman[73922]: 2026-01-31 07:21:41.846749431 +0000 UTC m=+14.641854508 container attach b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_lederberg, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 31 02:21:41 np0005603622 dazzling_lederberg[73982]: 167 167
Jan 31 02:21:41 np0005603622 systemd[1]: libpod-b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585.scope: Deactivated successfully.
Jan 31 02:21:41 np0005603622 podman[73922]: 2026-01-31 07:21:41.849738612 +0000 UTC m=+14.644843659 container died b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_lederberg, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2)
Jan 31 02:21:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay-a76b1a3c82fb132f11c58fc4dfe8b8340d4f5635845cc5a7c253ad6f01e51060-merged.mount: Deactivated successfully.
Jan 31 02:21:41 np0005603622 podman[73922]: 2026-01-31 07:21:41.879965472 +0000 UTC m=+14.675070519 container remove b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_lederberg, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 31 02:21:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:41 np0005603622 systemd[1]: libpod-conmon-b2b9f5abbd8e863f7ffb8d88324d495ee163ffbd726259db80af6760d4254585.scope: Deactivated successfully.
Jan 31 02:21:41 np0005603622 podman[74006]: 2026-01-31 07:21:41.990557781 +0000 UTC m=+0.036666425 container create 6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_cerf, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:21:42 np0005603622 systemd[1]: Started libpod-conmon-6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563.scope.
Jan 31 02:21:42 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:42 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8e47663b20dd160fcc6eb81fbf468ddacb92fa2bd89babac84590498e85c352/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:42 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8e47663b20dd160fcc6eb81fbf468ddacb92fa2bd89babac84590498e85c352/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:42 np0005603622 podman[74006]: 2026-01-31 07:21:42.045644905 +0000 UTC m=+0.091753549 container init 6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_cerf, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:21:42 np0005603622 podman[74006]: 2026-01-31 07:21:42.04986475 +0000 UTC m=+0.095973394 container start 6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_cerf, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:21:42 np0005603622 podman[74006]: 2026-01-31 07:21:42.053450037 +0000 UTC m=+0.099558691 container attach 6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_cerf, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS)
Jan 31 02:21:42 np0005603622 podman[74006]: 2026-01-31 07:21:41.974726462 +0000 UTC m=+0.020835146 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]: [
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:    {
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "available": false,
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "ceph_device": false,
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "lsm_data": {},
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "lvs": [],
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "path": "/dev/sr0",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "rejected_reasons": [
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "Insufficient space (<5GB)",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "Has a FileSystem"
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        ],
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        "sys_api": {
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "actuators": null,
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "device_nodes": "sr0",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "devname": "sr0",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "human_readable_size": "482.00 KB",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "id_bus": "ata",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "model": "QEMU DVD-ROM",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "nr_requests": "2",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "parent": "/dev/sr0",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "partitions": {},
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "path": "/dev/sr0",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "removable": "1",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "rev": "2.5+",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "ro": "0",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "rotational": "1",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "sas_address": "",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "sas_device_handle": "",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "scheduler_mode": "mq-deadline",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "sectors": 0,
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "sectorsize": "2048",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "size": 493568.0,
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "support_discard": "2048",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "type": "disk",
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:            "vendor": "QEMU"
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:        }
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]:    }
Jan 31 02:21:42 np0005603622 optimistic_cerf[74023]: ]
Jan 31 02:21:43 np0005603622 systemd[1]: libpod-6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563.scope: Deactivated successfully.
Jan 31 02:21:43 np0005603622 podman[74006]: 2026-01-31 07:21:43.017517233 +0000 UTC m=+1.063625897 container died 6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_cerf, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 31 02:21:43 np0005603622 systemd[1]: var-lib-containers-storage-overlay-f8e47663b20dd160fcc6eb81fbf468ddacb92fa2bd89babac84590498e85c352-merged.mount: Deactivated successfully.
Jan 31 02:21:43 np0005603622 podman[74006]: 2026-01-31 07:21:43.088302982 +0000 UTC m=+1.134411626 container remove 6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_cerf, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3)
Jan 31 02:21:43 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:43 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:43 np0005603622 systemd[1]: libpod-conmon-6c195c0ebd20179b13860b66f38cfbe788dcb874c14d86f8c219e24f5a72c563.scope: Deactivated successfully.
Jan 31 02:21:46 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:46 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:46 np0005603622 podman[76847]: 2026-01-31 07:21:46.96062133 +0000 UTC m=+0.044947010 container create 3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_buck, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 31 02:21:47 np0005603622 systemd[1]: Started libpod-conmon-3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd.scope.
Jan 31 02:21:47 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:47 np0005603622 podman[76847]: 2026-01-31 07:21:46.939566679 +0000 UTC m=+0.023892379 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:47 np0005603622 podman[76847]: 2026-01-31 07:21:47.040154857 +0000 UTC m=+0.124480557 container init 3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_buck, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:21:47 np0005603622 podman[76847]: 2026-01-31 07:21:47.046758086 +0000 UTC m=+0.131083766 container start 3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_buck, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:21:47 np0005603622 flamboyant_buck[76864]: 167 167
Jan 31 02:21:47 np0005603622 systemd[1]: libpod-3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd.scope: Deactivated successfully.
Jan 31 02:21:47 np0005603622 podman[76847]: 2026-01-31 07:21:47.053367696 +0000 UTC m=+0.137693406 container attach 3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_buck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Jan 31 02:21:47 np0005603622 podman[76847]: 2026-01-31 07:21:47.053926781 +0000 UTC m=+0.138252461 container died 3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_buck, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 31 02:21:47 np0005603622 podman[76847]: 2026-01-31 07:21:47.116264851 +0000 UTC m=+0.200590531 container remove 3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_buck, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Jan 31 02:21:47 np0005603622 systemd[1]: libpod-conmon-3f73794b2beec4170aa184474bbc430355c923901b65fb86d98e7828b6a40dcd.scope: Deactivated successfully.
Jan 31 02:21:47 np0005603622 systemd[1]: Reloading.
Jan 31 02:21:47 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:21:47 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:21:47 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:47 np0005603622 systemd[1]: Reloading.
Jan 31 02:21:47 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:21:47 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:21:47 np0005603622 systemd[1]: Reached target All Ceph clusters and services.
Jan 31 02:21:47 np0005603622 systemd[1]: Reloading.
Jan 31 02:21:47 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:21:47 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:21:47 np0005603622 systemd[1]: Reached target Ceph cluster 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:21:47 np0005603622 systemd[1]: Reloading.
Jan 31 02:21:47 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:21:47 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:21:48 np0005603622 systemd[1]: Reloading.
Jan 31 02:21:48 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:21:48 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:21:48 np0005603622 systemd[1]: Created slice Slice /system/ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:21:48 np0005603622 systemd[1]: Reached target System Time Set.
Jan 31 02:21:48 np0005603622 systemd[1]: Reached target System Time Synchronized.
Jan 31 02:21:48 np0005603622 systemd[1]: Starting Ceph crash.compute-1 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:21:48 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:48 np0005603622 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 31 02:21:48 np0005603622 podman[77120]: 2026-01-31 07:21:48.571084136 +0000 UTC m=+0.039448861 container create 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:21:48 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc00132c4fcf29bee79e7170158050fe05162a7f04d0e70aec3b6f016aea39af/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:48 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc00132c4fcf29bee79e7170158050fe05162a7f04d0e70aec3b6f016aea39af/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:48 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc00132c4fcf29bee79e7170158050fe05162a7f04d0e70aec3b6f016aea39af/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:48 np0005603622 podman[77120]: 2026-01-31 07:21:48.639186023 +0000 UTC m=+0.107550778 container init 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 31 02:21:48 np0005603622 podman[77120]: 2026-01-31 07:21:48.642728939 +0000 UTC m=+0.111093654 container start 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:21:48 np0005603622 bash[77120]: 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79
Jan 31 02:21:48 np0005603622 podman[77120]: 2026-01-31 07:21:48.55316246 +0000 UTC m=+0.021527245 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:48 np0005603622 systemd[1]: Started Ceph crash.compute-1 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:21:48 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: 2026-01-31T07:21:49.026+0000 7fb432b74640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: 2026-01-31T07:21:49.026+0000 7fb432b74640 -1 AuthRegistry(0x7fb42c066fe0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: 2026-01-31T07:21:49.027+0000 7fb432b74640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: 2026-01-31T07:21:49.027+0000 7fb432b74640 -1 AuthRegistry(0x7fb432b73000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: 2026-01-31T07:21:49.029+0000 7fb4308e9640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: 2026-01-31T07:21:49.029+0000 7fb432b74640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 31 02:21:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1[77135]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 31 02:21:49 np0005603622 podman[77291]: 2026-01-31 07:21:49.225030701 +0000 UTC m=+0.045022022 container create 87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_sutherland, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 31 02:21:49 np0005603622 systemd[1]: Started libpod-conmon-87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11.scope.
Jan 31 02:21:49 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:49 np0005603622 podman[77291]: 2026-01-31 07:21:49.199663323 +0000 UTC m=+0.019654674 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:49 np0005603622 podman[77291]: 2026-01-31 07:21:49.299382017 +0000 UTC m=+0.119373388 container init 87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_sutherland, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 31 02:21:49 np0005603622 podman[77291]: 2026-01-31 07:21:49.310228922 +0000 UTC m=+0.130220243 container start 87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_sutherland, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 31 02:21:49 np0005603622 podman[77291]: 2026-01-31 07:21:49.315289329 +0000 UTC m=+0.135280680 container attach 87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_sutherland, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 02:21:49 np0005603622 brave_sutherland[77308]: 167 167
Jan 31 02:21:49 np0005603622 systemd[1]: libpod-87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11.scope: Deactivated successfully.
Jan 31 02:21:49 np0005603622 conmon[77308]: conmon 87ff298815892a20f95f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11.scope/container/memory.events
Jan 31 02:21:49 np0005603622 podman[77291]: 2026-01-31 07:21:49.316735228 +0000 UTC m=+0.136726589 container died 87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_sutherland, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 31 02:21:49 np0005603622 systemd[1]: var-lib-containers-storage-overlay-8533c2df68d9b63430aedc3eb41e5f68c587532802e213f16370f7eeb098e434-merged.mount: Deactivated successfully.
Jan 31 02:21:49 np0005603622 podman[77291]: 2026-01-31 07:21:49.364505203 +0000 UTC m=+0.184496564 container remove 87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_sutherland, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 31 02:21:49 np0005603622 systemd[1]: libpod-conmon-87ff298815892a20f95feec24c4c0d0b6327c5147144b76610b7a68c8f334e11.scope: Deactivated successfully.
Jan 31 02:21:49 np0005603622 podman[77332]: 2026-01-31 07:21:49.480461598 +0000 UTC m=+0.043955153 container create 08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_hofstadter, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 31 02:21:49 np0005603622 systemd[1]: Started libpod-conmon-08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d.scope.
Jan 31 02:21:49 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:49 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27f97623c8975bff1a837d82a8ebe0dcdeb32a5ce640a2d45add06e184b01460/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:49 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27f97623c8975bff1a837d82a8ebe0dcdeb32a5ce640a2d45add06e184b01460/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:49 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27f97623c8975bff1a837d82a8ebe0dcdeb32a5ce640a2d45add06e184b01460/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:49 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27f97623c8975bff1a837d82a8ebe0dcdeb32a5ce640a2d45add06e184b01460/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:49 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27f97623c8975bff1a837d82a8ebe0dcdeb32a5ce640a2d45add06e184b01460/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:49 np0005603622 podman[77332]: 2026-01-31 07:21:49.45986543 +0000 UTC m=+0.023358995 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:49 np0005603622 podman[77332]: 2026-01-31 07:21:49.553781707 +0000 UTC m=+0.117275222 container init 08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_hofstadter, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:21:49 np0005603622 podman[77332]: 2026-01-31 07:21:49.563942482 +0000 UTC m=+0.127435997 container start 08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_hofstadter, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:21:49 np0005603622 podman[77332]: 2026-01-31 07:21:49.568179517 +0000 UTC m=+0.131673032 container attach 08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_hofstadter, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: --> passed data devices: 0 physical, 1 LVM
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: --> relative data size: 1.0
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new c7b96aaa-43a0-4c7e-ac49-508c01d627b5
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 31 02:21:50 np0005603622 lvm[77396]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:21:50 np0005603622 lvm[77396]: VG ceph_vg0 finished
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 31 02:21:50 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Jan 31 02:21:51 np0005603622 competent_hofstadter[77348]: stderr: got monmap epoch 1
Jan 31 02:21:51 np0005603622 competent_hofstadter[77348]: --> Creating keyring file for osd.1
Jan 31 02:21:51 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Jan 31 02:21:51 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Jan 31 02:21:51 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid c7b96aaa-43a0-4c7e-ac49-508c01d627b5 --setuser ceph --setgroup ceph
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: stderr: 2026-01-31T07:21:51.467+0000 7fb9e8a4d740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: stderr: 2026-01-31T07:21:51.467+0000 7fb9e8a4d740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: stderr: 2026-01-31T07:21:51.467+0000 7fb9e8a4d740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: stderr: 2026-01-31T07:21:51.467+0000 7fb9e8a4d740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 31 02:21:53 np0005603622 competent_hofstadter[77348]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 31 02:21:54 np0005603622 systemd[1]: libpod-08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d.scope: Deactivated successfully.
Jan 31 02:21:54 np0005603622 systemd[1]: libpod-08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d.scope: Consumed 2.270s CPU time.
Jan 31 02:21:54 np0005603622 podman[77332]: 2026-01-31 07:21:54.005069427 +0000 UTC m=+4.568562952 container died 08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_hofstadter, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:21:54 np0005603622 systemd[1]: var-lib-containers-storage-overlay-27f97623c8975bff1a837d82a8ebe0dcdeb32a5ce640a2d45add06e184b01460-merged.mount: Deactivated successfully.
Jan 31 02:21:54 np0005603622 podman[77332]: 2026-01-31 07:21:54.065392163 +0000 UTC m=+4.628885688 container remove 08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_hofstadter, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507)
Jan 31 02:21:54 np0005603622 systemd[1]: libpod-conmon-08ceeed369358b69606027818b62f590463278f147af6af7453b4ef81b79753d.scope: Deactivated successfully.
Jan 31 02:21:54 np0005603622 podman[78463]: 2026-01-31 07:21:54.588322285 +0000 UTC m=+0.036954433 container create e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_aryabhata, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:21:54 np0005603622 systemd[1]: Started libpod-conmon-e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd.scope.
Jan 31 02:21:54 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:54 np0005603622 podman[78463]: 2026-01-31 07:21:54.655552729 +0000 UTC m=+0.104184897 container init e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_aryabhata, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:21:54 np0005603622 podman[78463]: 2026-01-31 07:21:54.661631873 +0000 UTC m=+0.110264021 container start e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_aryabhata, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 31 02:21:54 np0005603622 podman[78463]: 2026-01-31 07:21:54.664696137 +0000 UTC m=+0.113328305 container attach e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_aryabhata, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 31 02:21:54 np0005603622 eager_aryabhata[78480]: 167 167
Jan 31 02:21:54 np0005603622 systemd[1]: libpod-e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd.scope: Deactivated successfully.
Jan 31 02:21:54 np0005603622 podman[78463]: 2026-01-31 07:21:54.666184237 +0000 UTC m=+0.114816385 container died e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_aryabhata, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 31 02:21:54 np0005603622 podman[78463]: 2026-01-31 07:21:54.573440862 +0000 UTC m=+0.022073030 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:54 np0005603622 systemd[1]: var-lib-containers-storage-overlay-8a2692532a946a5a3500c4f0eb859061d4444da7e09f82f3906f55e9c3f21302-merged.mount: Deactivated successfully.
Jan 31 02:21:54 np0005603622 podman[78463]: 2026-01-31 07:21:54.702719988 +0000 UTC m=+0.151352136 container remove e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_aryabhata, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 31 02:21:54 np0005603622 systemd[1]: libpod-conmon-e065e2c0b01fa1fc17b54c0f627ef2e0c1474040ddea6eb1abc8fc71b3da22fd.scope: Deactivated successfully.
Jan 31 02:21:54 np0005603622 podman[78506]: 2026-01-31 07:21:54.822064104 +0000 UTC m=+0.040750396 container create 49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_haslett, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 31 02:21:54 np0005603622 systemd[1]: Started libpod-conmon-49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e.scope.
Jan 31 02:21:54 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:54 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6bc557cc8c9ae5ffed63e338712e44ac2f940318ed7ff16dfcbc8570a4a9d0d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:54 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6bc557cc8c9ae5ffed63e338712e44ac2f940318ed7ff16dfcbc8570a4a9d0d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:54 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6bc557cc8c9ae5ffed63e338712e44ac2f940318ed7ff16dfcbc8570a4a9d0d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:54 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6bc557cc8c9ae5ffed63e338712e44ac2f940318ed7ff16dfcbc8570a4a9d0d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:54 np0005603622 podman[78506]: 2026-01-31 07:21:54.806171953 +0000 UTC m=+0.024858265 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:54 np0005603622 podman[78506]: 2026-01-31 07:21:54.903973936 +0000 UTC m=+0.122660248 container init 49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_haslett, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 31 02:21:54 np0005603622 podman[78506]: 2026-01-31 07:21:54.909220578 +0000 UTC m=+0.127906870 container start 49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_haslett, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:21:54 np0005603622 podman[78506]: 2026-01-31 07:21:54.912747624 +0000 UTC m=+0.131433926 container attach 49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_haslett, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:21:55 np0005603622 strange_haslett[78524]: {
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:    "1": [
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:        {
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "devices": [
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "/dev/loop3"
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            ],
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "lv_name": "ceph_lv0",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "lv_size": "7511998464",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=lIltGR-hfBv-myl9-02pm-S5mx-5eSq-SbyHnW,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2f5ab832-5f2e-5a84-bd93-cf8bab960ee2,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c7b96aaa-43a0-4c7e-ac49-508c01d627b5,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "lv_uuid": "lIltGR-hfBv-myl9-02pm-S5mx-5eSq-SbyHnW",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "name": "ceph_lv0",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "tags": {
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.block_uuid": "lIltGR-hfBv-myl9-02pm-S5mx-5eSq-SbyHnW",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.cephx_lockbox_secret": "",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.cluster_fsid": "2f5ab832-5f2e-5a84-bd93-cf8bab960ee2",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.cluster_name": "ceph",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.crush_device_class": "",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.encrypted": "0",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.osd_fsid": "c7b96aaa-43a0-4c7e-ac49-508c01d627b5",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.osd_id": "1",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.type": "block",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:                "ceph.vdo": "0"
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            },
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "type": "block",
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:            "vg_name": "ceph_vg0"
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:        }
Jan 31 02:21:55 np0005603622 strange_haslett[78524]:    ]
Jan 31 02:21:55 np0005603622 strange_haslett[78524]: }
Jan 31 02:21:55 np0005603622 systemd[1]: libpod-49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e.scope: Deactivated successfully.
Jan 31 02:21:55 np0005603622 podman[78506]: 2026-01-31 07:21:55.592744676 +0000 UTC m=+0.811430978 container died 49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_haslett, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:21:55 np0005603622 systemd[1]: var-lib-containers-storage-overlay-e6bc557cc8c9ae5ffed63e338712e44ac2f940318ed7ff16dfcbc8570a4a9d0d-merged.mount: Deactivated successfully.
Jan 31 02:21:55 np0005603622 podman[78506]: 2026-01-31 07:21:55.658011825 +0000 UTC m=+0.876698157 container remove 49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_haslett, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 31 02:21:55 np0005603622 systemd[1]: libpod-conmon-49a59dd14fe560ed9265fb46dd3168c146367153e6a08ff7f22507116365964e.scope: Deactivated successfully.
Jan 31 02:21:56 np0005603622 podman[78688]: 2026-01-31 07:21:56.172131638 +0000 UTC m=+0.037694413 container create 78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_neumann, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:21:56 np0005603622 systemd[1]: Started libpod-conmon-78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d.scope.
Jan 31 02:21:56 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:56 np0005603622 podman[78688]: 2026-01-31 07:21:56.238303292 +0000 UTC m=+0.103866067 container init 78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_neumann, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 31 02:21:56 np0005603622 podman[78688]: 2026-01-31 07:21:56.242556818 +0000 UTC m=+0.108119583 container start 78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_neumann, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 31 02:21:56 np0005603622 systemd[1]: libpod-78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d.scope: Deactivated successfully.
Jan 31 02:21:56 np0005603622 mystifying_neumann[78704]: 167 167
Jan 31 02:21:56 np0005603622 conmon[78704]: conmon 78bad1d5417aeaf65097 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d.scope/container/memory.events
Jan 31 02:21:56 np0005603622 podman[78688]: 2026-01-31 07:21:56.246661999 +0000 UTC m=+0.112224764 container attach 78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_neumann, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 31 02:21:56 np0005603622 podman[78688]: 2026-01-31 07:21:56.247139862 +0000 UTC m=+0.112702627 container died 78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_neumann, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 31 02:21:56 np0005603622 podman[78688]: 2026-01-31 07:21:56.156128754 +0000 UTC m=+0.021691569 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:56 np0005603622 systemd[1]: var-lib-containers-storage-overlay-928bc93dcbdaea1b95923f5b7c0e493bd2a0cfca9d5cedf737abace62f0715a9-merged.mount: Deactivated successfully.
Jan 31 02:21:56 np0005603622 podman[78688]: 2026-01-31 07:21:56.279093469 +0000 UTC m=+0.144656254 container remove 78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_neumann, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 31 02:21:56 np0005603622 systemd[1]: libpod-conmon-78bad1d5417aeaf6509792580b474fc7e96f41408c90310b99dbbe0fc440267d.scope: Deactivated successfully.
Jan 31 02:21:56 np0005603622 podman[78737]: 2026-01-31 07:21:56.445382878 +0000 UTC m=+0.039373668 container create 26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 02:21:56 np0005603622 systemd[1]: Started libpod-conmon-26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c.scope.
Jan 31 02:21:56 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:56 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42f1ddf54b651992b4f04b26d5567f3aaafbe60e79f802f8601135def9a32ea1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:56 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42f1ddf54b651992b4f04b26d5567f3aaafbe60e79f802f8601135def9a32ea1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:56 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42f1ddf54b651992b4f04b26d5567f3aaafbe60e79f802f8601135def9a32ea1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:56 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42f1ddf54b651992b4f04b26d5567f3aaafbe60e79f802f8601135def9a32ea1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:56 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42f1ddf54b651992b4f04b26d5567f3aaafbe60e79f802f8601135def9a32ea1/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:56 np0005603622 podman[78737]: 2026-01-31 07:21:56.512064297 +0000 UTC m=+0.106055137 container init 26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 31 02:21:56 np0005603622 podman[78737]: 2026-01-31 07:21:56.520555757 +0000 UTC m=+0.114546587 container start 26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 31 02:21:56 np0005603622 podman[78737]: 2026-01-31 07:21:56.524168385 +0000 UTC m=+0.118159205 container attach 26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default)
Jan 31 02:21:56 np0005603622 podman[78737]: 2026-01-31 07:21:56.429395205 +0000 UTC m=+0.023386005 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:57 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test[78753]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Jan 31 02:21:57 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test[78753]:                            [--no-systemd] [--no-tmpfs]
Jan 31 02:21:57 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test[78753]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 31 02:21:57 np0005603622 systemd[1]: libpod-26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c.scope: Deactivated successfully.
Jan 31 02:21:57 np0005603622 podman[78737]: 2026-01-31 07:21:57.161835279 +0000 UTC m=+0.755826069 container died 26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 31 02:21:57 np0005603622 systemd[1]: var-lib-containers-storage-overlay-42f1ddf54b651992b4f04b26d5567f3aaafbe60e79f802f8601135def9a32ea1-merged.mount: Deactivated successfully.
Jan 31 02:21:57 np0005603622 podman[78737]: 2026-01-31 07:21:57.279022137 +0000 UTC m=+0.873013007 container remove 26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate-test, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 31 02:21:57 np0005603622 systemd[1]: libpod-conmon-26f32d52beb0d791d04faf266832e1896130002cf670f8fac0405671be33744c.scope: Deactivated successfully.
Jan 31 02:21:57 np0005603622 systemd[1]: Reloading.
Jan 31 02:21:57 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:21:57 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:21:57 np0005603622 systemd[1]: Reloading.
Jan 31 02:21:57 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:21:57 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:21:57 np0005603622 systemd[1]: Starting Ceph osd.1 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:21:58 np0005603622 podman[78917]: 2026-01-31 07:21:58.155496777 +0000 UTC m=+0.039997925 container create 71ac555a78bfc0dbb99d4c36be645e6a8b55df38f2fccec0fcd1adf65551d209 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:21:58 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:21:58 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b82f189e21b1d7281fe75e02b36c259dbba4c499e4a94128d6f29d6ee0787ba/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:58 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b82f189e21b1d7281fe75e02b36c259dbba4c499e4a94128d6f29d6ee0787ba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:58 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b82f189e21b1d7281fe75e02b36c259dbba4c499e4a94128d6f29d6ee0787ba/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:58 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b82f189e21b1d7281fe75e02b36c259dbba4c499e4a94128d6f29d6ee0787ba/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:58 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b82f189e21b1d7281fe75e02b36c259dbba4c499e4a94128d6f29d6ee0787ba/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:58 np0005603622 podman[78917]: 2026-01-31 07:21:58.229440433 +0000 UTC m=+0.113941581 container init 71ac555a78bfc0dbb99d4c36be645e6a8b55df38f2fccec0fcd1adf65551d209 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 02:21:58 np0005603622 podman[78917]: 2026-01-31 07:21:58.234599363 +0000 UTC m=+0.119100491 container start 71ac555a78bfc0dbb99d4c36be645e6a8b55df38f2fccec0fcd1adf65551d209 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 31 02:21:58 np0005603622 podman[78917]: 2026-01-31 07:21:58.138912348 +0000 UTC m=+0.023413526 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:58 np0005603622 podman[78917]: 2026-01-31 07:21:58.238380635 +0000 UTC m=+0.122881783 container attach 71ac555a78bfc0dbb99d4c36be645e6a8b55df38f2fccec0fcd1adf65551d209 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:21:59 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate[78932]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 31 02:21:59 np0005603622 bash[78917]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 31 02:21:59 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate[78932]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:21:59 np0005603622 bash[78917]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:21:59 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate[78932]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:21:59 np0005603622 bash[78917]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 31 02:21:59 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate[78932]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:21:59 np0005603622 bash[78917]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 31 02:21:59 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate[78932]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 31 02:21:59 np0005603622 bash[78917]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 31 02:21:59 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate[78932]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 31 02:21:59 np0005603622 bash[78917]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 31 02:21:59 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate[78932]: --> ceph-volume raw activate successful for osd ID: 1
Jan 31 02:21:59 np0005603622 bash[78917]: --> ceph-volume raw activate successful for osd ID: 1
Jan 31 02:21:59 np0005603622 systemd[1]: libpod-71ac555a78bfc0dbb99d4c36be645e6a8b55df38f2fccec0fcd1adf65551d209.scope: Deactivated successfully.
Jan 31 02:21:59 np0005603622 podman[78917]: 2026-01-31 07:21:59.178899353 +0000 UTC m=+1.063400541 container died 71ac555a78bfc0dbb99d4c36be645e6a8b55df38f2fccec0fcd1adf65551d209 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 31 02:21:59 np0005603622 systemd[1]: var-lib-containers-storage-overlay-6b82f189e21b1d7281fe75e02b36c259dbba4c499e4a94128d6f29d6ee0787ba-merged.mount: Deactivated successfully.
Jan 31 02:21:59 np0005603622 podman[78917]: 2026-01-31 07:21:59.242727183 +0000 UTC m=+1.127228351 container remove 71ac555a78bfc0dbb99d4c36be645e6a8b55df38f2fccec0fcd1adf65551d209 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1-activate, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 31 02:21:59 np0005603622 podman[79088]: 2026-01-31 07:21:59.382933565 +0000 UTC m=+0.032008789 container create 7f5c9da71c46861b41f6f4ebf3638ee9f260f46055af2103a024a3f4a0d2f735 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:21:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4513d66735df0b51968ef0a511a048d7f4ef566886eb2106266378243e2103f1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4513d66735df0b51968ef0a511a048d7f4ef566886eb2106266378243e2103f1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4513d66735df0b51968ef0a511a048d7f4ef566886eb2106266378243e2103f1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4513d66735df0b51968ef0a511a048d7f4ef566886eb2106266378243e2103f1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4513d66735df0b51968ef0a511a048d7f4ef566886eb2106266378243e2103f1/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:21:59 np0005603622 podman[79088]: 2026-01-31 07:21:59.43731291 +0000 UTC m=+0.086388184 container init 7f5c9da71c46861b41f6f4ebf3638ee9f260f46055af2103a024a3f4a0d2f735 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:21:59 np0005603622 podman[79088]: 2026-01-31 07:21:59.449247484 +0000 UTC m=+0.098322738 container start 7f5c9da71c46861b41f6f4ebf3638ee9f260f46055af2103a024a3f4a0d2f735 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 31 02:21:59 np0005603622 bash[79088]: 7f5c9da71c46861b41f6f4ebf3638ee9f260f46055af2103a024a3f4a0d2f735
Jan 31 02:21:59 np0005603622 podman[79088]: 2026-01-31 07:21:59.368773441 +0000 UTC m=+0.017848685 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:21:59 np0005603622 systemd[1]: Started Ceph osd.1 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: pidfile_write: ignore empty --pid-file
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb25e61800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb25e61800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb25e61800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb25e61800 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb26ca3800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb26ca3800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb26ca3800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb26ca3800 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb26ca3800 /var/lib/ceph/osd/ceph-1/block) close
Jan 31 02:21:59 np0005603622 ceph-osd[79107]: bdev(0x55bb25e61800 /var/lib/ceph/osd/ceph-1/block) close
Jan 31 02:21:59 np0005603622 podman[79262]: 2026-01-31 07:21:59.939406467 +0000 UTC m=+0.038037733 container create 5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_gould, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 31 02:21:59 np0005603622 systemd[1]: Started libpod-conmon-5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7.scope.
Jan 31 02:21:59 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:22:00 np0005603622 podman[79262]: 2026-01-31 07:22:00.004651036 +0000 UTC m=+0.103282322 container init 5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_gould, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:22:00 np0005603622 podman[79262]: 2026-01-31 07:22:00.012813408 +0000 UTC m=+0.111444674 container start 5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_gould, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:00 np0005603622 podman[79262]: 2026-01-31 07:22:00.015617794 +0000 UTC m=+0.114249110 container attach 5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_gould, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Jan 31 02:22:00 np0005603622 vibrant_gould[79278]: 167 167
Jan 31 02:22:00 np0005603622 systemd[1]: libpod-5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7.scope: Deactivated successfully.
Jan 31 02:22:00 np0005603622 podman[79262]: 2026-01-31 07:22:00.017119905 +0000 UTC m=+0.115751171 container died 5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_gould, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:22:00 np0005603622 podman[79262]: 2026-01-31 07:21:59.92254252 +0000 UTC m=+0.021173836 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Jan 31 02:22:00 np0005603622 systemd[1]: var-lib-containers-storage-overlay-50207484335fae06a6e5a03530822754318816991f49c940bf88cd8f5b7e56b8-merged.mount: Deactivated successfully.
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: load: jerasure load: lrc 
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:22:00 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) close
Jan 31 02:22:00 np0005603622 podman[79262]: 2026-01-31 07:22:00.051847256 +0000 UTC m=+0.150478512 container remove 5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_gould, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Jan 31 02:22:00 np0005603622 systemd[1]: libpod-conmon-5e9e7355ff9d3cbbff85f89dc93565d20e44e934d832380cff630474443e3eb7.scope: Deactivated successfully.
Jan 31 02:22:00 np0005603622 podman[79307]: 2026-01-31 07:22:00.158890979 +0000 UTC m=+0.023637892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:01 np0005603622 podman[79307]: 2026-01-31 07:22:01.477800269 +0000 UTC m=+1.342547162 container create 095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_bhabha, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) close
Jan 31 02:22:01 np0005603622 systemd[1]: Started libpod-conmon-095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a.scope.
Jan 31 02:22:01 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:22:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c6a57bb7f3792e7f616006aed18b03ef6a7df56388b67ff560232d3df0b3602/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c6a57bb7f3792e7f616006aed18b03ef6a7df56388b67ff560232d3df0b3602/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c6a57bb7f3792e7f616006aed18b03ef6a7df56388b67ff560232d3df0b3602/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c6a57bb7f3792e7f616006aed18b03ef6a7df56388b67ff560232d3df0b3602/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:01 np0005603622 podman[79307]: 2026-01-31 07:22:01.559116934 +0000 UTC m=+1.423863847 container init 095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_bhabha, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 31 02:22:01 np0005603622 podman[79307]: 2026-01-31 07:22:01.565478017 +0000 UTC m=+1.430224910 container start 095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_bhabha, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 31 02:22:01 np0005603622 podman[79307]: 2026-01-31 07:22:01.572672932 +0000 UTC m=+1.437419885 container attach 095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_bhabha, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d24c00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluefs mount
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluefs mount shared_bdev_used = 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: RocksDB version: 7.9.2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Git sha 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: DB SUMMARY
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: DB Session ID:  F9G8GZFFX8EPLQ404KA2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: CURRENT file:  CURRENT
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: IDENTITY file:  IDENTITY
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                         Options.error_if_exists: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.create_if_missing: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                         Options.paranoid_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                                     Options.env: 0x55bb26cf5d50
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                                Options.info_log: 0x55bb25edeb60
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_file_opening_threads: 16
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                              Options.statistics: (nil)
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.use_fsync: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.max_log_file_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                         Options.allow_fallocate: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.use_direct_reads: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.create_missing_column_families: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                              Options.db_log_dir: 
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                                 Options.wal_dir: db.wal
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.advise_random_on_open: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.write_buffer_manager: 0x55bb26dfe460
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                            Options.rate_limiter: (nil)
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.unordered_write: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.row_cache: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                              Options.wal_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.allow_ingest_behind: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.two_write_queues: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.manual_wal_flush: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.wal_compression: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.atomic_flush: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.log_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.allow_data_in_errors: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.db_host_id: __hostname__
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_background_jobs: 4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_background_compactions: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_subcompactions: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.max_open_files: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.bytes_per_sync: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.max_background_flushes: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Compression algorithms supported:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kZSTD supported: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kXpressCompression supported: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kBZip2Compression supported: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kLZ4Compression supported: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kZlibCompression supported: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: 	kSnappyCompression supported: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed4430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb25ed4430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ede5a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb25ed4430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f53909fb-f15c-4a76-835d-d8696fc06527
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844121787604, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844121787776, "job": 1, "event": "recovery_finished"}
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: freelist init
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: freelist _read_cfg
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bluefs umount
Jan 31 02:22:01 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) close
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bdev(0x55bb26d25400 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluefs mount
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluefs mount shared_bdev_used = 4718592
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: RocksDB version: 7.9.2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Git sha 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: DB SUMMARY
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: DB Session ID:  F9G8GZFFX8EPLQ404KA3
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: CURRENT file:  CURRENT
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: IDENTITY file:  IDENTITY
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                         Options.error_if_exists: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.create_if_missing: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                         Options.paranoid_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                                     Options.env: 0x55bb2602c310
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                                Options.info_log: 0x55bb25edf860
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_file_opening_threads: 16
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                              Options.statistics: (nil)
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.use_fsync: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.max_log_file_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                         Options.allow_fallocate: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.use_direct_reads: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.create_missing_column_families: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                              Options.db_log_dir: 
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                                 Options.wal_dir: db.wal
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.advise_random_on_open: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.write_buffer_manager: 0x55bb26dfe460
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                            Options.rate_limiter: (nil)
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.unordered_write: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.row_cache: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                              Options.wal_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.allow_ingest_behind: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.two_write_queues: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.manual_wal_flush: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.wal_compression: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.atomic_flush: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.log_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.allow_data_in_errors: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.db_host_id: __hostname__
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_background_jobs: 4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_background_compactions: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_subcompactions: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.max_open_files: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.bytes_per_sync: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.max_background_flushes: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Compression algorithms supported:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kZSTD supported: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kXpressCompression supported: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kBZip2Compression supported: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kLZ4Compression supported: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kZlibCompression supported: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: 	kSnappyCompression supported: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ee80c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed5610
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ee80c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed5610
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ee80c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed5610
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ee80c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed5610
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ee80c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb25ed5610#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ee80c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb25ed5610#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ee80c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bb25ed5610#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ebbb40)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed5770
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ebbb40)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed5770
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:           Options.merge_operator: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bb25ebbb40)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bb25ed5770
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.write_buffer_size: 16777216
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.max_write_buffer_number: 64
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.compression: LZ4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f53909fb-f15c-4a76-835d-d8696fc06527
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844122060510, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844122068804, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844122, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f53909fb-f15c-4a76-835d-d8696fc06527", "db_session_id": "F9G8GZFFX8EPLQ404KA3", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844122078389, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844122, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f53909fb-f15c-4a76-835d-d8696fc06527", "db_session_id": "F9G8GZFFX8EPLQ404KA3", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844122083222, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844122, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f53909fb-f15c-4a76-835d-d8696fc06527", "db_session_id": "F9G8GZFFX8EPLQ404KA3", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844122087070, "job": 1, "event": "recovery_finished"}
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55bb25fa7c00
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: DB pointer 0x55bb26de7a00
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 460.80 MB usag
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: _get_class not permitted to load lua
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: _get_class not permitted to load sdk
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: _get_class not permitted to load test_remote_reads
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: osd.1 0 load_pgs
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: osd.1 0 load_pgs opened 0 pgs
Jan 31 02:22:02 np0005603622 ceph-osd[79107]: osd.1 0 log_to_monitors true
Jan 31 02:22:02 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1[79103]: 2026-01-31T07:22:02.136+0000 7fc5e230e740 -1 osd.1 0 log_to_monitors true
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]: {
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]:    "c7b96aaa-43a0-4c7e-ac49-508c01d627b5": {
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]:        "ceph_fsid": "2f5ab832-5f2e-5a84-bd93-cf8bab960ee2",
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]:        "osd_id": 1,
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]:        "osd_uuid": "c7b96aaa-43a0-4c7e-ac49-508c01d627b5",
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]:        "type": "bluestore"
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]:    }
Jan 31 02:22:02 np0005603622 kind_bhabha[79328]: }
Jan 31 02:22:02 np0005603622 systemd[1]: libpod-095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a.scope: Deactivated successfully.
Jan 31 02:22:02 np0005603622 podman[79307]: 2026-01-31 07:22:02.405720694 +0000 UTC m=+2.270467617 container died 095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_bhabha, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 31 02:22:02 np0005603622 systemd[1]: var-lib-containers-storage-overlay-9c6a57bb7f3792e7f616006aed18b03ef6a7df56388b67ff560232d3df0b3602-merged.mount: Deactivated successfully.
Jan 31 02:22:02 np0005603622 podman[79307]: 2026-01-31 07:22:02.462020061 +0000 UTC m=+2.326766954 container remove 095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_bhabha, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Jan 31 02:22:02 np0005603622 systemd[1]: libpod-conmon-095d819460e497efedf8071762a36eb9674a3901743b472af7ec57c730d7659a.scope: Deactivated successfully.
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 31 02:22:03 np0005603622 podman[79994]: 2026-01-31 07:22:03.483556985 +0000 UTC m=+0.089595761 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True)
Jan 31 02:22:03 np0005603622 podman[79994]: 2026-01-31 07:22:03.665703104 +0000 UTC m=+0.271741880 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: osd.1 0 done with init, starting boot process
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: osd.1 0 start_boot
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 31 02:22:03 np0005603622 ceph-osd[79107]: osd.1 0  bench count 12288000 bsize 4 KiB
Jan 31 02:22:04 np0005603622 podman[80186]: 2026-01-31 07:22:04.820984716 +0000 UTC m=+0.017286280 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:04 np0005603622 podman[80186]: 2026-01-31 07:22:04.934856364 +0000 UTC m=+0.131157898 container create d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_liskov, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 31 02:22:04 np0005603622 systemd[1]: Started libpod-conmon-d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e.scope.
Jan 31 02:22:05 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:22:05 np0005603622 podman[80186]: 2026-01-31 07:22:05.056634587 +0000 UTC m=+0.252936131 container init d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_liskov, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:22:05 np0005603622 podman[80186]: 2026-01-31 07:22:05.065658482 +0000 UTC m=+0.261960026 container start d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_liskov, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:22:05 np0005603622 eloquent_liskov[80202]: 167 167
Jan 31 02:22:05 np0005603622 systemd[1]: libpod-d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e.scope: Deactivated successfully.
Jan 31 02:22:05 np0005603622 podman[80186]: 2026-01-31 07:22:05.103325323 +0000 UTC m=+0.299626857 container attach d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_liskov, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:22:05 np0005603622 podman[80186]: 2026-01-31 07:22:05.104496045 +0000 UTC m=+0.300797609 container died d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_liskov, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:05 np0005603622 systemd[1]: var-lib-containers-storage-overlay-22e0c8775fc327498afe782a5d29baa3531b54852869afedf0b362ece9aec986-merged.mount: Deactivated successfully.
Jan 31 02:22:05 np0005603622 podman[80186]: 2026-01-31 07:22:05.258831 +0000 UTC m=+0.455132524 container remove d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_liskov, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:05 np0005603622 systemd[1]: libpod-conmon-d94726cc0d55fbd1778ea4fa5ed6e90691ca1280d363709ed7d8e449c8f2684e.scope: Deactivated successfully.
Jan 31 02:22:05 np0005603622 podman[80226]: 2026-01-31 07:22:05.382430832 +0000 UTC m=+0.050831499 container create 62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_lovelace, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:05 np0005603622 systemd[1]: Started libpod-conmon-62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a.scope.
Jan 31 02:22:05 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:22:05 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e456d6e928f131f9d23d571998407c2bade874827966d6ef76616de6457cb06e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:05 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e456d6e928f131f9d23d571998407c2bade874827966d6ef76616de6457cb06e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:05 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e456d6e928f131f9d23d571998407c2bade874827966d6ef76616de6457cb06e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:05 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e456d6e928f131f9d23d571998407c2bade874827966d6ef76616de6457cb06e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:05 np0005603622 podman[80226]: 2026-01-31 07:22:05.353316183 +0000 UTC m=+0.021716870 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:05 np0005603622 podman[80226]: 2026-01-31 07:22:05.483582166 +0000 UTC m=+0.151982843 container init 62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_lovelace, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 31 02:22:05 np0005603622 podman[80226]: 2026-01-31 07:22:05.489702632 +0000 UTC m=+0.158103299 container start 62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_lovelace, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:05 np0005603622 podman[80226]: 2026-01-31 07:22:05.524493935 +0000 UTC m=+0.192894612 container attach 62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2)
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]: [
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:    {
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "available": false,
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "ceph_device": false,
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "lsm_data": {},
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "lvs": [],
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "path": "/dev/sr0",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "rejected_reasons": [
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "Has a FileSystem",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "Insufficient space (<5GB)"
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        ],
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        "sys_api": {
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "actuators": null,
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "device_nodes": "sr0",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "devname": "sr0",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "human_readable_size": "482.00 KB",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "id_bus": "ata",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "model": "QEMU DVD-ROM",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "nr_requests": "2",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "parent": "/dev/sr0",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "partitions": {},
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "path": "/dev/sr0",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "removable": "1",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "rev": "2.5+",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "ro": "0",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "rotational": "1",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "sas_address": "",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "sas_device_handle": "",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "scheduler_mode": "mq-deadline",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "sectors": 0,
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "sectorsize": "2048",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "size": 493568.0,
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "support_discard": "2048",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "type": "disk",
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:            "vendor": "QEMU"
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:        }
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]:    }
Jan 31 02:22:06 np0005603622 zen_lovelace[80242]: ]
Jan 31 02:22:06 np0005603622 systemd[1]: libpod-62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a.scope: Deactivated successfully.
Jan 31 02:22:06 np0005603622 podman[80226]: 2026-01-31 07:22:06.48707241 +0000 UTC m=+1.155473077 container died 62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 31 02:22:06 np0005603622 systemd[1]: var-lib-containers-storage-overlay-e456d6e928f131f9d23d571998407c2bade874827966d6ef76616de6457cb06e-merged.mount: Deactivated successfully.
Jan 31 02:22:06 np0005603622 podman[80226]: 2026-01-31 07:22:06.627469038 +0000 UTC m=+1.295869705 container remove 62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:22:06 np0005603622 systemd[1]: libpod-conmon-62a46282f7bed8a9f43cd265ac4e9514c9806698df10a2139486d89c1be1705a.scope: Deactivated successfully.
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 24.077 iops: 6163.600 elapsed_sec: 0.487
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: log_channel(cluster) log [WRN] : OSD bench result of 6163.600447 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 0 waiting for initial osdmap
Jan 31 02:22:08 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1[79103]: 2026-01-31T07:22:08.086+0000 7fc5de28e640 -1 osd.1 0 waiting for initial osdmap
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 10 crush map has features 288514050185494528, adjusting msgr requires for clients
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 10 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 10 crush map has features 3314932999778484224, adjusting msgr requires for osds
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 10 check_osdmap_features require_osd_release unknown -> reef
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 10 set_numa_affinity not setting numa affinity
Jan 31 02:22:08 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-osd-1[79103]: 2026-01-31T07:22:08.116+0000 7fc5d98b6640 -1 osd.1 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 10 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 11 state: booting -> active
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 11 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 31 02:22:08 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 11 pg[1.0( empty local-lis/les=0/0 n=0 ec=11/11 lis/c=0/0 les/c/f=0/0/0 sis=11) [1] r=0 lpr=11 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:22:10 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 12 pg[1.0( empty local-lis/les=11/12 n=0 ec=11/11 lis/c=0/0 les/c/f=0/0/0 sis=11) [1] r=0 lpr=11 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:22:26 np0005603622 podman[81332]: 2026-01-31 07:22:26.02493598 +0000 UTC m=+0.043220953 container create cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:26 np0005603622 systemd[1]: Started libpod-conmon-cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226.scope.
Jan 31 02:22:26 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:22:26 np0005603622 podman[81332]: 2026-01-31 07:22:26.092535644 +0000 UTC m=+0.110820627 container init cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_greider, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:26 np0005603622 podman[81332]: 2026-01-31 07:22:26.096748068 +0000 UTC m=+0.115033041 container start cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_greider, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 31 02:22:26 np0005603622 podman[81332]: 2026-01-31 07:22:26.099715798 +0000 UTC m=+0.118000791 container attach cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_greider, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:26 np0005603622 vibrant_greider[81349]: 167 167
Jan 31 02:22:26 np0005603622 systemd[1]: libpod-cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226.scope: Deactivated successfully.
Jan 31 02:22:26 np0005603622 podman[81332]: 2026-01-31 07:22:26.101641941 +0000 UTC m=+0.119926944 container died cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_greider, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 31 02:22:26 np0005603622 podman[81332]: 2026-01-31 07:22:26.009379578 +0000 UTC m=+0.027664581 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:26 np0005603622 systemd[1]: var-lib-containers-storage-overlay-13a6615f9286cce13cc84826713d3262c60ab64d12455635d5bb8b353f6eb266-merged.mount: Deactivated successfully.
Jan 31 02:22:26 np0005603622 podman[81332]: 2026-01-31 07:22:26.133545926 +0000 UTC m=+0.151830899 container remove cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vibrant_greider, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:22:26 np0005603622 systemd[1]: libpod-conmon-cff04ff3b3fe43f8794b7b6590a5eb3ef4e2832f59cce82db632ff46cb9f5226.scope: Deactivated successfully.
Jan 31 02:22:26 np0005603622 podman[81370]: 2026-01-31 07:22:26.185776912 +0000 UTC m=+0.030553249 container create 7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_wilson, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 31 02:22:26 np0005603622 systemd[1]: Started libpod-conmon-7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a.scope.
Jan 31 02:22:26 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:22:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50b525e28ed1f8188a590fcfa970a3b421325fb96c05702080a10d52c8af6520/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50b525e28ed1f8188a590fcfa970a3b421325fb96c05702080a10d52c8af6520/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50b525e28ed1f8188a590fcfa970a3b421325fb96c05702080a10d52c8af6520/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50b525e28ed1f8188a590fcfa970a3b421325fb96c05702080a10d52c8af6520/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:26 np0005603622 podman[81370]: 2026-01-31 07:22:26.239544971 +0000 UTC m=+0.084321338 container init 7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_wilson, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 31 02:22:26 np0005603622 podman[81370]: 2026-01-31 07:22:26.246059397 +0000 UTC m=+0.090835744 container start 7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_wilson, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Jan 31 02:22:26 np0005603622 podman[81370]: 2026-01-31 07:22:26.24984229 +0000 UTC m=+0.094618647 container attach 7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_wilson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Jan 31 02:22:26 np0005603622 podman[81370]: 2026-01-31 07:22:26.171841894 +0000 UTC m=+0.016618251 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:26 np0005603622 systemd[1]: libpod-7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a.scope: Deactivated successfully.
Jan 31 02:22:26 np0005603622 podman[81370]: 2026-01-31 07:22:26.330268371 +0000 UTC m=+0.175044718 container died 7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_wilson, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 31 02:22:26 np0005603622 systemd[1]: var-lib-containers-storage-overlay-50b525e28ed1f8188a590fcfa970a3b421325fb96c05702080a10d52c8af6520-merged.mount: Deactivated successfully.
Jan 31 02:22:26 np0005603622 podman[81370]: 2026-01-31 07:22:26.364575532 +0000 UTC m=+0.209351869 container remove 7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_wilson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:22:26 np0005603622 systemd[1]: libpod-conmon-7174a95399c3586e90c558ac4f78d7dba5733eb93852c4b340df1eed95352c2a.scope: Deactivated successfully.
Jan 31 02:22:26 np0005603622 systemd[1]: Reloading.
Jan 31 02:22:26 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:26 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:26 np0005603622 systemd[1]: Reloading.
Jan 31 02:22:26 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:26 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:26 np0005603622 systemd[1]: Starting Ceph mon.compute-1 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:22:27 np0005603622 podman[81551]: 2026-01-31 07:22:27.030781829 +0000 UTC m=+0.031672200 container create f7c1f81768ff0e7789a4f37505aec287808f828c82b8358d5b5899ad88f9baea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-1, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:22:27 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de8ff6866d2244f488a6ff02c41c247acae5984c774df7436782b2dbec6d32f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:27 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de8ff6866d2244f488a6ff02c41c247acae5984c774df7436782b2dbec6d32f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:27 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de8ff6866d2244f488a6ff02c41c247acae5984c774df7436782b2dbec6d32f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:27 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de8ff6866d2244f488a6ff02c41c247acae5984c774df7436782b2dbec6d32f/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:27 np0005603622 podman[81551]: 2026-01-31 07:22:27.085824322 +0000 UTC m=+0.086714673 container init f7c1f81768ff0e7789a4f37505aec287808f828c82b8358d5b5899ad88f9baea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-1, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 31 02:22:27 np0005603622 podman[81551]: 2026-01-31 07:22:27.090821058 +0000 UTC m=+0.091711419 container start f7c1f81768ff0e7789a4f37505aec287808f828c82b8358d5b5899ad88f9baea (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 31 02:22:27 np0005603622 bash[81551]: f7c1f81768ff0e7789a4f37505aec287808f828c82b8358d5b5899ad88f9baea
Jan 31 02:22:27 np0005603622 podman[81551]: 2026-01-31 07:22:27.016601024 +0000 UTC m=+0.017491435 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:27 np0005603622 systemd[1]: Started Ceph mon.compute-1 for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: pidfile_write: ignore empty --pid-file
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: load: jerasure load: lrc 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: RocksDB version: 7.9.2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Git sha 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: DB SUMMARY
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: DB Session ID:  06P5A570C25H9SCKA4A2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: CURRENT file:  CURRENT
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: IDENTITY file:  IDENTITY
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                         Options.error_if_exists: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                       Options.create_if_missing: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                         Options.paranoid_checks: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                                     Options.env: 0x55cec7aa8c40
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                                Options.info_log: 0x55cec8fe0fc0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.max_file_opening_threads: 16
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                              Options.statistics: (nil)
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                               Options.use_fsync: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                       Options.max_log_file_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                         Options.allow_fallocate: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                        Options.use_direct_reads: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:          Options.create_missing_column_families: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                              Options.db_log_dir: 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                                 Options.wal_dir: 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.advise_random_on_open: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                    Options.write_buffer_manager: 0x55cec8ff0b40
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                            Options.rate_limiter: (nil)
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.unordered_write: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                               Options.row_cache: None
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                              Options.wal_filter: None
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.allow_ingest_behind: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.two_write_queues: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.manual_wal_flush: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.wal_compression: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.atomic_flush: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.log_readahead_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.allow_data_in_errors: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.db_host_id: __hostname__
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.max_background_jobs: 2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.max_background_compactions: -1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.max_subcompactions: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.max_total_wal_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                          Options.max_open_files: -1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                          Options.bytes_per_sync: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:       Options.compaction_readahead_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.max_background_flushes: -1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Compression algorithms supported:
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kZSTD supported: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kXpressCompression supported: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kBZip2Compression supported: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kLZ4Compression supported: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kZlibCompression supported: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: 	kSnappyCompression supported: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:           Options.merge_operator: 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:        Options.compaction_filter: None
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:        Options.compaction_filter_factory: None
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:  Options.sst_partitioner_factory: None
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cec8fe0c00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cec8fd91f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:        Options.write_buffer_size: 33554432
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:  Options.max_write_buffer_number: 2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:          Options.compression: NoCompression
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:       Options.prefix_extractor: nullptr
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.num_levels: 7
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.compression_opts.level: 32767
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:               Options.compression_opts.strategy: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                  Options.compression_opts.enabled: false
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                        Options.arena_block_size: 1048576
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.disable_auto_compactions: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.inplace_update_support: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                           Options.bloom_locality: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                    Options.max_successive_merges: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.paranoid_file_checks: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.force_consistency_checks: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.report_bg_io_stats: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                               Options.ttl: 2592000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                       Options.enable_blob_files: false
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                           Options.min_blob_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                          Options.blob_file_size: 268435456
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb:                Options.blob_file_starting_level: 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1cbf6aaa-ac01-4fca-be5c-88ee6f516697
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844147130704, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844147132614, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844147132821, "job": 1, "event": "recovery_finished"}
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55cec9002e00
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: DB pointer 0x55cec908c000
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(???) e0 preinit fsid 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).mds e1 new map
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 3314933000852226048, adjusting msgr requires
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Added host compute-0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Deploying cephadm binary to compute-1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Added host compute-1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Deploying cephadm binary to compute-2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Added host compute-2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Saving service mon spec with placement compute-0;compute-1;compute-2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Saving service mgr spec with placement compute-0;compute-1;compute-2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Marking host: compute-0 for OSDSpec preview refresh.
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Marking host: compute-1 for OSDSpec preview refresh.
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Saving service osd.default_drive_group spec with placement compute-0;compute-1;compute-2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-1:/etc/ceph/ceph.conf
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-1:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-1:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.client.admin.keyring
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Failed to apply mon spec MONSpec.from_json(yaml.safe_load('''service_type: mon#012service_name: mon#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <MONSpec for service_name=mon> on compute-2: Unknown hosts
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Failed to apply mgr spec ServiceSpec.from_json(yaml.safe_load('''service_type: mgr#012service_name: mgr#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <ServiceSpec for service_name=mgr> on compute-2: Unknown hosts
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Deploying daemon crash.compute-1 on compute-1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/2942024132' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "69ce1ba1-37ea-44ee-8e02-ae107b60d956"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/2942024132' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "69ce1ba1-37ea-44ee-8e02-ae107b60d956"}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.101:0/3660948089' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c7b96aaa-43a0-4c7e-ac49-508c01d627b5"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.101:0/3660948089' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c7b96aaa-43a0-4c7e-ac49-508c01d627b5"}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Deploying daemon osd.1 on compute-1
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Deploying daemon osd.0 on compute-0
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0068, "args": ["host=compute-0", "root=default"]}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0068, "args": ["host=compute-0", "root=default"]}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0068, "args": ["host=compute-1", "root=default"]}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0068, "args": ["host=compute-1", "root=default"]}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: OSD bench result of 8596.349487 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Adjusting osd_memory_target on compute-1 to  5247M
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: osd.0 [v2:192.168.122.100:6802/1347694087,v1:192.168.122.100:6803/1347694087] boot
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Unable to set osd_memory_target on compute-0 to 134197657: error parsing value: Value '134197657' is below minimum 939524096
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: OSD bench result of 6163.600447 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: osd.1 [v2:192.168.122.101:6800/785741871,v1:192.168.122.101:6801/785741871] boot
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-2:/etc/ceph/ceph.conf
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-2:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Updating compute-2:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.client.admin.keyring
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Deploying daemon mon.compute-2 on compute-2
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: Cluster is now healthy
Jan 31 02:22:27 np0005603622 ceph-mon[81571]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 31 02:22:33 np0005603622 ceph-mon[81571]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Jan 31 02:22:33 np0005603622 ceph-mon[81571]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 31 02:22:33 np0005603622 ceph-mon[81571]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 31 02:22:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: Deploying daemon mon.compute-1 on compute-1
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-0 calling monitor election
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-2 calling monitor election
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.cdjvtw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2026-01-31T07:22:26.274119Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864300,os=Linux}
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-0 calling monitor election
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-2 calling monitor election
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-1 calling monitor election
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:36 np0005603622 podman[81749]: 2026-01-31 07:22:36.803929863 +0000 UTC m=+0.031106094 container create 5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_bell, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:22:36 np0005603622 systemd[1]: Started libpod-conmon-5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f.scope.
Jan 31 02:22:36 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:22:36 np0005603622 podman[81749]: 2026-01-31 07:22:36.879104571 +0000 UTC m=+0.106280832 container init 5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_bell, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:22:36 np0005603622 podman[81749]: 2026-01-31 07:22:36.789494942 +0000 UTC m=+0.016671193 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:36 np0005603622 podman[81749]: 2026-01-31 07:22:36.888348632 +0000 UTC m=+0.115524873 container start 5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_bell, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 02:22:36 np0005603622 infallible_bell[81766]: 167 167
Jan 31 02:22:36 np0005603622 systemd[1]: libpod-5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f.scope: Deactivated successfully.
Jan 31 02:22:36 np0005603622 podman[81749]: 2026-01-31 07:22:36.895525956 +0000 UTC m=+0.122702207 container attach 5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_bell, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 31 02:22:36 np0005603622 podman[81749]: 2026-01-31 07:22:36.897203142 +0000 UTC m=+0.124379393 container died 5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_bell, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:22:36 np0005603622 systemd[1]: var-lib-containers-storage-overlay-208ee0235b623005e2e75702fd04c0a45c4b0dad69dc0bb43a4e07ce1f2bddbb-merged.mount: Deactivated successfully.
Jan 31 02:22:36 np0005603622 podman[81749]: 2026-01-31 07:22:36.933928588 +0000 UTC m=+0.161104819 container remove 5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_bell, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 31 02:22:36 np0005603622 systemd[1]: libpod-conmon-5e38446886cccb2d7aa3e1f3caeb5f928e7033a934dded990674b9a6f145c56f.scope: Deactivated successfully.
Jan 31 02:22:36 np0005603622 systemd[1]: Reloading.
Jan 31 02:22:37 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:37 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e13 _set_new_cache_sizes cache_size:1019939042 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:37 np0005603622 systemd[1]: Reloading.
Jan 31 02:22:37 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:22:37 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:22:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gxjgok", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:22:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gxjgok", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 31 02:22:37 np0005603622 ceph-mon[81571]: Deploying daemon mgr.compute-1.gxjgok on compute-1
Jan 31 02:22:37 np0005603622 systemd[1]: Starting Ceph mgr.compute-1.gxjgok for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:22:37 np0005603622 podman[81905]: 2026-01-31 07:22:37.57582421 +0000 UTC m=+0.038511645 container create fd7ea65af39928b8d842fc80b896d488e571e5dd20c6bf3c1d03c01d2b143159 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:37 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07b9180d6e083d68065df92869073a606ce61b70e18cb399d894bbd05cdf5113/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:37 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07b9180d6e083d68065df92869073a606ce61b70e18cb399d894bbd05cdf5113/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:37 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07b9180d6e083d68065df92869073a606ce61b70e18cb399d894bbd05cdf5113/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:37 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07b9180d6e083d68065df92869073a606ce61b70e18cb399d894bbd05cdf5113/merged/var/lib/ceph/mgr/ceph-compute-1.gxjgok supports timestamps until 2038 (0x7fffffff)
Jan 31 02:22:37 np0005603622 podman[81905]: 2026-01-31 07:22:37.636523516 +0000 UTC m=+0.099210961 container init fd7ea65af39928b8d842fc80b896d488e571e5dd20c6bf3c1d03c01d2b143159 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 31 02:22:37 np0005603622 podman[81905]: 2026-01-31 07:22:37.640473083 +0000 UTC m=+0.103160498 container start fd7ea65af39928b8d842fc80b896d488e571e5dd20c6bf3c1d03c01d2b143159 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 31 02:22:37 np0005603622 bash[81905]: fd7ea65af39928b8d842fc80b896d488e571e5dd20c6bf3c1d03c01d2b143159
Jan 31 02:22:37 np0005603622 podman[81905]: 2026-01-31 07:22:37.555995882 +0000 UTC m=+0.018683367 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:22:37 np0005603622 systemd[1]: Started Ceph mgr.compute-1.gxjgok for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:22:37 np0005603622 ceph-mgr[81925]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:22:37 np0005603622 ceph-mgr[81925]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Jan 31 02:22:37 np0005603622 ceph-mgr[81925]: pidfile_write: ignore empty --pid-file
Jan 31 02:22:37 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'alerts'
Jan 31 02:22:38 np0005603622 ceph-mgr[81925]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 31 02:22:38 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'balancer'
Jan 31 02:22:38 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:38.127+0000 7f380059a140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 31 02:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 31 02:22:38 np0005603622 ceph-mgr[81925]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 31 02:22:38 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:38.412+0000 7f380059a140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 31 02:22:38 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'cephadm'
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: Deploying daemon crash.compute-2 on compute-2
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1597819399' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e14 e14: 2 total, 2 up, 2 in
Jan 31 02:22:39 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 14 pg[2.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=14) [1] r=0 lpr=14 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:22:40 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1597819399' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e15 e15: 2 total, 2 up, 2 in
Jan 31 02:22:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 15 pg[2.0( empty local-lis/les=14/15 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=14) [1] r=0 lpr=14 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:22:40 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'crash'
Jan 31 02:22:40 np0005603622 ceph-mgr[81925]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 31 02:22:40 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'dashboard'
Jan 31 02:22:40 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:40.705+0000 7f380059a140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 31 02:22:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e16 e16: 3 total, 2 up, 3 in
Jan 31 02:22:41 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3898076589' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:41 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.102:0/935814870' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"}]: dispatch
Jan 31 02:22:41 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"}]: dispatch
Jan 31 02:22:41 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3898076589' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:41 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d561c1d2-064b-46a8-af35-64503a234a3c"}]': finished
Jan 31 02:22:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e16 _set_new_cache_sizes cache_size:1020053350 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e17 e17: 3 total, 2 up, 3 in
Jan 31 02:22:42 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'devicehealth'
Jan 31 02:22:42 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3326848097' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:42 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3326848097' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:42 np0005603622 ceph-mon[81571]: Health check failed: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:22:42 np0005603622 ceph-mgr[81925]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 31 02:22:42 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'diskprediction_local'
Jan 31 02:22:42 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:42.533+0000 7f380059a140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 31 02:22:43 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 31 02:22:43 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 31 02:22:43 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]:  from numpy import show_config as show_numpy_config
Jan 31 02:22:43 np0005603622 ceph-mgr[81925]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 31 02:22:43 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'influx'
Jan 31 02:22:43 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:43.075+0000 7f380059a140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 31 02:22:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e18 e18: 3 total, 2 up, 3 in
Jan 31 02:22:43 np0005603622 ceph-mgr[81925]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 31 02:22:43 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:43.324+0000 7f380059a140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 31 02:22:43 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'insights'
Jan 31 02:22:43 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'iostat'
Jan 31 02:22:43 np0005603622 ceph-mgr[81925]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 31 02:22:43 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:43.809+0000 7f380059a140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 31 02:22:43 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'k8sevents'
Jan 31 02:22:44 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3012310796' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e19 e19: 3 total, 2 up, 3 in
Jan 31 02:22:45 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3012310796' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:45 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1249872425' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e20 e20: 3 total, 2 up, 3 in
Jan 31 02:22:45 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'localpool'
Jan 31 02:22:45 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'mds_autoscaler'
Jan 31 02:22:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e21 e21: 3 total, 2 up, 3 in
Jan 31 02:22:46 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1249872425' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 31 02:22:46 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/839230673' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 31 02:22:46 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'mirroring'
Jan 31 02:22:46 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'nfs'
Jan 31 02:22:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 21 pg[7.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [1] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:22:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e21 _set_new_cache_sizes cache_size:1020054715 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e22 e22: 3 total, 2 up, 3 in
Jan 31 02:22:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 22 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [1] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:22:47 np0005603622 ceph-mon[81571]: Deploying daemon osd.2 on compute-2
Jan 31 02:22:47 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/839230673' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 31 02:22:47 np0005603622 ceph-mgr[81925]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 31 02:22:47 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'orchestrator'
Jan 31 02:22:47 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:47.512+0000 7f380059a140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603622 ceph-mgr[81925]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'osd_perf_query'
Jan 31 02:22:48 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:48.200+0000 7f380059a140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603622 ceph-mgr[81925]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'osd_support'
Jan 31 02:22:48 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:48.469+0000 7f380059a140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603622 ceph-mon[81571]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:22:48 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1320339162' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 31 02:22:48 np0005603622 ceph-mgr[81925]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:48.748+0000 7f380059a140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 31 02:22:48 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'pg_autoscaler'
Jan 31 02:22:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e23 e23: 3 total, 2 up, 3 in
Jan 31 02:22:49 np0005603622 ceph-mgr[81925]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 31 02:22:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:49.081+0000 7f380059a140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 31 02:22:49 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'progress'
Jan 31 02:22:49 np0005603622 ceph-mgr[81925]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 31 02:22:49 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:49.322+0000 7f380059a140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 31 02:22:49 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'prometheus'
Jan 31 02:22:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e24 e24: 3 total, 2 up, 3 in
Jan 31 02:22:50 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1320339162' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 31 02:22:50 np0005603622 ceph-mgr[81925]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 31 02:22:50 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:50.415+0000 7f380059a140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 31 02:22:50 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'rbd_support'
Jan 31 02:22:50 np0005603622 ceph-mgr[81925]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 31 02:22:50 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'restful'
Jan 31 02:22:50 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:50.752+0000 7f380059a140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 31 02:22:51 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/218093747' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 31 02:22:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:51 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/218093747' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 31 02:22:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:51 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1045044254' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 31 02:22:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e25 e25: 3 total, 2 up, 3 in
Jan 31 02:22:51 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'rgw'
Jan 31 02:22:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e25 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:52 np0005603622 ceph-mon[81571]: from='osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 31 02:22:52 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1045044254' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 31 02:22:52 np0005603622 ceph-mon[81571]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 31 02:22:52 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1692381800' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 31 02:22:52 np0005603622 ceph-mgr[81925]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 31 02:22:52 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:52.307+0000 7f380059a140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 31 02:22:52 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'rook'
Jan 31 02:22:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e26 e26: 3 total, 2 up, 3 in
Jan 31 02:22:53 np0005603622 podman[82181]: 2026-01-31 07:22:53.314749871 +0000 UTC m=+0.065603809 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/1692381800' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: from='osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3921316751' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 31 02:22:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e27 e27: 3 total, 2 up, 3 in
Jan 31 02:22:53 np0005603622 podman[82181]: 2026-01-31 07:22:53.447251974 +0000 UTC m=+0.198105952 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:22:54 np0005603622 ceph-mgr[81925]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'selftest'
Jan 31 02:22:54 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:54.349+0000 7f380059a140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603622 ceph-mon[81571]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:22:54 np0005603622 ceph-mon[81571]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Jan 31 02:22:54 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3921316751' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 31 02:22:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:54 np0005603622 ceph-mgr[81925]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'snap_schedule'
Jan 31 02:22:54 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:54.581+0000 7f380059a140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603622 ceph-mgr[81925]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:54.815+0000 7f380059a140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 31 02:22:54 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'stats'
Jan 31 02:22:55 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'status'
Jan 31 02:22:55 np0005603622 ceph-mgr[81925]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 31 02:22:55 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:55.300+0000 7f380059a140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 31 02:22:55 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'telegraf'
Jan 31 02:22:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e28 e28: 3 total, 2 up, 3 in
Jan 31 02:22:55 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/4251382841' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 31 02:22:55 np0005603622 ceph-mgr[81925]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 31 02:22:55 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:55.570+0000 7f380059a140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 31 02:22:55 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'telemetry'
Jan 31 02:22:56 np0005603622 ceph-mgr[81925]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 31 02:22:56 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'test_orchestrator'
Jan 31 02:22:56 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:56.138+0000 7f380059a140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 31 02:22:56 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/4251382841' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 31 02:22:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:22:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:22:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e29 e29: 3 total, 3 up, 3 in
Jan 31 02:22:56 np0005603622 ceph-mgr[81925]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:56 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'volumes'
Jan 31 02:22:56 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:56.833+0000 7f380059a140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e29 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: OSD bench result of 8242.819324 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: Unable to set osd_memory_target on compute-2 to 134203392: error parsing value: Value '134203392' is below minimum 939524096
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: Updating compute-0:/etc/ceph/ceph.conf
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: Updating compute-1:/etc/ceph/ceph.conf
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: Updating compute-2:/etc/ceph/ceph.conf
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: osd.2 [v2:192.168.122.102:6800/1205784752,v1:192.168.122.102:6801/1205784752] boot
Jan 31 02:22:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e30 e30: 3 total, 3 up, 3 in
Jan 31 02:22:57 np0005603622 ceph-mgr[81925]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 31 02:22:57 np0005603622 ceph-mgr[81925]: mgr[py] Loading python module 'zabbix'
Jan 31 02:22:57 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:57.582+0000 7f380059a140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 31 02:22:57 np0005603622 ceph-mgr[81925]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 31 02:22:57 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mgr-compute-1-gxjgok[81921]: 2026-01-31T07:22:57.868+0000 7f380059a140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 31 02:22:57 np0005603622 ceph-mgr[81925]: ms_deliver_dispatch: unhandled message 0x559cac8291e0 mon_map magic: 0 v1 from mon.2 v2:192.168.122.101:3300/0
Jan 31 02:22:57 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: Updating compute-0:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: Updating compute-2:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: Updating compute-1:/var/lib/ceph/2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/config/ceph.conf
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 31 02:22:58 np0005603622 ceph-mon[81571]: Cluster is now healthy
Jan 31 02:22:58 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 02:22:59 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3074196547' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 31 02:22:59 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 31 02:22:59 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 31 02:23:01 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3569125940' entity='client.admin' 
Jan 31 02:23:02 np0005603622 ceph-mon[81571]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:02 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:02 np0005603622 ceph-mon[81571]: Saving service ingress.rgw.default spec with placement count:2
Jan 31 02:23:02 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e2 new map
Jan 31 02:23:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e2 print_map
e2
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	2
flags	12 joinable allow_snaps allow_multimds_snaps
created	2026-01-31T07:23:03.855545+0000
modified	2026-01-31T07:23:03.855587+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
max_mds	1
in	
up	{}
failed	
damaged	
stopped	
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer	
bal_rank_mask	-1
standby_count_wanted	0
Jan 31 02:23:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e31 e31: 3 total, 3 up, 3 in
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.ddmhwk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: Reconfiguring mgr.compute-0.ddmhwk (monmap changed)...
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: Reconfiguring daemon mgr.compute-0.ddmhwk on compute-0
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 31 02:23:06 np0005603622 podman[83411]: 2026-01-31 07:23:06.392847971 +0000 UTC m=+0.037135728 container create 5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_chandrasekhar, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Jan 31 02:23:06 np0005603622 systemd[1]: Started libpod-conmon-5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f.scope.
Jan 31 02:23:06 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:23:06 np0005603622 podman[83411]: 2026-01-31 07:23:06.377011342 +0000 UTC m=+0.021299129 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:06 np0005603622 podman[83411]: 2026-01-31 07:23:06.481569876 +0000 UTC m=+0.125857663 container init 5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_chandrasekhar, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:06 np0005603622 podman[83411]: 2026-01-31 07:23:06.488675379 +0000 UTC m=+0.132963146 container start 5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_chandrasekhar, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:06 np0005603622 practical_chandrasekhar[83427]: 167 167
Jan 31 02:23:06 np0005603622 podman[83411]: 2026-01-31 07:23:06.492876393 +0000 UTC m=+0.137164170 container attach 5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_chandrasekhar, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:06 np0005603622 systemd[1]: libpod-5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f.scope: Deactivated successfully.
Jan 31 02:23:06 np0005603622 podman[83411]: 2026-01-31 07:23:06.494410194 +0000 UTC m=+0.138697971 container died 5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_chandrasekhar, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Jan 31 02:23:06 np0005603622 systemd[1]: var-lib-containers-storage-overlay-ce10cb916a0c342d27d50ff8adc981e82bf9f35eef739a916642a2982bd5130b-merged.mount: Deactivated successfully.
Jan 31 02:23:06 np0005603622 podman[83411]: 2026-01-31 07:23:06.540835483 +0000 UTC m=+0.185123240 container remove 5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_chandrasekhar, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True)
Jan 31 02:23:06 np0005603622 systemd[1]: libpod-conmon-5895c5365e56fa8b36793f592de9ad1a483614b29ea217776ffcbafefdd72a2f.scope: Deactivated successfully.
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: Reconfiguring osd.0 (monmap changed)...
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: Reconfiguring daemon osd.0 on compute-0
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 31 02:23:07 np0005603622 podman[83565]: 2026-01-31 07:23:07.034533578 +0000 UTC m=+0.047560381 container create 38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_yonath, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:23:07 np0005603622 systemd[1]: Started libpod-conmon-38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a.scope.
Jan 31 02:23:07 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:23:07 np0005603622 podman[83565]: 2026-01-31 07:23:07.101134164 +0000 UTC m=+0.114160967 container init 38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_yonath, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:23:07 np0005603622 podman[83565]: 2026-01-31 07:23:07.106428827 +0000 UTC m=+0.119455640 container start 38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_yonath, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Jan 31 02:23:07 np0005603622 podman[83565]: 2026-01-31 07:23:07.109709956 +0000 UTC m=+0.122736779 container attach 38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_yonath, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:23:07 np0005603622 friendly_yonath[83581]: 167 167
Jan 31 02:23:07 np0005603622 systemd[1]: libpod-38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a.scope: Deactivated successfully.
Jan 31 02:23:07 np0005603622 podman[83565]: 2026-01-31 07:23:07.11171125 +0000 UTC m=+0.124738083 container died 38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_yonath, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 31 02:23:07 np0005603622 podman[83565]: 2026-01-31 07:23:07.017545557 +0000 UTC m=+0.030572400 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:07 np0005603622 systemd[1]: var-lib-containers-storage-overlay-e041c687baf8d5d653b6ca7c6299048a27bdeb377ae21327c1e3b8311129bbc9-merged.mount: Deactivated successfully.
Jan 31 02:23:07 np0005603622 podman[83565]: 2026-01-31 07:23:07.150099221 +0000 UTC m=+0.163126044 container remove 38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_yonath, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:07 np0005603622 systemd[1]: libpod-conmon-38f9550c8410c64919afaecc91a6c05fdba33ebbc68af538864ce6947db64a1a.scope: Deactivated successfully.
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: Reconfiguring osd.1 (monmap changed)...
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: Reconfiguring daemon osd.1 on compute-1
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:23:07 np0005603622 podman[83725]: 2026-01-31 07:23:07.709834035 +0000 UTC m=+0.036259674 container create 2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_poincare, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:07 np0005603622 systemd[1]: Started libpod-conmon-2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23.scope.
Jan 31 02:23:07 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:23:07 np0005603622 podman[83725]: 2026-01-31 07:23:07.77380984 +0000 UTC m=+0.100235509 container init 2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_poincare, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:23:07 np0005603622 podman[83725]: 2026-01-31 07:23:07.780616774 +0000 UTC m=+0.107042413 container start 2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_poincare, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 31 02:23:07 np0005603622 podman[83725]: 2026-01-31 07:23:07.783511783 +0000 UTC m=+0.109937442 container attach 2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_poincare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:23:07 np0005603622 pedantic_poincare[83741]: 167 167
Jan 31 02:23:07 np0005603622 systemd[1]: libpod-2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23.scope: Deactivated successfully.
Jan 31 02:23:07 np0005603622 podman[83725]: 2026-01-31 07:23:07.786184435 +0000 UTC m=+0.112610074 container died 2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_poincare, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Jan 31 02:23:07 np0005603622 podman[83725]: 2026-01-31 07:23:07.693172223 +0000 UTC m=+0.019597892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:07 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0eb91ec93e2a9b0dd014be82ca5a7130019ab21afb3d382460c95cf77a8be4b2-merged.mount: Deactivated successfully.
Jan 31 02:23:07 np0005603622 podman[83725]: 2026-01-31 07:23:07.81474644 +0000 UTC m=+0.141172079 container remove 2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_poincare, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 31 02:23:07 np0005603622 systemd[1]: libpod-conmon-2663587e13ac20208db2853f9cde8fe5eff51a4baa19428a26e676a11e5a4c23.scope: Deactivated successfully.
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/720950555' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/720950555' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: Reconfiguring mgr.compute-2.cdjvtw (monmap changed)...
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.cdjvtw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 31 02:23:08 np0005603622 ceph-mon[81571]: Reconfiguring daemon mgr.compute-2.cdjvtw on compute-2
Jan 31 02:23:09 np0005603622 podman[83930]: 2026-01-31 07:23:09.62916733 +0000 UTC m=+0.054621142 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 31 02:23:09 np0005603622 podman[83930]: 2026-01-31 07:23:09.714606116 +0000 UTC m=+0.140059868 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:23:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:23:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:23:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:13 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/992138592' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 31 02:23:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aejomu", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 31 02:23:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.aejomu", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 31 02:23:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:17 np0005603622 ceph-mon[81571]: Deploying daemon rgw.rgw.compute-2.aejomu on compute-2
Jan 31 02:23:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:18 np0005603622 podman[84159]: 2026-01-31 07:23:18.08202184 +0000 UTC m=+0.047169620 container create 6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_villani, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:18 np0005603622 systemd[1]: Started libpod-conmon-6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6.scope.
Jan 31 02:23:18 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:23:18 np0005603622 podman[84159]: 2026-01-31 07:23:18.057396543 +0000 UTC m=+0.022544393 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:18 np0005603622 podman[84159]: 2026-01-31 07:23:18.160721544 +0000 UTC m=+0.125869364 container init 6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_villani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 31 02:23:18 np0005603622 podman[84159]: 2026-01-31 07:23:18.168041652 +0000 UTC m=+0.133189512 container start 6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_villani, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Jan 31 02:23:18 np0005603622 podman[84159]: 2026-01-31 07:23:18.172656977 +0000 UTC m=+0.137804837 container attach 6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_villani, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 31 02:23:18 np0005603622 busy_villani[84176]: 167 167
Jan 31 02:23:18 np0005603622 systemd[1]: libpod-6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6.scope: Deactivated successfully.
Jan 31 02:23:18 np0005603622 conmon[84176]: conmon 6ab2af510720b13ee47e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6.scope/container/memory.events
Jan 31 02:23:18 np0005603622 podman[84159]: 2026-01-31 07:23:18.175415812 +0000 UTC m=+0.140563582 container died 6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_villani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:23:18 np0005603622 systemd[1]: var-lib-containers-storage-overlay-370c1fc9f71f441a2d6a360fabd9c09a76c50c8609beebb28251e10ee25dd9ca-merged.mount: Deactivated successfully.
Jan 31 02:23:18 np0005603622 podman[84159]: 2026-01-31 07:23:18.220182746 +0000 UTC m=+0.185330536 container remove 6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_villani, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:23:18 np0005603622 systemd[1]: libpod-conmon-6ab2af510720b13ee47e8ae947a71ba1287a7a2c9fc7c233d3616fc411524ec6.scope: Deactivated successfully.
Jan 31 02:23:18 np0005603622 systemd[1]: Reloading.
Jan 31 02:23:18 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:18 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.bjsbdg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.bjsbdg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: Deploying daemon rgw.rgw.compute-1.bjsbdg on compute-1
Jan 31 02:23:18 np0005603622 systemd[1]: Reloading.
Jan 31 02:23:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e32 e32: 3 total, 3 up, 3 in
Jan 31 02:23:18 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:18 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:18 np0005603622 systemd[1]: Starting Ceph rgw.rgw.compute-1.bjsbdg for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:23:19 np0005603622 podman[84318]: 2026-01-31 07:23:18.999235817 +0000 UTC m=+0.035320619 container create 6d11f847e30268a96d8ce77469288248f2d4a98bc84d874b027f544b57f63126 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-1-bjsbdg, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 31 02:23:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02db33e3fc03d4703391d78e5a7d47affd4a1de269e10a2a4e88463b8cd297f0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02db33e3fc03d4703391d78e5a7d47affd4a1de269e10a2a4e88463b8cd297f0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02db33e3fc03d4703391d78e5a7d47affd4a1de269e10a2a4e88463b8cd297f0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02db33e3fc03d4703391d78e5a7d47affd4a1de269e10a2a4e88463b8cd297f0/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.bjsbdg supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:19 np0005603622 podman[84318]: 2026-01-31 07:23:19.072606086 +0000 UTC m=+0.108690888 container init 6d11f847e30268a96d8ce77469288248f2d4a98bc84d874b027f544b57f63126 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-1-bjsbdg, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 31 02:23:19 np0005603622 podman[84318]: 2026-01-31 07:23:18.980693404 +0000 UTC m=+0.016778156 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:19 np0005603622 podman[84318]: 2026-01-31 07:23:19.08088692 +0000 UTC m=+0.116971632 container start 6d11f847e30268a96d8ce77469288248f2d4a98bc84d874b027f544b57f63126 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-1-bjsbdg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:19 np0005603622 bash[84318]: 6d11f847e30268a96d8ce77469288248f2d4a98bc84d874b027f544b57f63126
Jan 31 02:23:19 np0005603622 systemd[1]: Started Ceph rgw.rgw.compute-1.bjsbdg for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:23:19 np0005603622 radosgw[84338]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:23:19 np0005603622 radosgw[84338]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Jan 31 02:23:19 np0005603622 radosgw[84338]: framework: beast
Jan 31 02:23:19 np0005603622 radosgw[84338]: framework conf key: endpoint, val: 192.168.122.101:8082
Jan 31 02:23:19 np0005603622 radosgw[84338]: init_numa not setting numa affinity
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e33 e33: 3 total, 3 up, 3 in
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.pnpmok", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.pnpmok", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 31 02:23:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e34 e34: 3 total, 3 up, 3 in
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4082344861' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: Deploying daemon rgw.rgw.compute-0.pnpmok on compute-0
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.101:0/4082344861' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:20 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.asgtzy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.asgtzy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 31 02:23:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e3 new map
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:03.855587+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.asgtzy{-1:24157} state up:standby seq 1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: Deploying daemon mds.cephfs.compute-2.asgtzy on compute-2
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:22 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 36 pg[10.0( empty local-lis/les=0/0 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [1] r=0 lpr=36 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4082344861' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e4 new map
Jan 31 02:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:22.914433+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:creating seq 1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 31 02:23:23 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 37 pg[10.0( empty local-lis/les=36/37 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [1] r=0 lpr=36 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jroeqh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: daemon mds.cephfs.compute-2.asgtzy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3859572337' entity='client.rgw.rgw.compute-0.pnpmok' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.101:0/4082344861' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.102:0/665557881' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jroeqh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: Deploying daemon mds.cephfs.compute-0.jroeqh on compute-0
Jan 31 02:23:23 np0005603622 ceph-mon[81571]: daemon mds.cephfs.compute-2.asgtzy is now active in filesystem cephfs as rank 0
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e5 new map
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:23.967653+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1698301580' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/3859572337' entity='client.rgw.rgw.compute-0.pnpmok' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bkrghs", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bkrghs", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: Deploying daemon mds.cephfs.compute-1.bkrghs on compute-1
Jan 31 02:23:24 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e6 new map
Jan 31 02:23:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:23.967653+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e7 new map
Jan 31 02:23:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-31T07:23:03.855545+0000#012modified#0112026-01-31T07:23:23.967653+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:25 np0005603622 systemd[72676]: Starting Mark boot as successful...
Jan 31 02:23:25 np0005603622 systemd[72676]: Finished Mark boot as successful.
Jan 31 02:23:25 np0005603622 podman[84551]: 2026-01-31 07:23:25.138830135 +0000 UTC m=+0.035401380 container create 26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_tu, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 31 02:23:25 np0005603622 systemd[1]: Started libpod-conmon-26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1.scope.
Jan 31 02:23:25 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:23:25 np0005603622 podman[84551]: 2026-01-31 07:23:25.195375588 +0000 UTC m=+0.091946843 container init 26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_tu, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 02:23:25 np0005603622 podman[84551]: 2026-01-31 07:23:25.201197596 +0000 UTC m=+0.097768841 container start 26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_tu, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Jan 31 02:23:25 np0005603622 podman[84551]: 2026-01-31 07:23:25.204011252 +0000 UTC m=+0.100582497 container attach 26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_tu, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 31 02:23:25 np0005603622 youthful_tu[84567]: 167 167
Jan 31 02:23:25 np0005603622 systemd[1]: libpod-26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1.scope: Deactivated successfully.
Jan 31 02:23:25 np0005603622 conmon[84567]: conmon 26821b84559f417fdc29 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1.scope/container/memory.events
Jan 31 02:23:25 np0005603622 podman[84551]: 2026-01-31 07:23:25.206594642 +0000 UTC m=+0.103165917 container died 26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_tu, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:25 np0005603622 podman[84551]: 2026-01-31 07:23:25.122103681 +0000 UTC m=+0.018674946 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:25 np0005603622 systemd[1]: var-lib-containers-storage-overlay-d9e5c02f95f8ac5913669a3369290166f35d066d9bb9cb15e1388ae6141b9bb5-merged.mount: Deactivated successfully.
Jan 31 02:23:25 np0005603622 podman[84551]: 2026-01-31 07:23:25.24967291 +0000 UTC m=+0.146244155 container remove 26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_tu, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Jan 31 02:23:25 np0005603622 systemd[1]: libpod-conmon-26821b84559f417fdc29b98bc10d46675b0d269a2b0f95b95af229aff137e0b1.scope: Deactivated successfully.
Jan 31 02:23:25 np0005603622 systemd[1]: Reloading.
Jan 31 02:23:25 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:25 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:25 np0005603622 systemd[1]: Reloading.
Jan 31 02:23:25 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:23:25 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:23:25 np0005603622 systemd[1]: Starting Ceph mds.cephfs.compute-1.bkrghs for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2...
Jan 31 02:23:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 31 02:23:26 np0005603622 podman[84711]: 2026-01-31 07:23:26.004649198 +0000 UTC m=+0.111180395 container create ba465f398f03238b92eeb8602e4578c5b18f0c6f5000c99cebb00fd945a774b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-1-bkrghs, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 31 02:23:26 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.102:0/1907859104' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.101:0/1698301580' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 31 02:23:26 np0005603622 podman[84711]: 2026-01-31 07:23:25.924036722 +0000 UTC m=+0.030567959 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:23:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Jan 31 02:23:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1698301580' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c0bb35c060012d2abe4379a728e021d2b5661585041edac5fe002d9b5e20a2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c0bb35c060012d2abe4379a728e021d2b5661585041edac5fe002d9b5e20a2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c0bb35c060012d2abe4379a728e021d2b5661585041edac5fe002d9b5e20a2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c0bb35c060012d2abe4379a728e021d2b5661585041edac5fe002d9b5e20a2/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.bkrghs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:23:26 np0005603622 podman[84711]: 2026-01-31 07:23:26.072536138 +0000 UTC m=+0.179067395 container init ba465f398f03238b92eeb8602e4578c5b18f0c6f5000c99cebb00fd945a774b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-1-bkrghs, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:26 np0005603622 podman[84711]: 2026-01-31 07:23:26.07703893 +0000 UTC m=+0.183570147 container start ba465f398f03238b92eeb8602e4578c5b18f0c6f5000c99cebb00fd945a774b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-1-bkrghs, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 31 02:23:26 np0005603622 bash[84711]: ba465f398f03238b92eeb8602e4578c5b18f0c6f5000c99cebb00fd945a774b0
Jan 31 02:23:26 np0005603622 systemd[1]: Started Ceph mds.cephfs.compute-1.bkrghs for 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2.
Jan 31 02:23:26 np0005603622 ceph-mds[84731]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:23:26 np0005603622 ceph-mds[84731]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Jan 31 02:23:26 np0005603622 ceph-mds[84731]: main not setting numa affinity
Jan 31 02:23:26 np0005603622 ceph-mds[84731]: pidfile_write: ignore empty --pid-file
Jan 31 02:23:26 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mds-cephfs-compute-1-bkrghs[84727]: starting mds.cephfs.compute-1.bkrghs at 
Jan 31 02:23:26 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Updating MDS map to version 7 from mon.2
Jan 31 02:23:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.102:0/1907859104' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.101:0/1698301580' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: Deploying daemon haproxy.rgw.default.compute-0.evwczw on compute-0
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? 192.168.122.100:0/487760849' entity='client.rgw.rgw.compute-0.pnpmok' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-2.aejomu' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: from='client.? ' entity='client.rgw.rgw.compute-1.bjsbdg' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e8 new map
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e8 print_map
e8
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	8
flags	12 joinable allow_snaps allow_multimds_snaps
created	2026-01-31T07:23:03.855545+0000
modified	2026-01-31T07:23:27.037579+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
max_mds	1
in	0
up	{0=24157}
failed
damaged
stopped
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer
bal_rank_mask	-1
standby_count_wanted	1
[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]

Standby daemons:

[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]
[mds.cephfs.compute-1.bkrghs{-1:24146} state up:standby seq 1 addr [v2:192.168.122.101:6804/4027255140,v1:192.168.122.101:6805/4027255140] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:27 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Updating MDS map to version 8 from mon.2
Jan 31 02:23:27 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Monitors have assigned me to become a standby.
Jan 31 02:23:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:27 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-rgw-rgw-compute-1-bjsbdg[84334]: 2026-01-31T07:23:27.237+0000 7f52aadf9940 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 31 02:23:27 np0005603622 radosgw[84338]: LDAP not started since no server URIs were provided in the configuration.
Jan 31 02:23:27 np0005603622 radosgw[84338]: framework: beast
Jan 31 02:23:27 np0005603622 radosgw[84338]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 31 02:23:27 np0005603622 radosgw[84338]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 31 02:23:27 np0005603622 radosgw[84338]: starting handler: beast
Jan 31 02:23:27 np0005603622 radosgw[84338]: set uid:gid to 167:167 (ceph:ceph)
Jan 31 02:23:27 np0005603622 radosgw[84338]: mgrc service_daemon_register rgw.24134 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.bjsbdg,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864300,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=4a080c62-ffea-408b-958a-f1cf7a54b487,zone_name=default,zonegroup_id=4b3ae999-cc0f-4a4e-9689-957f89598a27,zonegroup_name=default}
Jan 31 02:23:29 np0005603622 ceph-mon[81571]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 31 02:23:29 np0005603622 ceph-mon[81571]: Cluster is now healthy
Jan 31 02:23:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e9 new map
Jan 31 02:23:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e9 print_map
e9
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	8
flags	12 joinable allow_snaps allow_multimds_snaps
created	2026-01-31T07:23:03.855545+0000
modified	2026-01-31T07:23:27.037579+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
max_mds	1
in	0
up	{0=24157}
failed
damaged
stopped
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer
bal_rank_mask	-1
standby_count_wanted	1
[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]

Standby daemons:

[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]
[mds.cephfs.compute-1.bkrghs{-1:24146} state up:standby seq 1 addr [v2:192.168.122.101:6804/4027255140,v1:192.168.122.101:6805/4027255140] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.003000081s ======
Jan 31 02:23:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:30.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Jan 31 02:23:31 np0005603622 ceph-mon[81571]: Deploying daemon haproxy.rgw.default.compute-2.yyrexo on compute-2
Jan 31 02:23:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e10 new map
Jan 31 02:23:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).mds e10 print_map
e10
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	8
flags	12 joinable allow_snaps allow_multimds_snaps
created	2026-01-31T07:23:03.855545+0000
modified	2026-01-31T07:23:27.037579+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
max_mds	1
in	0
up	{0=24157}
failed
damaged
stopped
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer
bal_rank_mask	-1
standby_count_wanted	1
[mds.cephfs.compute-2.asgtzy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2751451154,v1:192.168.122.102:6805/2751451154] compat {c=[1],r=[1],i=[7ff]}]

Standby daemons:

[mds.cephfs.compute-0.jroeqh{-1:14409} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/105956008,v1:192.168.122.100:6807/105956008] compat {c=[1],r=[1],i=[7ff]}]
[mds.cephfs.compute-1.bkrghs{-1:24146} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/4027255140,v1:192.168.122.101:6805/4027255140] compat {c=[1],r=[1],i=[7ff]}]
Jan 31 02:23:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Updating MDS map to version 10 from mon.2
Jan 31 02:23:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:23:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:32.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:23:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:34.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:34 np0005603622 ceph-mon[81571]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 31 02:23:34 np0005603622 ceph-mon[81571]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 31 02:23:34 np0005603622 ceph-mon[81571]: Deploying daemon keepalived.rgw.default.compute-0.wujrgc on compute-0
Jan 31 02:23:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:34.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:36.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:23:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:36.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:23:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:38.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:38.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 31 02:23:39 np0005603622 ceph-mon[81571]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 31 02:23:39 np0005603622 ceph-mon[81571]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 31 02:23:39 np0005603622 ceph-mon[81571]: Deploying daemon keepalived.rgw.default.compute-2.voilty on compute-2
Jan 31 02:23:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:23:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:40.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:23:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 31 02:23:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:40.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 42 pg[2.0( empty local-lis/les=14/15 n=0 ec=14/14 lis/c=14/14 les/c/f=15/15/0 sis=42 pruub=11.717067719s) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active pruub 110.295532227s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 42 pg[2.0( empty local-lis/les=14/15 n=0 ec=14/14 lis/c=14/14 les/c/f=15/15/0 sis=42 pruub=11.717067719s) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown pruub 110.295532227s@ mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1f( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1e( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1d( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1c( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1b( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.7( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1a( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.6( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.4( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.9( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.2( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.8( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.5( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.3( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.c( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.b( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.a( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.d( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.e( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.f( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.10( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.11( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.12( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.13( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.14( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.16( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.15( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.17( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.18( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.19( empty local-lis/les=14/15 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1d( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1e( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1c( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1b( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1f( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.7( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.9( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.4( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.8( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.2( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1a( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.6( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.0( empty local-lis/les=42/43 n=0 ec=14/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.1( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.5( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.3( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.b( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.a( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.c( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.d( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.e( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.f( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.11( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.10( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.12( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.14( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.16( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.13( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.17( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.15( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.18( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 43 pg[2.19( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=14/14 les/c/f=15/15/0 sis=42) [1] r=0 lpr=42 pi=[14,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:23:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:42.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:23:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:42.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:23:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Jan 31 02:23:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Jan 31 02:23:42 np0005603622 podman[85517]: 2026-01-31 07:23:42.940535735 +0000 UTC m=+0.049614612 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:23:43 np0005603622 podman[85517]: 2026-01-31 07:23:43.02592504 +0000 UTC m=+0.135003907 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:23:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 31 02:23:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Jan 31 02:23:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Jan 31 02:23:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:43 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.2 deep-scrub starts
Jan 31 02:23:43 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.2 deep-scrub ok
Jan 31 02:23:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:44.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:23:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:44.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 31 02:23:45 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 47 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=47 pruub=14.033850670s) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active pruub 117.321350098s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:45 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 47 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=47 pruub=14.033850670s) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown pruub 117.321350098s@ mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:46.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1c( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.12( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.13( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1d( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.11( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.10( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.16( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.17( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.14( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.15( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.a( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.8( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.b( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.9( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.f( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.6( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.5( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.e( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.d( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.4( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.7( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.2( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.c( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1f( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.3( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1e( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.19( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1b( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.18( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1a( empty local-lis/les=21/22 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1c( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.12( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.10( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1d( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.11( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.17( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.13( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.14( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.15( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.a( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.16( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.b( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.f( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.9( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.0( empty local-lis/les=47/48 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.6( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.5( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.d( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.8( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.7( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.e( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.4( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.2( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.c( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1e( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1f( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.3( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.19( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1b( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.1a( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 48 pg[7.18( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=21/21 les/c/f=22/22/0 sis=47) [1] r=0 lpr=47 pi=[21,47)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:23:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:46.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 31 02:23:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:48.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 31 02:23:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:23:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:48.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:23:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 31 02:23:49 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Jan 31 02:23:49 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Jan 31 02:23:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:23:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:50.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:23:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 31 02:23:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:50 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 51 pg[10.0( v 37'48 (0'0,37'48] local-lis/les=36/37 n=8 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=51 pruub=13.434968948s) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 37'47 mlcod 37'47 active pruub 121.749481201s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:50 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 51 pg[10.0( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=51 pruub=13.434968948s) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 37'47 mlcod 0'0 unknown pruub 121.749481201s@ mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:50.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:51 np0005603622 podman[85859]: 2026-01-31 07:23:51.19425422 +0000 UTC m=+0.046153415 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 02:23:51 np0005603622 podman[85859]: 2026-01-31 07:23:51.271800211 +0000 UTC m=+0.123699366 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 31 02:23:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.10( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1e( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.11( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1f( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1d( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1b( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1c( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1a( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.19( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.18( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.7( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.6( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.5( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.4( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.3( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.b( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.8( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.d( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.2( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.a( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.9( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.c( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.e( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.f( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1( v 37'48 (0'0,37'48] local-lis/les=36/37 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.12( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.13( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.15( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.16( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.17( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.14( v 37'48 lc 0'0 (0'0,37'48] local-lis/les=36/37 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1e( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.10( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1c( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1b( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1a( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.11( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1d( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.19( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.18( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.6( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1f( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.5( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.7( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.4( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.b( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.3( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.d( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.2( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.0( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 37'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.c( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.a( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.e( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.f( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.8( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.12( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.13( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.15( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.16( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.1( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.14( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.17( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 52 pg[10.9( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=36/36 les/c/f=37/37/0 sis=51) [1] r=0 lpr=51 pi=[36,51)/1 crt=37'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000044s ======
Jan 31 02:23:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:52.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000044s
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:23:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:52.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:23:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:54.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:23:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:23:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:54.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:23:54 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Jan 31 02:23:54 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.12( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.1f( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.1e( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.10( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.16( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.11( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.13( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.1c( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.14( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.1d( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.19( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.1a( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.18( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.1b( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.13( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.15( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.1b( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.16( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.10( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.7( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.4( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.f( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.9( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.4( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.c( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.5( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.a( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.d( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.d( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.a( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.c( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.5( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.7( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.2( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.e( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.1( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.f( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.8( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.3( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.5( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.f( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.1( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.10( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.12( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.1c( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.1a( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.1b( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.17( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.1b( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[11.14( empty local-lis/les=0/0 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[3.1c( empty local-lis/les=0/0 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[4.18( empty local-lis/les=0/0 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[8.14( empty local-lis/les=0/0 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[5.18( empty local-lis/les=0/0 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1f( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.611216545s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.207130432s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1f( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.611182213s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.207130432s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1b( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715688705s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311752319s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1d( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610976219s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.207084656s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1b( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715654373s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311752319s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.18( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715687752s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311813354s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1d( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610944748s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.207084656s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.18( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715658188s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311813354s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.15( v 52'51 (0'0,52'51] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986928940s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=52'49 lcod 52'50 mlcod 52'50 active pruub 125.583145142s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1c( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610887527s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.207115173s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1c( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610874176s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.207115173s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.15( v 52'51 (0'0,52'51] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986892700s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=52'49 lcod 52'50 mlcod 0'0 unknown NOTIFY pruub 125.583145142s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.14( v 52'51 (0'0,52'51] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986915588s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=52'49 lcod 52'50 mlcod 52'50 active pruub 125.583198547s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1b( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610795975s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.207122803s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1b( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610780716s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.207122803s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.14( v 52'51 (0'0,52'51] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986861229s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=52'49 lcod 52'50 mlcod 0'0 unknown NOTIFY pruub 125.583198547s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1e( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715311050s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311691284s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1e( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715291977s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311691284s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.13( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986670494s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.583122253s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1f( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715247154s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311737061s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.13( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986628532s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.583122253s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1f( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.715226173s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311737061s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.12( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986595154s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.583114624s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.12( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986574173s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.583114624s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.1( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986549377s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.583183289s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.9( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654713631s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251358032s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.2( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714959145s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311599731s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.1( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986527443s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.583183289s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.9( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654693604s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251358032s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.2( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714938164s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311599731s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.f( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986330986s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.583084106s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.3( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714871407s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311660767s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.6( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654600143s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251411438s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.3( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714829445s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311660767s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.f( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.986269951s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.583084106s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.6( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654583931s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251411438s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.4( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654481888s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251365662s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.4( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654449463s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251365662s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1e( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610182762s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.207168579s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654419899s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251441956s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1e( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610154152s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.207168579s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.1( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654383659s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251441956s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.5( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654357910s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251441956s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.5( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714308739s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311416626s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.5( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654340744s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251441956s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.5( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714292526s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311416626s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.6( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714003563s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311203003s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.4( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714382172s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311599731s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.f( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713786125s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311019897s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.6( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713986397s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311203003s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.a( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654211998s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251464844s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.f( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713764191s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311019897s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.a( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654193878s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251464844s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.8( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.985802650s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.583091736s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.2( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.985073090s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582405090s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.8( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.985782623s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.583091736s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.2( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.985058784s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582405090s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.b( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654104233s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251457214s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.e( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714199066s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311569214s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.e( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714185715s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311569214s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.b( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.654088974s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251457214s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.9( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713752747s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311187744s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.3( v 52'51 (0'0,52'51] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984917641s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=52'49 lcod 52'50 mlcod 52'50 active pruub 125.582359314s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.9( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713737488s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311187744s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.c( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653993607s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251480103s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.4( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984863281s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582359314s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.3( v 52'51 (0'0,52'51] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984877586s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=52'49 lcod 52'50 mlcod 0'0 unknown NOTIFY pruub 125.582359314s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.c( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653971672s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251480103s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.4( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984829903s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582359314s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.8( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714004517s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311553955s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.4( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.714049339s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311599731s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.d( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653898239s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251495361s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.8( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713951111s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311553955s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.d( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653878212s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251495361s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.5( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984670639s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582328796s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.e( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653857231s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251556396s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.b( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713306427s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311004639s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.5( v 37'48 (0'0,37'48] local-lis/les=51/52 n=1 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984628677s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582328796s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.e( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653842926s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251556396s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.a( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713234901s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.310989380s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.a( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713204384s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.310989380s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.b( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.713286400s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311004639s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.10( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653864861s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251716614s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.10( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653840065s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251716614s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.14( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.712954521s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.310867310s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.18( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984394073s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582275391s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.14( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.712921143s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.310867310s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.18( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984327316s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582275391s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.f( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653887749s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251708984s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.12( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653736115s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251739502s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.12( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653711319s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251739502s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.19( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984213829s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582267761s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.13( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653688431s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251800537s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.16( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.712890625s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.311004639s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.19( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984189034s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582267761s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.16( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.712864876s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.311004639s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.1b( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984045029s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582191467s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.13( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653665543s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251800537s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.1b( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.984028816s) [0] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582191467s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.f( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.653705597s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251708984s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.11( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.712532043s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.310760498s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.11( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.712506294s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.310760498s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.10( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.669882774s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.310806274s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.15( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610917091s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251853943s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.15( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610854149s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251853943s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.13( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.669760704s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.310852051s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.13( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.669735909s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.310852051s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.1e( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.890715599s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.531875610s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.1e( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.890667915s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.531875610s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1d( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.669491768s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 active pruub 128.310745239s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.10( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.940843582s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582115173s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.18( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610625267s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251899719s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.1d( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.669463158s) [2] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.310745239s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.19( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610637665s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 active pruub 123.251976013s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[7.10( empty local-lis/les=47/48 n=0 ec=47/21 lis/c=47/47 les/c/f=48/48/0 sis=53 pruub=14.669471741s) [0] r=-1 lpr=53 pi=[47,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 128.310806274s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.10( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.940777779s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582115173s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.19( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610622406s) [0] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251976013s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[2.18( empty local-lis/les=42/43 n=0 ec=42/14 lis/c=42/42 les/c/f=43/43/0 sis=53 pruub=9.610540390s) [2] r=-1 lpr=53 pi=[42,53)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 123.251899719s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.11( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.940556526s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 active pruub 125.582115173s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 53 pg[10.11( v 37'48 (0'0,37'48] local-lis/les=51/52 n=0 ec=51/36 lis/c=51/51 les/c/f=52/52/0 sis=53 pruub=11.940537453s) [2] r=-1 lpr=53 pi=[51,53)/1 crt=37'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 125.582115173s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Jan 31 02:23:55 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Jan 31 02:23:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:56.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:56.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.1e( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.10( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.1f( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.16( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.11( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.12( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.1c( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.1d( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.19( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.1a( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.18( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.14( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.1b( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.13( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.13( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.1b( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.7( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.10( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.4( v 33'8 (0'0,33'8] local-lis/les=53/54 n=1 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.15( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.f( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.16( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.9( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.5( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.a( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.c( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.d( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.a( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.5( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.7( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.c( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.d( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.f( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.1( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.e( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.3( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.8( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.2( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.1( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.5( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.f( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.10( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.1c( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.1a( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.1b( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.1b( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.17( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.14( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.12( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[3.1c( empty local-lis/les=53/54 n=0 ec=44/16 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[8.14( v 33'8 (0'0,33'8] local-lis/les=53/54 n=0 ec=49/32 lis/c=49/49 les/c/f=50/50/0 sis=53) [1] r=0 lpr=53 pi=[49,53)/1 crt=33'8 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[4.18( empty local-lis/les=53/54 n=0 ec=44/17 lis/c=44/44 les/c/f=45/45/0 sis=53) [1] r=0 lpr=53 pi=[44,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[11.4( empty local-lis/les=53/54 n=0 ec=51/38 lis/c=51/51 les/c/f=52/52/0 sis=53) [1] r=0 lpr=53 pi=[51,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 54 pg[5.18( empty local-lis/les=53/54 n=0 ec=46/19 lis/c=46/46 les/c/f=47/47/0 sis=53) [1] r=0 lpr=53 pi=[46,53)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Jan 31 02:23:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Jan 31 02:23:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 55 pg[6.a( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 55 pg[6.e( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 55 pg[6.6( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 55 pg[6.2( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:23:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Jan 31 02:23:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 31 02:23:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 31 02:23:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:23:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:23:58.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:23:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:23:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:23:58.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:23:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 31 02:23:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 31 02:23:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 31 02:23:58 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 56 pg[6.a( v 40'39 (0'0,40'39] local-lis/les=55/56 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:58 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 56 pg[6.2( v 40'39 (0'0,40'39] local-lis/les=55/56 n=2 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:58 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 56 pg[6.e( v 40'39 lc 36'19 (0'0,40'39] local-lis/les=55/56 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:23:58 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 56 pg[6.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=55/56 n=2 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=55) [1] r=0 lpr=55 pi=[46,55)/1 crt=40'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:00.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:00.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:00 np0005603622 ceph-mon[81571]: Health check failed: Degraded data redundancy: 10/215 objects degraded (4.651%), 7 pgs degraded (PG_DEGRADED)
Jan 31 02:24:01 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:02.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:02.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Jan 31 02:24:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Jan 31 02:24:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:04.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:04.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:04 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Jan 31 02:24:04 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Jan 31 02:24:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:24:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:06.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:24:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 31 02:24:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 31 02:24:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 31 02:24:06 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 57 pg[6.b( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 57 pg[6.f( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 57 pg[6.7( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 57 pg[6.3( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:06.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:07 np0005603622 ceph-mon[81571]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 1/215 objects degraded (0.465%), 1 pg degraded)
Jan 31 02:24:07 np0005603622 ceph-mon[81571]: Cluster is now healthy
Jan 31 02:24:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 31 02:24:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 31 02:24:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 31 02:24:07 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 58 pg[6.b( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=57/58 n=1 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:07 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 58 pg[6.f( v 40'39 lc 36'1 (0'0,40'39] local-lis/les=57/58 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:07 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 58 pg[6.7( v 40'39 lc 36'21 (0'0,40'39] local-lis/les=57/58 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:07 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 58 pg[6.3( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=57/58 n=2 ec=46/20 lis/c=53/53 les/c/f=54/55/0 sis=57) [1] r=0 lpr=57 pi=[53,57)/1 crt=40'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:24:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:08.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:24:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 31 02:24:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 31 02:24:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 31 02:24:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 31 02:24:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:08.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:09 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Jan 31 02:24:09 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Jan 31 02:24:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 31 02:24:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 31 02:24:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 31 02:24:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 31 02:24:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 31 02:24:09 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 61 pg[6.d( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61) [1] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:09 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 61 pg[6.5( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61) [1] r=0 lpr=61 pi=[53,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:24:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:10.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:24:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:24:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:10.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:24:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 31 02:24:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 31 02:24:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 31 02:24:10 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 62 pg[6.d( v 40'39 lc 36'13 (0'0,40'39] local-lis/les=61/62 n=1 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61) [1] r=0 lpr=61 pi=[53,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 62 pg[6.5( v 40'39 lc 36'11 (0'0,40'39] local-lis/les=61/62 n=2 ec=46/20 lis/c=53/53 les/c/f=54/54/0 sis=61) [1] r=0 lpr=61 pi=[53,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Jan 31 02:24:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 31 02:24:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 31 02:24:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[9.1e( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=63) [1] r=0 lpr=63 pi=[49,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[9.6( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=63) [1] r=0 lpr=63 pi=[49,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[9.e( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=63) [1] r=0 lpr=63 pi=[49,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[9.16( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=63) [1] r=0 lpr=63 pi=[49,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[6.6( v 40'39 (0'0,40'39] local-lis/les=55/56 n=2 ec=46/20 lis/c=55/55 les/c/f=56/56/0 sis=63 pruub=11.048409462s) [0] r=-1 lpr=63 pi=[55,63)/1 crt=40'39 mlcod 40'39 active pruub 140.768051147s@ mbc={255={}}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[6.6( v 40'39 (0'0,40'39] local-lis/les=55/56 n=2 ec=46/20 lis/c=55/55 les/c/f=56/56/0 sis=63 pruub=11.048357964s) [0] r=-1 lpr=63 pi=[55,63)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 140.768051147s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[6.e( v 40'39 (0'0,40'39] local-lis/les=55/56 n=1 ec=46/20 lis/c=55/55 les/c/f=56/56/0 sis=63 pruub=11.048009872s) [0] r=-1 lpr=63 pi=[55,63)/1 crt=40'39 mlcod 40'39 active pruub 140.767898560s@ mbc={255={}}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 63 pg[6.e( v 40'39 (0'0,40'39] local-lis/les=55/56 n=1 ec=46/20 lis/c=55/55 les/c/f=56/56/0 sis=63 pruub=11.047974586s) [0] r=-1 lpr=63 pi=[55,63)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 140.767898560s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:12.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Jan 31 02:24:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:12.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 31 02:24:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 31 02:24:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.16( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.16( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.e( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.e( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.6( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.6( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.1e( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 64 pg[9.1e( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=64) [1]/[0] r=-1 lpr=64 pi=[49,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:13 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Jan 31 02:24:13 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Jan 31 02:24:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 31 02:24:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:14.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:24:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:14.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:24:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.6( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.e( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.6( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 66 pg[9.e( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:15 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.c scrub starts
Jan 31 02:24:15 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.c scrub ok
Jan 31 02:24:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 31 02:24:15 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 67 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:15 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 67 pg[9.e( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=6 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:15 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 67 pg[9.6( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=6 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:16 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 67 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=64/49 les/c/f=65/50/0 sis=66) [1] r=0 lpr=66 pi=[49,66)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:16.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:16.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:16 np0005603622 systemd[1]: session-19.scope: Deactivated successfully.
Jan 31 02:24:16 np0005603622 systemd[1]: session-19.scope: Consumed 7.589s CPU time.
Jan 31 02:24:16 np0005603622 systemd-logind[817]: Session 19 logged out. Waiting for processes to exit.
Jan 31 02:24:16 np0005603622 systemd-logind[817]: Removed session 19.
Jan 31 02:24:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:24:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:18.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:24:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:18.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 31 02:24:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 31 02:24:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 31 02:24:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:20.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 31 02:24:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 31 02:24:20 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.d deep-scrub starts
Jan 31 02:24:20 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.d deep-scrub ok
Jan 31 02:24:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:20.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 31 02:24:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 31 02:24:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 31 02:24:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:22.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:22 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 69 pg[6.8( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=69) [1] r=0 lpr=69 pi=[46,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:22.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 31 02:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 31 02:24:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 31 02:24:22 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 70 pg[6.8( v 40'39 (0'0,40'39] local-lis/les=69/70 n=1 ec=46/20 lis/c=46/46 les/c/f=47/47/0 sis=69) [1] r=0 lpr=69 pi=[46,69)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 31 02:24:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 31 02:24:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 31 02:24:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000022s ======
Jan 31 02:24:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:24.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 31 02:24:24 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Jan 31 02:24:24 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Jan 31 02:24:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:24.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 31 02:24:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 31 02:24:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 31 02:24:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 31 02:24:25 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 73 pg[9.1a( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=73) [1] r=0 lpr=73 pi=[49,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:25 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 73 pg[9.a( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=73) [1] r=0 lpr=73 pi=[49,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 31 02:24:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 31 02:24:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 31 02:24:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 31 02:24:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:26.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:26.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 31 02:24:26 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 74 pg[9.a( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=74) [1]/[0] r=-1 lpr=74 pi=[49,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:26 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 74 pg[9.a( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=74) [1]/[0] r=-1 lpr=74 pi=[49,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:26 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 74 pg[9.1a( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=74) [1]/[0] r=-1 lpr=74 pi=[49,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:26 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 74 pg[9.1a( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=74) [1]/[0] r=-1 lpr=74 pi=[49,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 31 02:24:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:28.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 31 02:24:28 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 76 pg[9.a( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=74/49 les/c/f=75/50/0 sis=76) [1] r=0 lpr=76 pi=[49,76)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:28 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 76 pg[9.a( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=74/49 les/c/f=75/50/0 sis=76) [1] r=0 lpr=76 pi=[49,76)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:28 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 76 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=74/49 les/c/f=75/50/0 sis=76) [1] r=0 lpr=76 pi=[49,76)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:28 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 76 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=74/49 les/c/f=75/50/0 sis=76) [1] r=0 lpr=76 pi=[49,76)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:28.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 31 02:24:29 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 77 pg[9.a( v 40'1015 (0'0,40'1015] local-lis/les=76/77 n=6 ec=49/34 lis/c=74/49 les/c/f=75/50/0 sis=76) [1] r=0 lpr=76 pi=[49,76)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:29 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 77 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=76/77 n=5 ec=49/34 lis/c=74/49 les/c/f=75/50/0 sis=76) [1] r=0 lpr=76 pi=[49,76)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:30.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:30 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Jan 31 02:24:30 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Jan 31 02:24:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:30.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:31 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Jan 31 02:24:31 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Jan 31 02:24:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:32.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:24:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:32.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:24:33 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Jan 31 02:24:33 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Jan 31 02:24:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:34.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:34 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Jan 31 02:24:34 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Jan 31 02:24:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:24:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:34.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:24:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 31 02:24:35 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 78 pg[6.b( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=46/20 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=11.610070229s) [0] r=-1 lpr=78 pi=[57,78)/1 crt=40'39 mlcod 40'39 active pruub 165.225906372s@ mbc={255={}}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:35 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 78 pg[6.b( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=46/20 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=11.609955788s) [0] r=-1 lpr=78 pi=[57,78)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 165.225906372s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 31 02:24:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 31 02:24:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:24:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:36.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:24:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:36.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 31 02:24:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 31 02:24:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 31 02:24:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 31 02:24:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 31 02:24:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 31 02:24:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:24:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:38.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:24:38 np0005603622 systemd-logind[817]: New session 33 of user zuul.
Jan 31 02:24:38 np0005603622 systemd[1]: Started Session 33 of User zuul.
Jan 31 02:24:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:38.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 31 02:24:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 31 02:24:39 np0005603622 python3.9[86185]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:24:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 31 02:24:39 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 81 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=81) [1] r=0 lpr=81 pi=[64,81)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:39 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 81 pg[9.d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=81) [1] r=0 lpr=81 pi=[64,81)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 31 02:24:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 31 02:24:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:24:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:40.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:24:40 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Jan 31 02:24:40 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Jan 31 02:24:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:24:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:40.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:24:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 31 02:24:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 31 02:24:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 31 02:24:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 82 pg[9.d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=-1 lpr=82 pi=[64,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 82 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=-1 lpr=82 pi=[64,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 82 pg[9.d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=-1 lpr=82 pi=[64,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 82 pg[9.1d( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=82) [1]/[2] r=-1 lpr=82 pi=[64,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:41 np0005603622 python3.9[86399]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:24:41 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.6 deep-scrub starts
Jan 31 02:24:41 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.6 deep-scrub ok
Jan 31 02:24:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 31 02:24:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 31 02:24:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 31 02:24:41 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 83 pg[6.e( empty local-lis/les=0/0 n=0 ec=46/20 lis/c=63/63 les/c/f=64/64/0 sis=83) [1] r=0 lpr=83 pi=[63,83)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:42.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Jan 31 02:24:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Jan 31 02:24:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:42.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 31 02:24:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 31 02:24:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 31 02:24:42 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 84 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84) [1] r=0 lpr=84 pi=[64,84)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:42 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 84 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84) [1] r=0 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:42 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 84 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84) [1] r=0 lpr=84 pi=[64,84)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:42 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 84 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84) [1] r=0 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:42 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 84 pg[6.e( v 40'39 lc 36'19 (0'0,40'39] local-lis/les=83/84 n=1 ec=46/20 lis/c=63/63 les/c/f=64/64/0 sis=83) [1] r=0 lpr=83 pi=[63,83)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 31 02:24:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 31 02:24:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 31 02:24:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 85 pg[9.1f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=85) [1] r=0 lpr=85 pi=[60,85)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 85 pg[9.f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=85) [1] r=0 lpr=85 pi=[60,85)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 85 pg[6.f( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=46/20 lis/c=57/57 les/c/f=58/58/0 sis=85 pruub=11.379614830s) [0] r=-1 lpr=85 pi=[57,85)/1 crt=40'39 mlcod 40'39 active pruub 173.226181030s@ mbc={255={}}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 85 pg[6.f( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=46/20 lis/c=57/57 les/c/f=58/58/0 sis=85 pruub=11.379558563s) [0] r=-1 lpr=85 pi=[57,85)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 173.226181030s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 85 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=84/85 n=5 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84) [1] r=0 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 85 pg[9.d( v 40'1015 (0'0,40'1015] local-lis/les=84/85 n=6 ec=49/34 lis/c=82/64 les/c/f=83/65/0 sis=84) [1] r=0 lpr=84 pi=[64,84)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:44.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:44 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Jan 31 02:24:44 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Jan 31 02:24:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:44.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 31 02:24:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 31 02:24:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 31 02:24:44 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 86 pg[9.1f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=-1 lpr=86 pi=[60,86)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:44 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 86 pg[9.1f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=-1 lpr=86 pi=[60,86)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:44 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 86 pg[9.f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=-1 lpr=86 pi=[60,86)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:44 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 86 pg[9.f( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=60/60 les/c/f=61/61/0 sis=86) [1]/[2] r=-1 lpr=86 pi=[60,86)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 31 02:24:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:46.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:46.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 31 02:24:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 88 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88) [1] r=0 lpr=88 pi=[60,88)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 88 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88) [1] r=0 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 88 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88) [1] r=0 lpr=88 pi=[60,88)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 88 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88) [1] r=0 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 31 02:24:48 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 89 pg[9.f( v 40'1015 (0'0,40'1015] local-lis/les=88/89 n=6 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88) [1] r=0 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:48 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 89 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=88/89 n=5 ec=49/34 lis/c=86/60 les/c/f=87/61/0 sis=88) [1] r=0 lpr=88 pi=[60,88)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:24:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:48.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:24:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:48.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:49 np0005603622 systemd[1]: session-33.scope: Deactivated successfully.
Jan 31 02:24:49 np0005603622 systemd[1]: session-33.scope: Consumed 7.610s CPU time.
Jan 31 02:24:49 np0005603622 systemd-logind[817]: Session 33 logged out. Waiting for processes to exit.
Jan 31 02:24:49 np0005603622 systemd-logind[817]: Removed session 33.
Jan 31 02:24:49 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.a scrub starts
Jan 31 02:24:49 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.a scrub ok
Jan 31 02:24:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 31 02:24:50 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 90 pg[9.10( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=90) [1] r=0 lpr=90 pi=[49,90)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:50.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 31 02:24:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:50.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 31 02:24:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 91 pg[9.10( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=91) [1]/[0] r=-1 lpr=91 pi=[49,91)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:51 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 91 pg[9.10( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=91) [1]/[0] r=-1 lpr=91 pi=[49,91)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 31 02:24:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:52.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 31 02:24:52 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 92 pg[9.11( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=92) [1] r=0 lpr=92 pi=[49,92)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 31 02:24:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 31 02:24:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:52.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 31 02:24:53 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 93 pg[9.10( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=91/49 les/c/f=92/50/0 sis=93) [1] r=0 lpr=93 pi=[49,93)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:53 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 93 pg[9.10( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=91/49 les/c/f=92/50/0 sis=93) [1] r=0 lpr=93 pi=[49,93)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:53 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 93 pg[9.11( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=93) [1]/[0] r=-1 lpr=93 pi=[49,93)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:53 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 93 pg[9.11( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=93) [1]/[0] r=-1 lpr=93 pi=[49,93)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:24:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:54.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 31 02:24:54 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 94 pg[9.10( v 40'1015 (0'0,40'1015] local-lis/les=93/94 n=6 ec=49/34 lis/c=91/49 les/c/f=92/50/0 sis=93) [1] r=0 lpr=93 pi=[49,93)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:54.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 31 02:24:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 95 pg[9.11( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=93/49 les/c/f=94/50/0 sis=95) [1] r=0 lpr=95 pi=[49,95)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:24:55 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 95 pg[9.11( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=6 ec=49/34 lis/c=93/49 les/c/f=94/50/0 sis=95) [1] r=0 lpr=95 pi=[49,95)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:24:55 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.b scrub starts
Jan 31 02:24:55 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.b scrub ok
Jan 31 02:24:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:24:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:56.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:24:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 31 02:24:56 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 96 pg[9.11( v 40'1015 (0'0,40'1015] local-lis/les=95/96 n=6 ec=49/34 lis/c=93/49 les/c/f=94/50/0 sis=95) [1] r=0 lpr=95 pi=[49,95)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:24:56 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.c scrub starts
Jan 31 02:24:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:56.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:56 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.c scrub ok
Jan 31 02:24:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:24:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:24:58.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:58 np0005603622 podman[86628]: 2026-01-31 07:24:58.456464279 +0000 UTC m=+0.053250481 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:24:58 np0005603622 podman[86628]: 2026-01-31 07:24:58.536905259 +0000 UTC m=+0.133691441 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef)
Jan 31 02:24:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:24:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:24:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:24:58.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:24:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 31 02:24:59 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 97 pg[9.12( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=97) [1] r=0 lpr=97 pi=[49,97)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:25:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:00.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:25:00 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.d deep-scrub starts
Jan 31 02:25:00 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.d deep-scrub ok
Jan 31 02:25:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:00.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 31 02:25:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:25:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:25:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:25:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 31 02:25:01 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 98 pg[9.12( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=98) [1]/[0] r=-1 lpr=98 pi=[49,98)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:01 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 98 pg[9.12( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=49/49 les/c/f=50/50/0 sis=98) [1]/[0] r=-1 lpr=98 pi=[49,98)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.e scrub starts
Jan 31 02:25:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.e scrub ok
Jan 31 02:25:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 31 02:25:02 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 31 02:25:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:02.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Jan 31 02:25:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Jan 31 02:25:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:02.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 31 02:25:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 31 02:25:03 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 100 pg[9.12( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=98/49 les/c/f=99/50/0 sis=100) [1] r=0 lpr=100 pi=[49,100)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:03 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 100 pg[9.12( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=98/49 les/c/f=99/50/0 sis=100) [1] r=0 lpr=100 pi=[49,100)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:03 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Jan 31 02:25:03 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Jan 31 02:25:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 31 02:25:04 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 101 pg[9.12( v 40'1015 (0'0,40'1015] local-lis/les=100/101 n=5 ec=49/34 lis/c=98/49 les/c/f=99/50/0 sis=100) [1] r=0 lpr=100 pi=[49,100)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:04.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:04 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1a deep-scrub starts
Jan 31 02:25:04 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1a deep-scrub ok
Jan 31 02:25:04 np0005603622 systemd-logind[817]: New session 34 of user zuul.
Jan 31 02:25:04 np0005603622 systemd[1]: Started Session 34 of User zuul.
Jan 31 02:25:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:04.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:05 np0005603622 python3.9[87035]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 31 02:25:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:06.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:06 np0005603622 python3.9[87209]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:25:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:06.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:07 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Jan 31 02:25:07 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Jan 31 02:25:07 np0005603622 python3.9[87415]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:25:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:25:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:25:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:08.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:08.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:08 np0005603622 python3.9[87568]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:25:09 np0005603622 python3.9[87722]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:25:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 31 02:25:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 31 02:25:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:10.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:10 np0005603622 python3.9[87874]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:25:10 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1d deep-scrub starts
Jan 31 02:25:10 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1d deep-scrub ok
Jan 31 02:25:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:10.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 31 02:25:11 np0005603622 python3.9[88024]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:25:11 np0005603622 network[88041]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:25:11 np0005603622 network[88042]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:25:11 np0005603622 network[88043]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:25:11 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Jan 31 02:25:11 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Jan 31 02:25:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 31 02:25:11 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 103 pg[9.15( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=103) [1] r=0 lpr=103 pi=[64,103)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 31 02:25:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:12.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:12 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.12 scrub starts
Jan 31 02:25:12 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.12 scrub ok
Jan 31 02:25:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:12.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 31 02:25:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 104 pg[9.15( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=104) [1]/[2] r=-1 lpr=104 pi=[64,104)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:12 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 104 pg[9.15( empty local-lis/les=0/0 n=0 ec=49/34 lis/c=64/64 les/c/f=65/65/0 sis=104) [1]/[2] r=-1 lpr=104 pi=[64,104)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 31 02:25:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 31 02:25:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 31 02:25:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:14.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 105 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=105 pruub=13.730457306s) [2] r=-1 lpr=105 pi=[66,105)/1 crt=40'1015 mlcod 0'0 active pruub 205.874740601s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 105 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=105 pruub=13.729952812s) [2] r=-1 lpr=105 pi=[66,105)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 205.874740601s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:25:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:14.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:25:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 31 02:25:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 31 02:25:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 106 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=106) [2]/[1] r=0 lpr=106 pi=[66,106)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 106 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=104/64 les/c/f=105/65/0 sis=106) [1] r=0 lpr=106 pi=[64,106)/1 luod=0'0 crt=40'1015 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 106 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=0/0 n=5 ec=49/34 lis/c=104/64 les/c/f=105/65/0 sis=106) [1] r=0 lpr=106 pi=[64,106)/1 crt=40'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:14 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 106 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=106) [2]/[1] r=0 lpr=106 pi=[66,106)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:15 np0005603622 python3.9[88303]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:25:15 np0005603622 python3.9[88453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:25:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 31 02:25:15 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 107 pg[9.15( v 40'1015 (0'0,40'1015] local-lis/les=106/107 n=5 ec=49/34 lis/c=104/64 les/c/f=105/65/0 sis=106) [1] r=0 lpr=106 pi=[64,106)/1 crt=40'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 31 02:25:15 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 107 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=106/107 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=106) [2]/[1] async=[2] r=0 lpr=106 pi=[66,106)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:16.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:16.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 31 02:25:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 31 02:25:16 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 108 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=106/107 n=5 ec=49/34 lis/c=106/66 les/c/f=107/67/0 sis=108 pruub=14.969518661s) [2] async=[2] r=-1 lpr=108 pi=[66,108)/1 crt=40'1015 mlcod 40'1015 active pruub 209.803573608s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:16 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 108 pg[9.16( v 40'1015 (0'0,40'1015] local-lis/les=106/107 n=5 ec=49/34 lis/c=106/66 les/c/f=107/67/0 sis=108 pruub=14.969273567s) [2] r=-1 lpr=108 pi=[66,108)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 209.803573608s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:17 np0005603622 python3.9[88607]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:25:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 31 02:25:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 31 02:25:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:18.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:18 np0005603622 python3.9[88765]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:25:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:18.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 31 02:25:19 np0005603622 python3.9[88849]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:25:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:25:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:20.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:25:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:20.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:22.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:22.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 31 02:25:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 31 02:25:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:24.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:24 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Jan 31 02:25:24 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Jan 31 02:25:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:24.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 31 02:25:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 31 02:25:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 31 02:25:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 31 02:25:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:26.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:26 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 112 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=76/77 n=5 ec=49/34 lis/c=76/76 les/c/f=77/77/0 sis=112 pruub=15.316919327s) [0] r=-1 lpr=112 pi=[76,112)/1 crt=40'1015 mlcod 0'0 active pruub 219.533599854s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:26 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 112 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=76/77 n=5 ec=49/34 lis/c=76/76 les/c/f=77/77/0 sis=112 pruub=15.316833496s) [0] r=-1 lpr=112 pi=[76,112)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 219.533599854s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:26.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 31 02:25:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 31 02:25:27 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 113 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=76/77 n=5 ec=49/34 lis/c=76/76 les/c/f=77/77/0 sis=113) [0]/[1] r=0 lpr=113 pi=[76,113)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:27 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 113 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=76/77 n=5 ec=49/34 lis/c=76/76 les/c/f=77/77/0 sis=113) [0]/[1] r=0 lpr=113 pi=[76,113)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:27 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Jan 31 02:25:27 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Jan 31 02:25:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:28.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 31 02:25:28 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 114 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=113/114 n=5 ec=49/34 lis/c=76/76 les/c/f=77/77/0 sis=113) [0]/[1] async=[0] r=0 lpr=113 pi=[76,113)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 31 02:25:28 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 115 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=113/114 n=5 ec=49/34 lis/c=113/76 les/c/f=114/77/0 sis=115 pruub=15.514381409s) [0] async=[0] r=-1 lpr=115 pi=[76,115)/1 crt=40'1015 mlcod 40'1015 active pruub 222.098220825s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:28 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 115 pg[9.1a( v 40'1015 (0'0,40'1015] local-lis/les=113/114 n=5 ec=49/34 lis/c=113/76 les/c/f=114/77/0 sis=115 pruub=15.514296532s) [0] r=-1 lpr=115 pi=[76,115)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 222.098220825s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:28.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:29 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1d scrub starts
Jan 31 02:25:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 31 02:25:29 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1d scrub ok
Jan 31 02:25:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:30.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:30 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1b deep-scrub starts
Jan 31 02:25:30 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1b deep-scrub ok
Jan 31 02:25:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:30.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:31 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Jan 31 02:25:31 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Jan 31 02:25:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:32.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:32.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:33 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Jan 31 02:25:33 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Jan 31 02:25:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 31 02:25:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 31 02:25:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:25:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:34.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:25:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:34.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 31 02:25:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 31 02:25:35 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Jan 31 02:25:35 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Jan 31 02:25:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 31 02:25:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 31 02:25:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000055s ======
Jan 31 02:25:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:36.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 31 02:25:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:36.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 31 02:25:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 31 02:25:36 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 31 02:25:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:36.973274) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:25:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 31 02:25:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844336973356, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7105, "num_deletes": 255, "total_data_size": 13189883, "memory_usage": 13404448, "flush_reason": "Manual Compaction"}
Jan 31 02:25:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337013156, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7776423, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 249, "largest_seqno": 7110, "table_properties": {"data_size": 7749070, "index_size": 17894, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8389, "raw_key_size": 78282, "raw_average_key_size": 23, "raw_value_size": 7684003, "raw_average_value_size": 2297, "num_data_blocks": 793, "num_entries": 3345, "num_filter_entries": 3345, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 1769844147, "file_creation_time": 1769844336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 39944 microseconds, and 11910 cpu microseconds.
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.013238) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7776423 bytes OK
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.013259) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.015042) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.015056) EVENT_LOG_v1 {"time_micros": 1769844337015052, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.015072) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 13152781, prev total WAL file size 13152822, number of live WAL files 2.
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.016315) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7594KB) 8(1648B)]
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337016481, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7778071, "oldest_snapshot_seqno": -1}
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3094 keys, 7772933 bytes, temperature: kUnknown
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337062737, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7772933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7746245, "index_size": 17874, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7749, "raw_key_size": 74104, "raw_average_key_size": 23, "raw_value_size": 7684283, "raw_average_value_size": 2483, "num_data_blocks": 793, "num_entries": 3094, "num_filter_entries": 3094, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769844337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.062984) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7772933 bytes
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.064829) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.8 rd, 167.7 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(7.4, 0.0 +0.0 blob) out(7.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3350, records dropped: 256 output_compression: NoCompression
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.064867) EVENT_LOG_v1 {"time_micros": 1769844337064853, "job": 4, "event": "compaction_finished", "compaction_time_micros": 46356, "compaction_time_cpu_micros": 18624, "output_level": 6, "num_output_files": 1, "total_output_size": 7772933, "num_input_records": 3350, "num_output_records": 3094, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337065656, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844337065714, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:25:37.016187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:37 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.7 scrub starts
Jan 31 02:25:37 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.7 scrub ok
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 31 02:25:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 31 02:25:38 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 121 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=84/85 n=5 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=121 pruub=9.977580070s) [2] r=-1 lpr=121 pi=[84,121)/1 crt=40'1015 mlcod 0'0 active pruub 225.853500366s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:38 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 121 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=84/85 n=5 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=121 pruub=9.977502823s) [2] r=-1 lpr=121 pi=[84,121)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 225.853500366s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:38.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 31 02:25:38 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 122 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=84/85 n=5 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=122) [2]/[1] r=0 lpr=122 pi=[84,122)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:38 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 122 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=84/85 n=5 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=122) [2]/[1] r=0 lpr=122 pi=[84,122)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:38.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:38 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Jan 31 02:25:38 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Jan 31 02:25:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 31 02:25:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 31 02:25:39 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 123 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=122/123 n=5 ec=49/34 lis/c=84/84 les/c/f=85/85/0 sis=122) [2]/[1] async=[2] r=0 lpr=122 pi=[84,122)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:40.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 31 02:25:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 124 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=122/123 n=5 ec=49/34 lis/c=122/84 les/c/f=123/85/0 sis=124 pruub=14.993470192s) [2] async=[2] r=-1 lpr=124 pi=[84,124)/1 crt=40'1015 mlcod 40'1015 active pruub 233.593673706s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:40 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 124 pg[9.1d( v 40'1015 (0'0,40'1015] local-lis/les=122/123 n=5 ec=49/34 lis/c=122/84 les/c/f=123/85/0 sis=124 pruub=14.993373871s) [2] r=-1 lpr=124 pi=[84,124)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 233.593673706s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:40 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Jan 31 02:25:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:40.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:40 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Jan 31 02:25:41 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.a scrub starts
Jan 31 02:25:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 31 02:25:41 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.a scrub ok
Jan 31 02:25:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:42.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:25:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:42.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:25:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 31 02:25:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 31 02:25:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 126 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=126 pruub=8.015479088s) [0] r=-1 lpr=126 pi=[66,126)/1 crt=40'1015 mlcod 0'0 active pruub 229.867950439s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:43 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 126 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=126 pruub=8.015409470s) [0] r=-1 lpr=126 pi=[66,126)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 229.867950439s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:44.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:25:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:44.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:25:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 31 02:25:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 31 02:25:45 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 127 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=127) [0]/[1] r=0 lpr=127 pi=[66,127)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:45 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 127 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=66/67 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=127) [0]/[1] r=0 lpr=127 pi=[66,127)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 31 02:25:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 31 02:25:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 128 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=88/89 n=5 ec=49/34 lis/c=88/88 les/c/f=89/89/0 sis=128 pruub=13.983708382s) [0] r=-1 lpr=128 pi=[88,128)/1 crt=40'1015 mlcod 0'0 active pruub 237.988052368s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 128 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=88/89 n=5 ec=49/34 lis/c=88/88 les/c/f=89/89/0 sis=128 pruub=13.983617783s) [0] r=-1 lpr=128 pi=[88,128)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 237.988052368s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:46 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 128 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=127/128 n=5 ec=49/34 lis/c=66/66 les/c/f=67/67/0 sis=127) [0]/[1] async=[0] r=0 lpr=127 pi=[66,127)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:46.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:46.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 31 02:25:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 31 02:25:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 129 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=88/89 n=5 ec=49/34 lis/c=88/88 les/c/f=89/89/0 sis=129) [0]/[1] r=0 lpr=129 pi=[88,129)/1 crt=40'1015 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 129 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=127/128 n=5 ec=49/34 lis/c=127/66 les/c/f=128/67/0 sis=129 pruub=15.017527580s) [0] async=[0] r=-1 lpr=129 pi=[66,129)/1 crt=40'1015 mlcod 40'1015 active pruub 240.039886475s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 129 pg[9.1e( v 40'1015 (0'0,40'1015] local-lis/les=127/128 n=5 ec=49/34 lis/c=127/66 les/c/f=128/67/0 sis=129 pruub=15.017376900s) [0] r=-1 lpr=129 pi=[66,129)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 240.039886475s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:47 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 129 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=88/89 n=5 ec=49/34 lis/c=88/88 les/c/f=89/89/0 sis=129) [0]/[1] r=0 lpr=129 pi=[88,129)/1 crt=40'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 31 02:25:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 31 02:25:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 31 02:25:48 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 130 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=129/130 n=5 ec=49/34 lis/c=88/88 les/c/f=89/89/0 sis=129) [0]/[1] async=[0] r=0 lpr=129 pi=[88,129)/1 crt=40'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 31 02:25:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:48.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 31 02:25:48 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 131 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=129/130 n=5 ec=49/34 lis/c=129/88 les/c/f=130/89/0 sis=131 pruub=15.483580589s) [0] async=[0] r=-1 lpr=131 pi=[88,131)/1 crt=40'1015 mlcod 40'1015 active pruub 242.040344238s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 31 02:25:48 np0005603622 ceph-osd[79107]: osd.1 pg_epoch: 131 pg[9.1f( v 40'1015 (0'0,40'1015] local-lis/les=129/130 n=5 ec=49/34 lis/c=129/88 les/c/f=130/89/0 sis=131 pruub=15.483504295s) [0] r=-1 lpr=131 pi=[88,131)/1 crt=40'1015 mlcod 0'0 unknown NOTIFY pruub 242.040344238s@ mbc={}] state<Start>: transitioning to Stray
Jan 31 02:25:48 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Jan 31 02:25:48 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Jan 31 02:25:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:48.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 31 02:25:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:50.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:50 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.c scrub starts
Jan 31 02:25:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:25:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:50.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:25:50 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.c scrub ok
Jan 31 02:25:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:25:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:25:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:52.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:25:52 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1e scrub starts
Jan 31 02:25:52 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1e scrub ok
Jan 31 02:25:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:25:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:52.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:25:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:54.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:54.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:56.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:56.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:56 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.d deep-scrub starts
Jan 31 02:25:56 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.d deep-scrub ok
Jan 31 02:25:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:25:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.e scrub starts
Jan 31 02:25:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.e scrub ok
Jan 31 02:25:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:25:58.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:25:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:25:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:25:58.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:25:58 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1 scrub starts
Jan 31 02:25:58 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.1 scrub ok
Jan 31 02:25:59 np0005603622 python3.9[89147]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:26:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:00.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:00.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:00 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.8 deep-scrub starts
Jan 31 02:26:00 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.8 deep-scrub ok
Jan 31 02:26:01 np0005603622 python3.9[89434]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 31 02:26:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.5 deep-scrub starts
Jan 31 02:26:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.5 deep-scrub ok
Jan 31 02:26:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:02.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:02 np0005603622 python3.9[89588]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 31 02:26:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:02.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:03 np0005603622 python3.9[89740]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:26:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:04.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:04 np0005603622 python3.9[89892]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 31 02:26:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:04.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:06 np0005603622 python3.9[90044]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:06.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:06 np0005603622 python3.9[90196]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:26:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:06.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:07 np0005603622 python3.9[90274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:26:07 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Jan 31 02:26:07 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Jan 31 02:26:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:08.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:08 np0005603622 python3.9[90558]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:08.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:09 np0005603622 python3.9[90712]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 31 02:26:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:26:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:26:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:10.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:10 np0005603622 python3.9[90865]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 31 02:26:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:10.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:11 np0005603622 python3.9[91018]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:26:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:12.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:12 np0005603622 python3.9[91170]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 31 02:26:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:12.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:13 np0005603622 python3.9[91322]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:13 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Jan 31 02:26:13 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Jan 31 02:26:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:14.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:14.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:15 np0005603622 python3.9[91475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:16 np0005603622 python3.9[91677]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:26:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:26:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:16 np0005603622 python3.9[91755]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:16.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:17 np0005603622 python3.9[91907]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:26:17 np0005603622 python3.9[91985]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:26:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:18.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:18 np0005603622 python3.9[92137]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:18.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:19 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Jan 31 02:26:19 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Jan 31 02:26:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:20.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:20 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Jan 31 02:26:20 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Jan 31 02:26:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:20.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:21 np0005603622 python3.9[92288]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:21 np0005603622 python3.9[92440]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 31 02:26:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:22.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:22 np0005603622 python3.9[92590]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:22 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Jan 31 02:26:22 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Jan 31 02:26:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:22.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:24 np0005603622 python3.9[92742]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:26:24 np0005603622 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 31 02:26:24 np0005603622 systemd[1]: tuned.service: Deactivated successfully.
Jan 31 02:26:24 np0005603622 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 31 02:26:24 np0005603622 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 31 02:26:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:24.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:24 np0005603622 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 31 02:26:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:24.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:24 np0005603622 python3.9[92904]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 31 02:26:25 np0005603622 systemd[72676]: Created slice User Background Tasks Slice.
Jan 31 02:26:25 np0005603622 systemd[72676]: Starting Cleanup of User's Temporary Files and Directories...
Jan 31 02:26:25 np0005603622 systemd[72676]: Finished Cleanup of User's Temporary Files and Directories.
Jan 31 02:26:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:26.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:26 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Jan 31 02:26:26 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Jan 31 02:26:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:26.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:27 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Jan 31 02:26:27 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Jan 31 02:26:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:28.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:28 np0005603622 python3.9[93057]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:26:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:28.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:29 np0005603622 python3.9[93211]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:26:29 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.f scrub starts
Jan 31 02:26:29 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.f scrub ok
Jan 31 02:26:30 np0005603622 systemd[1]: session-34.scope: Deactivated successfully.
Jan 31 02:26:30 np0005603622 systemd[1]: session-34.scope: Consumed 1min 586ms CPU time.
Jan 31 02:26:30 np0005603622 systemd-logind[817]: Session 34 logged out. Waiting for processes to exit.
Jan 31 02:26:30 np0005603622 systemd-logind[817]: Removed session 34.
Jan 31 02:26:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:30.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:30.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:31 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Jan 31 02:26:31 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Jan 31 02:26:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:32.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:32 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.4 scrub starts
Jan 31 02:26:32 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.4 scrub ok
Jan 31 02:26:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:32.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:34.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:34.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:35 np0005603622 systemd-logind[817]: New session 35 of user zuul.
Jan 31 02:26:35 np0005603622 systemd[1]: Started Session 35 of User zuul.
Jan 31 02:26:36 np0005603622 python3.9[93392]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:26:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:36.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:36 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Jan 31 02:26:36 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Jan 31 02:26:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:36.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:37 np0005603622 python3.9[93548]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 31 02:26:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:38.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:38 np0005603622 python3.9[93701]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:26:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:38.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:39 np0005603622 python3.9[93785]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:26:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:40.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:40.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:41 np0005603622 python3.9[93938]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:42.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.c scrub starts
Jan 31 02:26:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.c scrub ok
Jan 31 02:26:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:42.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:43 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.a scrub starts
Jan 31 02:26:43 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.a scrub ok
Jan 31 02:26:43 np0005603622 python3.9[94091]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:26:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:44.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:44 np0005603622 python3.9[94244]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:26:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:44.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:45 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.d scrub starts
Jan 31 02:26:45 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.d scrub ok
Jan 31 02:26:45 np0005603622 python3.9[94396]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 31 02:26:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:46.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:46.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:46 np0005603622 python3.9[94546]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:26:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:47 np0005603622 python3.9[94704]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:48.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:48.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:49 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.7 deep-scrub starts
Jan 31 02:26:49 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.7 deep-scrub ok
Jan 31 02:26:50 np0005603622 python3.9[94857]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:26:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:50.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:50 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Jan 31 02:26:50 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Jan 31 02:26:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:50.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:51 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.f scrub starts
Jan 31 02:26:51 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.f scrub ok
Jan 31 02:26:51 np0005603622 python3.9[95144]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 31 02:26:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:52.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:52 np0005603622 python3.9[95294]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:52.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:53 np0005603622 python3.9[95448]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:26:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:54.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:26:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:26:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:54.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:26:55 np0005603622 python3.9[95601]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:26:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:56.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:26:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:56.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:26:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:26:57 np0005603622 python3.9[95754]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:26:58 np0005603622 python3.9[95908]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 31 02:26:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:26:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:26:58.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:26:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:26:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:26:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:26:58.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:26:59 np0005603622 systemd-logind[817]: Session 35 logged out. Waiting for processes to exit.
Jan 31 02:26:59 np0005603622 systemd[1]: session-35.scope: Deactivated successfully.
Jan 31 02:26:59 np0005603622 systemd[1]: session-35.scope: Consumed 16.465s CPU time.
Jan 31 02:26:59 np0005603622 systemd-logind[817]: Removed session 35.
Jan 31 02:26:59 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Jan 31 02:26:59 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Jan 31 02:27:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:00.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:00.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Jan 31 02:27:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Jan 31 02:27:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:02.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Jan 31 02:27:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Jan 31 02:27:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:02.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:03 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Jan 31 02:27:03 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Jan 31 02:27:04 np0005603622 systemd-logind[817]: New session 36 of user zuul.
Jan 31 02:27:04 np0005603622 systemd[1]: Started Session 36 of User zuul.
Jan 31 02:27:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:04.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:04.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:05 np0005603622 python3.9[96086]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:06 np0005603622 python3.9[96240]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:06.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:06.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:07 np0005603622 python3.9[96433]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:27:08 np0005603622 systemd[1]: session-36.scope: Deactivated successfully.
Jan 31 02:27:08 np0005603622 systemd[1]: session-36.scope: Consumed 1.924s CPU time.
Jan 31 02:27:08 np0005603622 systemd-logind[817]: Session 36 logged out. Waiting for processes to exit.
Jan 31 02:27:08 np0005603622 systemd-logind[817]: Removed session 36.
Jan 31 02:27:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:08.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:08.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:10.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:10.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:12.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:12.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:13 np0005603622 systemd-logind[817]: New session 37 of user zuul.
Jan 31 02:27:13 np0005603622 systemd[1]: Started Session 37 of User zuul.
Jan 31 02:27:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:14.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:14 np0005603622 python3.9[96613]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:14.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:15 np0005603622 python3.9[96767]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:16 np0005603622 python3.9[97035]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:16.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:16 np0005603622 python3.9[97138]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:27:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:16.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:18.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:27:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:27:18 np0005603622 python3.9[97291]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:18.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:19 np0005603622 python3.9[97486]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:20.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:20 np0005603622 python3.9[97638]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:27:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:20.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:21 np0005603622 python3.9[97803]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:21 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Jan 31 02:27:21 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Jan 31 02:27:22 np0005603622 python3.9[97881]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:22.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:22 np0005603622 python3.9[98033]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:22 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Jan 31 02:27:22 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Jan 31 02:27:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:22.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:23 np0005603622 python3.9[98111]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:23 np0005603622 python3.9[98263]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:24.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:24 np0005603622 python3.9[98415]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:24.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:25 np0005603622 python3.9[98567]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:25 np0005603622 python3.9[98719]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:27:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:26.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:26 np0005603622 python3.9[98921]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:27:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:27:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:26.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:27 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.12 scrub starts
Jan 31 02:27:27 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.12 scrub ok
Jan 31 02:27:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:28.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:28 np0005603622 python3.9[99074]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:27:28 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.f scrub starts
Jan 31 02:27:28 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.f scrub ok
Jan 31 02:27:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:29.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:29 np0005603622 python3.9[99228]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:27:30 np0005603622 python3.9[99380]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:27:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:30.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:31.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:31 np0005603622 python3.9[99532]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:27:31 np0005603622 python3.9[99685]: ansible-service_facts Invoked
Jan 31 02:27:32 np0005603622 network[99702]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:27:32 np0005603622 network[99703]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:27:32 np0005603622 network[99704]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:27:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:32.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:33.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:33 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Jan 31 02:27:33 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Jan 31 02:27:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:34.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:34 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Jan 31 02:27:34 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Jan 31 02:27:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:35.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:35 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Jan 31 02:27:35 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Jan 31 02:27:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:36.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:36 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Jan 31 02:27:36 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Jan 31 02:27:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:37.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:37 np0005603622 python3.9[100156]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:27:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:38.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:38 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.14 scrub starts
Jan 31 02:27:38 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 11.14 scrub ok
Jan 31 02:27:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:39.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:39 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.14 deep-scrub starts
Jan 31 02:27:39 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 8.14 deep-scrub ok
Jan 31 02:27:39 np0005603622 python3.9[100309]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 31 02:27:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:40.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:40 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Jan 31 02:27:40 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Jan 31 02:27:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:41.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:41 np0005603622 python3.9[100461]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:41 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.a scrub starts
Jan 31 02:27:41 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.a scrub ok
Jan 31 02:27:41 np0005603622 python3.9[100539]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:42.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:42 np0005603622 python3.9[100691]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Jan 31 02:27:42 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Jan 31 02:27:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:43.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:43 np0005603622 python3.9[100769]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:44.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:44 np0005603622 python3.9[100921]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:44 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Jan 31 02:27:44 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Jan 31 02:27:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:45.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:46.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:46 np0005603622 python3.9[101073]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:27:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:47.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:47 np0005603622 python3.9[101157]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:27:48 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Jan 31 02:27:48 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Jan 31 02:27:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:48.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:49.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:49 np0005603622 systemd[1]: session-37.scope: Deactivated successfully.
Jan 31 02:27:49 np0005603622 systemd[1]: session-37.scope: Consumed 20.326s CPU time.
Jan 31 02:27:49 np0005603622 systemd-logind[817]: Session 37 logged out. Waiting for processes to exit.
Jan 31 02:27:49 np0005603622 systemd-logind[817]: Removed session 37.
Jan 31 02:27:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:50.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:51.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:51 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.5 deep-scrub starts
Jan 31 02:27:51 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.5 deep-scrub ok
Jan 31 02:27:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:27:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:52.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:27:52 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.d scrub starts
Jan 31 02:27:52 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.d scrub ok
Jan 31 02:27:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:53.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:53 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.6 deep-scrub starts
Jan 31 02:27:54 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.6 deep-scrub ok
Jan 31 02:27:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:54.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:54 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.e scrub starts
Jan 31 02:27:54 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.e scrub ok
Jan 31 02:27:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:55.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:55 np0005603622 systemd-logind[817]: New session 38 of user zuul.
Jan 31 02:27:55 np0005603622 systemd[1]: Started Session 38 of User zuul.
Jan 31 02:27:56 np0005603622 python3.9[101340]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:56.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:57.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:27:57 np0005603622 python3.9[101492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:27:57 np0005603622 python3.9[101570]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:27:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Jan 31 02:27:57 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Jan 31 02:27:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:27:58.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:58 np0005603622 systemd[1]: session-38.scope: Deactivated successfully.
Jan 31 02:27:58 np0005603622 systemd[1]: session-38.scope: Consumed 1.240s CPU time.
Jan 31 02:27:58 np0005603622 systemd-logind[817]: Session 38 logged out. Waiting for processes to exit.
Jan 31 02:27:58 np0005603622 systemd-logind[817]: Removed session 38.
Jan 31 02:27:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:27:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:27:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:27:59.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:27:59 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.a deep-scrub starts
Jan 31 02:27:59 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.a deep-scrub ok
Jan 31 02:28:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:00.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:01.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.e scrub starts
Jan 31 02:28:01 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 6.e scrub ok
Jan 31 02:28:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:02.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 31 02:28:02 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 31 02:28:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:03.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:04.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:04 np0005603622 systemd-logind[817]: New session 39 of user zuul.
Jan 31 02:28:04 np0005603622 systemd[1]: Started Session 39 of User zuul.
Jan 31 02:28:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:28:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:05.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:28:05 np0005603622 python3.9[101748]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:28:05 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.f scrub starts
Jan 31 02:28:05 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.f scrub ok
Jan 31 02:28:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:06.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:06 np0005603622 python3.9[101904]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:07.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:07 np0005603622 python3.9[102079]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:08 np0005603622 python3.9[102157]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.6727l33r recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:08.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:08 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Jan 31 02:28:08 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Jan 31 02:28:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:09.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:09 np0005603622 python3.9[102309]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:09 np0005603622 python3.9[102387]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.6fltirgq recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:10 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Jan 31 02:28:10 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Jan 31 02:28:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:10.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:10 np0005603622 python3.9[102539]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:28:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:11.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:11 np0005603622 python3.9[102691]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:11 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.12 scrub starts
Jan 31 02:28:11 np0005603622 python3.9[102769]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:28:11 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.12 scrub ok
Jan 31 02:28:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:12.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:12 np0005603622 python3.9[102921]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:12 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Jan 31 02:28:12 np0005603622 ceph-osd[79107]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Jan 31 02:28:13 np0005603622 python3.9[102999]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:28:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:13.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:13 np0005603622 python3.9[103151]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:14 np0005603622 python3.9[103303]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:14.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:14 np0005603622 python3.9[103381]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:28:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:15.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:28:15 np0005603622 python3.9[103533]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:16 np0005603622 python3.9[103611]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:16.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:17.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:17 np0005603622 python3.9[103763]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:28:17 np0005603622 systemd[1]: Reloading.
Jan 31 02:28:17 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:28:17 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:28:18 np0005603622 python3.9[103953]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:18.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:19 np0005603622 python3.9[104031]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:19.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:19 np0005603622 python3.9[104183]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:20 np0005603622 python3.9[104261]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:20.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:21.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:21 np0005603622 python3.9[104413]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:28:21 np0005603622 systemd[1]: Reloading.
Jan 31 02:28:21 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:28:21 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:28:21 np0005603622 systemd[1]: Starting Create netns directory...
Jan 31 02:28:21 np0005603622 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:28:21 np0005603622 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:28:21 np0005603622 systemd[1]: Finished Create netns directory.
Jan 31 02:28:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:22 np0005603622 python3.9[104605]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:28:22 np0005603622 network[104622]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:28:22 np0005603622 network[104623]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:28:22 np0005603622 network[104624]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:28:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:22.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:23.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:24.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.648911) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504648976, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2641, "num_deletes": 251, "total_data_size": 5084000, "memory_usage": 5158080, "flush_reason": "Manual Compaction"}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504666216, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3315327, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7116, "largest_seqno": 9751, "table_properties": {"data_size": 3305440, "index_size": 5677, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 27086, "raw_average_key_size": 21, "raw_value_size": 3282604, "raw_average_value_size": 2628, "num_data_blocks": 252, "num_entries": 1249, "num_filter_entries": 1249, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844336, "oldest_key_time": 1769844336, "file_creation_time": 1769844504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 17409 microseconds, and 7039 cpu microseconds.
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.666326) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3315327 bytes OK
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.666344) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.667363) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.667375) EVENT_LOG_v1 {"time_micros": 1769844504667372, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.667394) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5071634, prev total WAL file size 5071634, number of live WAL files 2.
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.668311) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3237KB)], [15(7590KB)]
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504668399, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11088260, "oldest_snapshot_seqno": -1}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3822 keys, 9565598 bytes, temperature: kUnknown
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504731764, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9565598, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9534391, "index_size": 20537, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9605, "raw_key_size": 92014, "raw_average_key_size": 24, "raw_value_size": 9459866, "raw_average_value_size": 2475, "num_data_blocks": 896, "num_entries": 3822, "num_filter_entries": 3822, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769844504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.732433) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9565598 bytes
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.738802) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.4 rd, 151.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 7.4 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(6.2) write-amplify(2.9) OK, records in: 4343, records dropped: 521 output_compression: NoCompression
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.738838) EVENT_LOG_v1 {"time_micros": 1769844504738821, "job": 6, "event": "compaction_finished", "compaction_time_micros": 63221, "compaction_time_cpu_micros": 31828, "output_level": 6, "num_output_files": 1, "total_output_size": 9565598, "num_input_records": 4343, "num_output_records": 3822, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504739445, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844504740567, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.668164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.740605) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.740611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.740614) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.740617) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:28:24.740620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:28:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:25.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:26 np0005603622 python3.9[104886]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:26 np0005603622 python3.9[104982]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:26.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:27.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:27 np0005603622 python3.9[105247]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:28 np0005603622 python3.9[105399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:28.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:28 np0005603622 python3.9[105477]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:28:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:28:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:29.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:29 np0005603622 python3.9[105629]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 31 02:28:29 np0005603622 systemd[1]: Starting Time & Date Service...
Jan 31 02:28:29 np0005603622 systemd[1]: Started Time & Date Service.
Jan 31 02:28:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:30.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:30 np0005603622 python3.9[105785]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:31.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:31 np0005603622 python3.9[105937]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:31 np0005603622 python3.9[106015]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:32.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:32 np0005603622 python3.9[106167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:33 np0005603622 python3.9[106245]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.l6jqx4t3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:33.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:33 np0005603622 python3.9[106397]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:34 np0005603622 python3.9[106475]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:34.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:35.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:35 np0005603622 python3.9[106627]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:28:36 np0005603622 python3[106830]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:28:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:28:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:36.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:37.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:37 np0005603622 python3.9[106982]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:37 np0005603622 python3.9[107060]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:38 np0005603622 python3.9[107212]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:38.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:39.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:39 np0005603622 python3.9[107337]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844517.8649013-900-52551705382069/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:39 np0005603622 python3.9[107489]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:40 np0005603622 python3.9[107567]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:40.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:41.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:41 np0005603622 python3.9[107719]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:41 np0005603622 python3.9[107797]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:42.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:42 np0005603622 python3.9[107949]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:43.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:43 np0005603622 python3.9[108027]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:44 np0005603622 python3.9[108179]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:28:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:44.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:44 np0005603622 python3.9[108334]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:45.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:45 np0005603622 python3.9[108486]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:46 np0005603622 python3.9[108638]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:46.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:47.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:47 np0005603622 python3.9[108790]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 02:28:48 np0005603622 python3.9[108942]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 31 02:28:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:48.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:48 np0005603622 systemd[1]: session-39.scope: Deactivated successfully.
Jan 31 02:28:48 np0005603622 systemd[1]: session-39.scope: Consumed 25.568s CPU time.
Jan 31 02:28:48 np0005603622 systemd-logind[817]: Session 39 logged out. Waiting for processes to exit.
Jan 31 02:28:48 np0005603622 systemd-logind[817]: Removed session 39.
Jan 31 02:28:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:49.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:50.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:28:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:51.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:28:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:52.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:53.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:54 np0005603622 systemd-logind[817]: New session 40 of user zuul.
Jan 31 02:28:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:54.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:54 np0005603622 systemd[1]: Started Session 40 of User zuul.
Jan 31 02:28:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:55.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:55 np0005603622 python3.9[109122]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 31 02:28:56 np0005603622 python3.9[109274]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:28:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:56.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:56 np0005603622 python3.9[109428]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 31 02:28:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:28:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:57.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:28:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:28:57 np0005603622 python3.9[109580]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.sh9sp2xw follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:28:58 np0005603622 python3.9[109705]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.sh9sp2xw mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844537.1983895-108-20356577297524/.source.sh9sp2xw _original_basename=.v338w4yp follow=False checksum=894df0945bb562bf664b2d53d15fbd1da03ff944 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:28:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:28:58.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:28:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:28:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:28:59.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:28:59 np0005603622 python3.9[109857]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:28:59 np0005603622 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 02:29:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:00.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:00 np0005603622 python3.9[110011]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCXc4C5rCDfOfKEuMVHI9SatZ+NRO9lp335K0yZ19CDCOGSUNO2lblpRlgxO3tw3S+UGGiC/7/HHeZBA2Zd+SUVMb7ytbl5c3+XuZIIQF6DyIIDSELf0FoE0NhuSjKFilPsxyxxGYgH+gVaTZkuGhDoljaywQBSPGZdDwejVKWPVuui5xe0X4T0WVfT5avLSpIL3WjJ9hmzEaR0dUqrbKvPUAXJPDqQOZbQZbpXDIi48NPUDFwByej1xHWHRQaPJ/M6AsyrZKP/hiF2xt0mCIk1FANldusq4OUs9r/0KTVrPRCpSrsSimKBtEMJVdxqxAasE7H07sSdwFcWNC21LtsH8+/LM0oofIZ3D0Lom0NoLaC+Ocy2vqbIhOPYJ6c7Q8J/p4NFiA/lD+bgyjOOnm3Ls4VaaHXUyknu259henkVzJ+iZuRNY8ki345nrzPLoLYyxVwRkSuONyYlRp36jjp0QIL9kXLFlJ2OTHvb9FUhlG7RnxzPeHZhsihSHJv1rgU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAT2MDVMbPz3xtbIO31qZj2gzOQiz4a8pTNWAmd0+CUW#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFU8ym/rLGJxMpEsk09j3JHOh1hW4Vrm23tIOjn4/YJIrK1UFRFiQLDm+yZuj1NhWfbg71SK8ZuZ2miEJ20BHno=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAgshePGnD7oc3Zg8kfD9lUGSfPfE1OzPUGBHE12jLoyHnXwKTxYFYSMTWRcYgdFu4HaP0ShO1gEQF+1nDXxrozH/m2qxK/YPC5cVYCPvscwRdlyUNPOV0rpiruVZptTQ1iibsmRwMbxliXD2t13CtsrNjy9iuLgtvvnkfUh0wZKcZ8Jglg6E4vRTBPgXo3fJCfPF9Iz7GE50DpWAU8OnoLNlOf54/tcd8CyOrmLF9RwHTgNtN9FXscdQ3/A8avCF0WPWNUmfLFc20yOtfrq/xxjJMLn4KOZu1D1yjK5BSJu2pv/j0NPrTFKgPKYWjiXPdttcyubkXNZP96jkK9dgTgsEGRKuM83QpDIu7823wv4/GtEi+IsJeyqCN+3VAJo9hDB9eES8qlX4jAg6Kxen1oNkL9M+tz7N0BSdnxbS3skWEw6MsHlsBLOw7KMYe8gq8JoqHLBKBFQZZbjwaK5kNTeu6l5zAYERpt8uAEZkplq2vV5+4EOh7RPncmKuH0Xs=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF//s6MNfOt3MK/jBcrJ5VkyeSY5eg1jUHN32BLTGZtT#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEtIHGkRVmmqcsRoXLuIEWyuaX3BoKld3DircbfvRpdFLzOwbxRaZ6uUN5f7sBun3oAcQLdnixnG3R/YK8L7HpM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCSEo2WrFN8DnR2/d+p3YtsWos96nHz1MZInXN3md5cJXE0icMDwEWJuGIDUd5e0SA6Q7i33i/WIEmt/wGMoNhoTI+f3plB2NyAn5vyVQGTZv7m+tOLQI3/k50Kxnpu0c5gO509yln6RcLe4MutF0imS/fINCM+Nznh7oKbn6hELTDlxDz0JH8dNsZGmtVmgnhwIrglpxAg/WpeOWkCmuuXmysx1JcAhIK5016MzaM9cOtHAGzj5s0GE7nQoH4yG0Ak3zMU/DPKr91Xq/m9PCnGKautoHmHgrEG6u+1WubtakbBxlfmroKbvrIFL6KKQzY0SiTrBsH3nZRaFGCqE0ZEyHvJz8AO3quWg2oaXRJWN98f7k3l5dtVJIuwyJxVnv6fUGuLbGxOp4T6UDPqC7b2Eg17EtpUjy77F/+8yrX6NH+hXwcWBwHelRCDSiceGQTm1uexb8Xo1R1Wt9h24H2yRKPFrqzf1R9J2vipDouDo7RLefAiCXEJDdlewdKUM5c=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILwGOCpzCDE8uIHb4RBldbKfEvxhUdsBT4K7sPU4vZLU#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLS8teLqq0Lmt8g22OKhtEhLCXd5cBLM6W2oDJcWxQl8DloBMMFjgDlHt0rzjMKEL0SpxkPbH7sPV1zbWKKJI9M=#012 create=True mode=0644 path=/tmp/ansible.sh9sp2xw state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:01.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:01 np0005603622 python3.9[110163]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.sh9sp2xw' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:29:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:02.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:02 np0005603622 python3.9[110317]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.sh9sp2xw state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:02 np0005603622 systemd[1]: session-40.scope: Deactivated successfully.
Jan 31 02:29:02 np0005603622 systemd[1]: session-40.scope: Consumed 4.490s CPU time.
Jan 31 02:29:02 np0005603622 systemd-logind[817]: Session 40 logged out. Waiting for processes to exit.
Jan 31 02:29:02 np0005603622 systemd-logind[817]: Removed session 40.
Jan 31 02:29:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:03.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:04.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:05.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:06.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:07.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:08.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:09.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:09 np0005603622 systemd-logind[817]: New session 41 of user zuul.
Jan 31 02:29:09 np0005603622 systemd[1]: Started Session 41 of User zuul.
Jan 31 02:29:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:10.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:10 np0005603622 python3.9[110495]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:29:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:11.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:11 np0005603622 python3.9[110651]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 02:29:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:12.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:12 np0005603622 python3.9[110805]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:29:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:13.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:13 np0005603622 python3.9[110958]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:29:14 np0005603622 python3.9[111111]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:29:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:29:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:14.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:29:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:15.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:15 np0005603622 python3.9[111263]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:15 np0005603622 systemd[1]: session-41.scope: Deactivated successfully.
Jan 31 02:29:15 np0005603622 systemd[1]: session-41.scope: Consumed 3.603s CPU time.
Jan 31 02:29:15 np0005603622 systemd-logind[817]: Session 41 logged out. Waiting for processes to exit.
Jan 31 02:29:15 np0005603622 systemd-logind[817]: Removed session 41.
Jan 31 02:29:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:16.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:17.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:18.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:19.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:20.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:21.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:21 np0005603622 systemd-logind[817]: New session 42 of user zuul.
Jan 31 02:29:21 np0005603622 systemd[1]: Started Session 42 of User zuul.
Jan 31 02:29:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:22 np0005603622 python3.9[111441]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:29:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:22.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:23.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:23 np0005603622 python3.9[111597]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:29:24 np0005603622 python3.9[111681]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 31 02:29:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:24.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:25.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:26 np0005603622 python3.9[111832]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:29:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:26.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:27.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:27 np0005603622 python3.9[111983]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:29:28 np0005603622 python3.9[112133]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:29:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:28.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:29 np0005603622 python3.9[112283]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:29:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:29:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:29.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:29:29 np0005603622 systemd[1]: session-42.scope: Deactivated successfully.
Jan 31 02:29:29 np0005603622 systemd[1]: session-42.scope: Consumed 5.055s CPU time.
Jan 31 02:29:29 np0005603622 systemd-logind[817]: Session 42 logged out. Waiting for processes to exit.
Jan 31 02:29:29 np0005603622 systemd-logind[817]: Removed session 42.
Jan 31 02:29:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:30.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:31.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:32.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:33.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.241387) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574241437, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 880, "num_deletes": 250, "total_data_size": 1859115, "memory_usage": 1879384, "flush_reason": "Manual Compaction"}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574246080, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 785535, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9756, "largest_seqno": 10631, "table_properties": {"data_size": 782043, "index_size": 1272, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8741, "raw_average_key_size": 19, "raw_value_size": 774753, "raw_average_value_size": 1768, "num_data_blocks": 56, "num_entries": 438, "num_filter_entries": 438, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844505, "oldest_key_time": 1769844505, "file_creation_time": 1769844574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 4744 microseconds, and 2394 cpu microseconds.
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.246132) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 785535 bytes OK
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.246145) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.248009) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.248029) EVENT_LOG_v1 {"time_micros": 1769844574248022, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.248048) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1854613, prev total WAL file size 1854613, number of live WAL files 2.
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.248545) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(767KB)], [18(9341KB)]
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574248580, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10351133, "oldest_snapshot_seqno": -1}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3774 keys, 7751061 bytes, temperature: kUnknown
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574296193, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7751061, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7723013, "index_size": 17506, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 91486, "raw_average_key_size": 24, "raw_value_size": 7652034, "raw_average_value_size": 2027, "num_data_blocks": 764, "num_entries": 3774, "num_filter_entries": 3774, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769844574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.296414) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7751061 bytes
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.297659) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.1 rd, 162.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.1 +0.0 blob) out(7.4 +0.0 blob), read-write-amplify(23.0) write-amplify(9.9) OK, records in: 4260, records dropped: 486 output_compression: NoCompression
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.297674) EVENT_LOG_v1 {"time_micros": 1769844574297667, "job": 8, "event": "compaction_finished", "compaction_time_micros": 47677, "compaction_time_cpu_micros": 16041, "output_level": 6, "num_output_files": 1, "total_output_size": 7751061, "num_input_records": 4260, "num_output_records": 3774, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574297824, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844574298489, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.248485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.298509) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.298513) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.298515) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.298516) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:29:34.298517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:29:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:34.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:35.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:35 np0005603622 systemd-logind[817]: New session 43 of user zuul.
Jan 31 02:29:35 np0005603622 systemd[1]: Started Session 43 of User zuul.
Jan 31 02:29:36 np0005603622 python3.9[112535]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:29:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:36.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:29:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:29:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:29:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:37.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:37 np0005603622 python3.9[112748]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:38.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:38 np0005603622 python3.9[112900]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:29:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:39.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:29:39 np0005603622 python3.9[113052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:39 np0005603622 python3.9[113175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844578.754714-154-153636494492849/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=634868e2677906a62c86c572047e7d74797995d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:40 np0005603622 python3.9[113327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:40.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:40 np0005603622 python3.9[113450]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844580.103284-154-166873696538410/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=a66cd34ae464c50bbe4c963e6eef9b60dc2a1e49 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:41.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:41 np0005603622 python3.9[113602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:41 np0005603622 python3.9[113725]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844581.0813575-154-5396517342312/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9128fd826cf20c1e89b3f5c9ecd9a7cceed1d63e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:42.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:42 np0005603622 python3.9[113877]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:29:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:29:43 np0005603622 python3.9[114079]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:29:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:43.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:29:43 np0005603622 python3.9[114231]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:44 np0005603622 python3.9[114354]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844583.3636034-331-93729998393183/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=6c695393e512d427859d504001405000daa1b0ef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:44.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:44 np0005603622 python3.9[114506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:45.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:45 np0005603622 python3.9[114629]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844584.4846528-331-214786403121869/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=649eeea41a1e15889a1c750fd61fb88aa589bc91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:45 np0005603622 python3.9[114781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:46 np0005603622 python3.9[114904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844585.52234-331-21372587304451/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=2c3c5a2cf264f4c5d63251f82e0715b6c844e248 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:46.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:46 np0005603622 python3.9[115056]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:47.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:47 np0005603622 python3.9[115208]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:48 np0005603622 python3.9[115360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:48 np0005603622 python3.9[115483]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844587.6993008-507-237339474634293/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=a9fbeab64982af1e80736b94cfc0ac923dd7c6e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:48.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:49 np0005603622 python3.9[115635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:49.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:49 np0005603622 python3.9[115758]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844588.7086298-507-265971618961040/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=649eeea41a1e15889a1c750fd61fb88aa589bc91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:50 np0005603622 python3.9[115910]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:50.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:50 np0005603622 python3.9[116033]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844589.8370855-507-226470811937097/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=107166fcbdb413835d34ab267fdea90d1d5ce832 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:51.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:51 np0005603622 python3.9[116185]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:52 np0005603622 python3.9[116337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:52.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:53 np0005603622 python3.9[116460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844592.1585326-712-34284582402124/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:53.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:53 np0005603622 python3.9[116612]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:54 np0005603622 python3.9[116764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:54.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:54 np0005603622 python3.9[116887]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844593.9241714-785-157396705919005/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:55.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:55 np0005603622 python3.9[117039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:56 np0005603622 python3.9[117191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:29:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:56.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:29:56 np0005603622 python3.9[117314]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844595.824505-863-29785500387050/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:29:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:57.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:57 np0005603622 python3.9[117466]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:57 np0005603622 python3.9[117618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:29:58 np0005603622 python3.9[117741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844597.5181813-935-280909647368690/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:29:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:29:58.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:59 np0005603622 python3.9[117893]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:29:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:29:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:29:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:29:59.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:29:59 np0005603622 python3.9[118045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:00 np0005603622 python3.9[118168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844599.2761977-1002-278821037726624/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 02:30:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:00.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:00 np0005603622 python3.9[118320]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:01.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:01 np0005603622 python3.9[118472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:02 np0005603622 python3.9[118595]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844601.1048484-1061-178928126395132/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=95f204ee8062e227608bf68163d0c9f95531c74c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:02.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:03.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:03 np0005603622 systemd[1]: session-43.scope: Deactivated successfully.
Jan 31 02:30:03 np0005603622 systemd[1]: session-43.scope: Consumed 19.511s CPU time.
Jan 31 02:30:03 np0005603622 systemd-logind[817]: Session 43 logged out. Waiting for processes to exit.
Jan 31 02:30:03 np0005603622 systemd-logind[817]: Removed session 43.
Jan 31 02:30:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:04.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:05.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:06.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:07.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:08.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:09.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:09 np0005603622 systemd-logind[817]: New session 44 of user zuul.
Jan 31 02:30:09 np0005603622 systemd[1]: Started Session 44 of User zuul.
Jan 31 02:30:10 np0005603622 python3.9[118775]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:10.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:10 np0005603622 python3.9[118927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:11.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:11 np0005603622 python3.9[119050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844610.305647-63-18453203913493/.source.conf _original_basename=ceph.conf follow=False checksum=23cbd0a652332596774a4195d9b5b25af094d504 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:11 np0005603622 python3.9[119202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:12 np0005603622 python3.9[119325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844611.5943303-63-251215172204598/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=35152db97829fbbc30ac5e5c6e1f42921e77a1a7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:12.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:12 np0005603622 systemd-logind[817]: Session 44 logged out. Waiting for processes to exit.
Jan 31 02:30:12 np0005603622 systemd[1]: session-44.scope: Deactivated successfully.
Jan 31 02:30:12 np0005603622 systemd[1]: session-44.scope: Consumed 2.370s CPU time.
Jan 31 02:30:12 np0005603622 systemd-logind[817]: Removed session 44.
Jan 31 02:30:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:13.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:14.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:15.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:16.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:17.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:18 np0005603622 systemd-logind[817]: New session 45 of user zuul.
Jan 31 02:30:18 np0005603622 systemd[1]: Started Session 45 of User zuul.
Jan 31 02:30:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:18.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:18 np0005603622 python3.9[119503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:30:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:19.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:20 np0005603622 python3.9[119659]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:20.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:20 np0005603622 python3.9[119811]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:21.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:21 np0005603622 python3.9[119961]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:30:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:22 np0005603622 python3.9[120113]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 31 02:30:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:22.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:23.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:23 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 31 02:30:24 np0005603622 python3.9[120269]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:30:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:24.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:25 np0005603622 python3.9[120353]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:30:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:25.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:26.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:27 np0005603622 python3.9[120506]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:30:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:27.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:28 np0005603622 python3[120661]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 31 02:30:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:28.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:28 np0005603622 python3.9[120813]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:29.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:29 np0005603622 python3.9[120965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:30 np0005603622 python3.9[121043]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:30.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:30 np0005603622 python3.9[121195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:31 np0005603622 python3.9[121273]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.745pqqow recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:31.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:31 np0005603622 python3.9[121425]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:32 np0005603622 python3.9[121503]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:32.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:33 np0005603622 python3.9[121655]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:33.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:33 np0005603622 python3[121808]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:30:34 np0005603622 python3.9[121960]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:34.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:35 np0005603622 python3.9[122085]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844633.9773479-432-57112828128156/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:35.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:36 np0005603622 python3.9[122237]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:36 np0005603622 python3.9[122362]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844635.4091816-477-237088695856780/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:36.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:37 np0005603622 python3.9[122514]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:37.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:37 np0005603622 python3.9[122639]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844636.7297115-522-175478989326999/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:38 np0005603622 python3.9[122791]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:38.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:38 np0005603622 python3.9[122916]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844637.9691825-567-78154706191576/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:39.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:39 np0005603622 python3.9[123068]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:40 np0005603622 python3.9[123193]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844639.177754-612-163565545829160/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:30:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:40.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:30:40 np0005603622 python3.9[123345]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:41.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:41 np0005603622 python3.9[123497]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:42 np0005603622 python3.9[123652]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:42.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:43 np0005603622 python3.9[123887]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:43.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:43 np0005603622 python3.9[124088]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:30:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:30:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:30:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:30:44 np0005603622 python3.9[124242]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:44.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:44 np0005603622 python3.9[124397]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:45.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:46 np0005603622 python3.9[124547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:30:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:46.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:47.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:47 np0005603622 python3.9[124700]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:47 np0005603622 ovs-vsctl[124701]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 31 02:30:48 np0005603622 python3.9[124853]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:48.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:48 np0005603622 python3.9[125056]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:30:48 np0005603622 ovs-vsctl[125059]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 31 02:30:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:49.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:49 np0005603622 python3.9[125209]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:30:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:30:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:30:50 np0005603622 python3.9[125363]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:50.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:51 np0005603622 python3.9[125515]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:51.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:51 np0005603622 python3.9[125593]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:52 np0005603622 python3.9[125745]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:52 np0005603622 python3.9[125823]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:30:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:52.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:53 np0005603622 python3.9[125975]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:53.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:53 np0005603622 python3.9[126127]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:54 np0005603622 python3.9[126205]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:54.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:54 np0005603622 python3.9[126357]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:30:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:55.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:30:55 np0005603622 python3.9[126435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:56 np0005603622 python3.9[126587]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:30:56 np0005603622 systemd[1]: Reloading.
Jan 31 02:30:56 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:30:56 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:30:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:56.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:57 np0005603622 python3.9[126776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:30:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:57.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:57 np0005603622 python3.9[126854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:58 np0005603622 python3.9[127006]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:30:58 np0005603622 python3.9[127084]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:30:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:30:58.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:30:59 np0005603622 python3.9[127236]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:30:59 np0005603622 systemd[1]: Reloading.
Jan 31 02:30:59 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:30:59 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:30:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:30:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:30:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:30:59.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:00 np0005603622 systemd[1]: Starting Create netns directory...
Jan 31 02:31:00 np0005603622 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:31:00 np0005603622 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:31:00 np0005603622 systemd[1]: Finished Create netns directory.
Jan 31 02:31:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:00.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:01 np0005603622 python3.9[127430]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:01 np0005603622 python3.9[127582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:01.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:02 np0005603622 python3.9[127705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844661.2316113-1365-131563818824192/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:02.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:03 np0005603622 python3.9[127857]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:03 np0005603622 python3.9[128009]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:31:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:03.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:31:04 np0005603622 python3.9[128161]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:31:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:04.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:31:05 np0005603622 python3.9[128284]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844664.1718726-1464-189767106482506/.source.json _original_basename=.cp6gp2r_ follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:05 np0005603622 python3.9[128434]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:05.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:06.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:07.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:07 np0005603622 python3.9[128857]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 31 02:31:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:08.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:08 np0005603622 python3.9[129009]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:31:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:09.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:10 np0005603622 python3[129161]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:31:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:10.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:11.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:12.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:13.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:14 np0005603622 podman[129175]: 2026-01-31 07:31:14.406663169 +0000 UTC m=+4.291826551 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 31 02:31:14 np0005603622 podman[129293]: 2026-01-31 07:31:14.593257256 +0000 UTC m=+0.088344298 container create 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 31 02:31:14 np0005603622 podman[129293]: 2026-01-31 07:31:14.527820958 +0000 UTC m=+0.022908010 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 31 02:31:14 np0005603622 python3[129161]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 31 02:31:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:14.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:15 np0005603622 python3.9[129484]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:31:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:15.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:16 np0005603622 python3.9[129638]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:16 np0005603622 python3.9[129714]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:31:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:16.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:17 np0005603622 python3.9[129865]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769844676.6684926-1698-268205832965263/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:17 np0005603622 python3.9[129941]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:31:17 np0005603622 systemd[1]: Reloading.
Jan 31 02:31:17 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:17 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:31:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:17.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:31:18 np0005603622 python3.9[130053]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:31:18 np0005603622 systemd[1]: Reloading.
Jan 31 02:31:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:18.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:18 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:18 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:18 np0005603622 systemd[1]: Starting ovn_controller container...
Jan 31 02:31:19 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:31:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3108317932c6177c9900d4111f7ab21f6c07b3116fcb96593369af4169509de/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 31 02:31:19 np0005603622 systemd[1]: Started /usr/bin/podman healthcheck run 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4.
Jan 31 02:31:19 np0005603622 podman[130094]: 2026-01-31 07:31:19.1083881 +0000 UTC m=+0.128118125 container init 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + sudo -E kolla_set_configs
Jan 31 02:31:19 np0005603622 podman[130094]: 2026-01-31 07:31:19.143166443 +0000 UTC m=+0.162896378 container start 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:31:19 np0005603622 edpm-start-podman-container[130094]: ovn_controller
Jan 31 02:31:19 np0005603622 systemd[1]: Created slice User Slice of UID 0.
Jan 31 02:31:19 np0005603622 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 31 02:31:19 np0005603622 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 31 02:31:19 np0005603622 systemd[1]: Starting User Manager for UID 0...
Jan 31 02:31:19 np0005603622 edpm-start-podman-container[130093]: Creating additional drop-in dependency for "ovn_controller" (8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4)
Jan 31 02:31:19 np0005603622 systemd[1]: Reloading.
Jan 31 02:31:19 np0005603622 podman[130116]: 2026-01-31 07:31:19.234745956 +0000 UTC m=+0.082936524 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:31:19 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:19 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:19 np0005603622 systemd[130147]: Queued start job for default target Main User Target.
Jan 31 02:31:19 np0005603622 systemd[130147]: Created slice User Application Slice.
Jan 31 02:31:19 np0005603622 systemd[130147]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 31 02:31:19 np0005603622 systemd[130147]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 02:31:19 np0005603622 systemd[130147]: Reached target Paths.
Jan 31 02:31:19 np0005603622 systemd[130147]: Reached target Timers.
Jan 31 02:31:19 np0005603622 systemd[130147]: Starting D-Bus User Message Bus Socket...
Jan 31 02:31:19 np0005603622 systemd[130147]: Starting Create User's Volatile Files and Directories...
Jan 31 02:31:19 np0005603622 systemd[130147]: Finished Create User's Volatile Files and Directories.
Jan 31 02:31:19 np0005603622 systemd[130147]: Listening on D-Bus User Message Bus Socket.
Jan 31 02:31:19 np0005603622 systemd[130147]: Reached target Sockets.
Jan 31 02:31:19 np0005603622 systemd[130147]: Reached target Basic System.
Jan 31 02:31:19 np0005603622 systemd[130147]: Reached target Main User Target.
Jan 31 02:31:19 np0005603622 systemd[130147]: Startup finished in 119ms.
Jan 31 02:31:19 np0005603622 systemd[1]: Started User Manager for UID 0.
Jan 31 02:31:19 np0005603622 systemd[1]: Started ovn_controller container.
Jan 31 02:31:19 np0005603622 systemd[1]: 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4-30f15f26c251a4ca.service: Main process exited, code=exited, status=1/FAILURE
Jan 31 02:31:19 np0005603622 systemd[1]: 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4-30f15f26c251a4ca.service: Failed with result 'exit-code'.
Jan 31 02:31:19 np0005603622 systemd[1]: Started Session c1 of User root.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: INFO:__main__:Validating config file
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: INFO:__main__:Writing out command to execute
Jan 31 02:31:19 np0005603622 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: ++ cat /run_command
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + ARGS=
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + sudo kolla_copy_cacerts
Jan 31 02:31:19 np0005603622 systemd[1]: Started Session c2 of User root.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + [[ ! -n '' ]]
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + . kolla_extend_start
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + umask 0022
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 31 02:31:19 np0005603622 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7032] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7046] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <warn>  [1769844679.7049] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:31:19 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7064] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7075] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7080] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 02:31:19 np0005603622 kernel: br-int: entered promiscuous mode
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00019|main|INFO|OVS feature set changed, force recompute.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00020|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00021|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00022|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00023|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:19Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7424] manager: (ovn-59a8b9-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7430] manager: (ovn-7ec8bf-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7438] manager: (ovn-71aaf7-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 31 02:31:19 np0005603622 systemd-udevd[130243]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:31:19 np0005603622 kernel: genev_sys_6081: entered promiscuous mode
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7617] device (genev_sys_6081): carrier: link connected
Jan 31 02:31:19 np0005603622 NetworkManager[49080]: <info>  [1769844679.7624] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Jan 31 02:31:19 np0005603622 systemd-udevd[130248]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:31:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:19.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:20 np0005603622 python3.9[130375]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 02:31:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:20.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:21 np0005603622 python3.9[130527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:21 np0005603622 python3.9[130650]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844681.0277886-1833-252022179392546/.source.yaml _original_basename=.4f2v60sf follow=False checksum=869a4744df33825307102f8d7b13c7e3fcbb8f59 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:21.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:22 np0005603622 python3.9[130802]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:31:22 np0005603622 ovs-vsctl[130803]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 31 02:31:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:22.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:23 np0005603622 python3.9[130955]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:31:23 np0005603622 ovs-vsctl[130957]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 31 02:31:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:23.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:24 np0005603622 python3.9[131110]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:31:24 np0005603622 ovs-vsctl[131111]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 31 02:31:24 np0005603622 systemd[1]: session-45.scope: Deactivated successfully.
Jan 31 02:31:24 np0005603622 systemd[1]: session-45.scope: Consumed 51.272s CPU time.
Jan 31 02:31:24 np0005603622 systemd-logind[817]: Session 45 logged out. Waiting for processes to exit.
Jan 31 02:31:24 np0005603622 systemd-logind[817]: Removed session 45.
Jan 31 02:31:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:24.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:25.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:26.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:27.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:31:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:28.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:31:29 np0005603622 systemd[1]: Stopping User Manager for UID 0...
Jan 31 02:31:29 np0005603622 systemd[130147]: Activating special unit Exit the Session...
Jan 31 02:31:29 np0005603622 systemd[130147]: Stopped target Main User Target.
Jan 31 02:31:29 np0005603622 systemd[130147]: Stopped target Basic System.
Jan 31 02:31:29 np0005603622 systemd[130147]: Stopped target Paths.
Jan 31 02:31:29 np0005603622 systemd[130147]: Stopped target Sockets.
Jan 31 02:31:29 np0005603622 systemd[130147]: Stopped target Timers.
Jan 31 02:31:29 np0005603622 systemd[130147]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 02:31:29 np0005603622 systemd[130147]: Closed D-Bus User Message Bus Socket.
Jan 31 02:31:29 np0005603622 systemd[130147]: Stopped Create User's Volatile Files and Directories.
Jan 31 02:31:29 np0005603622 systemd[130147]: Removed slice User Application Slice.
Jan 31 02:31:29 np0005603622 systemd[130147]: Reached target Shutdown.
Jan 31 02:31:29 np0005603622 systemd[130147]: Finished Exit the Session.
Jan 31 02:31:29 np0005603622 systemd[130147]: Reached target Exit the Session.
Jan 31 02:31:29 np0005603622 systemd[1]: user@0.service: Deactivated successfully.
Jan 31 02:31:29 np0005603622 systemd[1]: Stopped User Manager for UID 0.
Jan 31 02:31:29 np0005603622 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 31 02:31:29 np0005603622 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 31 02:31:29 np0005603622 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 31 02:31:29 np0005603622 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 31 02:31:29 np0005603622 systemd[1]: Removed slice User Slice of UID 0.
Jan 31 02:31:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:29.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:30 np0005603622 systemd-logind[817]: New session 47 of user zuul.
Jan 31 02:31:30 np0005603622 systemd[1]: Started Session 47 of User zuul.
Jan 31 02:31:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:30.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:31 np0005603622 python3.9[131291]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:31:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:32.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:32 np0005603622 python3.9[131447]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:32.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:33 np0005603622 python3.9[131599]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:33 np0005603622 python3.9[131751]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:34.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:34 np0005603622 python3.9[131903]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:34.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:35 np0005603622 python3.9[132055]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:35 np0005603622 python3.9[132205]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:31:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:36.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:36 np0005603622 python3.9[132357]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 31 02:31:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:31:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:36.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:31:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:31:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:38.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:31:38 np0005603622 python3.9[132512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:38.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:39 np0005603622 python3.9[132634]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844697.566017-218-67370934832039/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:39 np0005603622 python3.9[132784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:40.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:40 np0005603622 python3.9[132905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844699.2603533-263-114944062944059/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:40.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:41 np0005603622 python3.9[133057]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:31:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:42.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:42 np0005603622 python3.9[133141]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:31:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:42.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:44.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:44.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:45 np0005603622 python3.9[133294]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:31:45 np0005603622 python3.9[133447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:46.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:46 np0005603622 python3.9[133568]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844705.5625918-374-240720480316717/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:46.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:46 np0005603622 python3.9[133718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:47 np0005603622 python3.9[133839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844706.563068-374-16138347810347/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:48.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:48 np0005603622 python3.9[133989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:48.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:49 np0005603622 python3.9[134133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844708.1317182-506-57832553952776/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:49 np0005603622 python3.9[134380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:50.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:50 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:50Z|00025|memory|INFO|16512 kB peak resident set size after 30.4 seconds
Jan 31 02:31:50 np0005603622 ovn_controller[130109]: 2026-01-31T07:31:50Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 31 02:31:50 np0005603622 podman[134475]: 2026-01-31 07:31:50.093212604 +0000 UTC m=+0.088085831 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:31:50 np0005603622 python3.9[134513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844709.3033905-506-179561901118499/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:50.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:50 np0005603622 python3.9[134792]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:31:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:51 np0005603622 python3.9[134963]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:31:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:31:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:52.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:52 np0005603622 python3.9[135115]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:52 np0005603622 python3.9[135193]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:52.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:54.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:54 np0005603622 python3.9[135345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:54.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:54 np0005603622 python3.9[135423]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:31:55 np0005603622 python3.9[135575]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:31:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:56.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:31:56 np0005603622 python3.9[135727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:56 np0005603622 python3.9[135805]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:56.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:31:57 np0005603622 python3.9[135957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:31:57 np0005603622 python3.9[136085]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:31:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:31:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:31:58.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:31:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:31:58 np0005603622 python3.9[136237]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:31:58 np0005603622 systemd[1]: Reloading.
Jan 31 02:31:58 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:31:58 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:31:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:31:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:31:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:31:58.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:31:59 np0005603622 python3.9[136427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:00.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:00 np0005603622 python3.9[136505]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:00.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:01 np0005603622 python3.9[136657]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:01 np0005603622 python3.9[136735]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:02.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:32:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5744 writes, 24K keys, 5744 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5744 writes, 905 syncs, 6.35 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5744 writes, 24K keys, 5744 commit groups, 1.0 writes per commit group, ingest: 18.97 MB, 0.03 MB/s#012Interval WAL: 5744 writes, 905 syncs, 6.35 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 31 02:32:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:32:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:02.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:32:02 np0005603622 python3.9[136887]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:02 np0005603622 systemd[1]: Reloading.
Jan 31 02:32:03 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:03 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:03 np0005603622 systemd[1]: Starting Create netns directory...
Jan 31 02:32:03 np0005603622 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 31 02:32:03 np0005603622 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 31 02:32:03 np0005603622 systemd[1]: Finished Create netns directory.
Jan 31 02:32:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:04.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:04 np0005603622 python3.9[137081]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:32:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:04.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:04 np0005603622 python3.9[137233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:05 np0005603622 python3.9[137356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769844724.3804114-959-127492497675799/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:32:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:32:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:06.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:32:06 np0005603622 python3.9[137508]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:06.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:06 np0005603622 python3.9[137660]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:32:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:07 np0005603622 python3.9[137812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:08.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:08 np0005603622 python3.9[137935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844727.1206367-1058-56613925946791/.source.json _original_basename=.kvte1tmu follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:08.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:08 np0005603622 python3.9[138085]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:10.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:32:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:10.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.153056) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731153111, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1694, "num_deletes": 251, "total_data_size": 4138839, "memory_usage": 4187552, "flush_reason": "Manual Compaction"}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731170622, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2702218, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10636, "largest_seqno": 12325, "table_properties": {"data_size": 2695154, "index_size": 4135, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14061, "raw_average_key_size": 19, "raw_value_size": 2681102, "raw_average_value_size": 3698, "num_data_blocks": 187, "num_entries": 725, "num_filter_entries": 725, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844574, "oldest_key_time": 1769844574, "file_creation_time": 1769844731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 17654 microseconds, and 7806 cpu microseconds.
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.170708) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2702218 bytes OK
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.170731) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.173367) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.173392) EVENT_LOG_v1 {"time_micros": 1769844731173384, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.173413) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4131122, prev total WAL file size 4131122, number of live WAL files 2.
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.174519) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2638KB)], [21(7569KB)]
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731174581, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 10453279, "oldest_snapshot_seqno": -1}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 3982 keys, 8239307 bytes, temperature: kUnknown
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731241377, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8239307, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8210145, "index_size": 18108, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9989, "raw_key_size": 96479, "raw_average_key_size": 24, "raw_value_size": 8135709, "raw_average_value_size": 2043, "num_data_blocks": 782, "num_entries": 3982, "num_filter_entries": 3982, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769844731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.241752) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8239307 bytes
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.244121) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.1 rd, 123.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 7.4 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(6.9) write-amplify(3.0) OK, records in: 4499, records dropped: 517 output_compression: NoCompression
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.244157) EVENT_LOG_v1 {"time_micros": 1769844731244140, "job": 10, "event": "compaction_finished", "compaction_time_micros": 66948, "compaction_time_cpu_micros": 27421, "output_level": 6, "num_output_files": 1, "total_output_size": 8239307, "num_input_records": 4499, "num_output_records": 3982, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731244685, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844731245791, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.174464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.245829) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.245835) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.245838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.245841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:32:11.245844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:32:11 np0005603622 python3.9[138508]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 31 02:32:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:12.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:12 np0005603622 python3.9[138660]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:32:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:32:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:12.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:32:13 np0005603622 python3[138812]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:32:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:32:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:14.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:32:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:14.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:16.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:16.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:18.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:18.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:20.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:20.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:22.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:22.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:23 np0005603622 podman[138906]: 2026-01-31 07:32:23.387978174 +0000 UTC m=+2.337820580 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 31 02:32:23 np0005603622 podman[138825]: 2026-01-31 07:32:23.630416018 +0000 UTC m=+10.229375990 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:32:23 np0005603622 podman[138972]: 2026-01-31 07:32:23.734270586 +0000 UTC m=+0.043909264 container create ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:32:23 np0005603622 podman[138972]: 2026-01-31 07:32:23.709148961 +0000 UTC m=+0.018787659 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:32:23 np0005603622 python3[138812]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:32:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:24.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:24 np0005603622 python3.9[139162]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:32:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:24.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:25 np0005603622 python3.9[139316]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:25 np0005603622 python3.9[139392]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:32:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:26.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:26 np0005603622 python3.9[139543]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769844745.7487118-1292-272666926071/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:26.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:26 np0005603622 python3.9[139619]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:32:26 np0005603622 systemd[1]: Reloading.
Jan 31 02:32:27 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:27 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:32:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2162 writes, 12K keys, 2162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 2162 writes, 2162 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2162 writes, 12K keys, 2162 commit groups, 1.0 writes per commit group, ingest: 23.45 MB, 0.04 MB/s#012Interval WAL: 2162 writes, 2162 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    170.2      0.08              0.03         5    0.016       0      0       0.0       0.0#012  L6      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.3    168.7    141.8      0.22              0.09         4    0.056     16K   1780       0.0       0.0#012 Sum      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3    123.7    149.4      0.31              0.12         9    0.034     16K   1780       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3    124.5    150.3      0.30              0.12         8    0.038     16K   1780       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    168.7    141.8      0.22              0.09         4    0.056     16K   1780       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    174.3      0.08              0.03         4    0.020       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.014, interval 0.014#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.3 seconds#012Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 1.27 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000102 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(61,1.11 MB,0.364524%) FilterBlock(9,53.86 KB,0.0173017%) IndexBlock(9,116.48 KB,0.0374192%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 02:32:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:27 np0005603622 python3.9[139732]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:28 np0005603622 systemd[1]: Reloading.
Jan 31 02:32:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:28.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:28 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:28 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:28 np0005603622 systemd[1]: Starting ovn_metadata_agent container...
Jan 31 02:32:28 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:32:28 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d82f7c2796028c5da4820001ee42b8284c7f9df95fc05960752929c2e5e20b76/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 31 02:32:28 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d82f7c2796028c5da4820001ee42b8284c7f9df95fc05960752929c2e5e20b76/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:32:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:28.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:28 np0005603622 systemd[1]: Started /usr/bin/podman healthcheck run ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4.
Jan 31 02:32:29 np0005603622 podman[139773]: 2026-01-31 07:32:29.020647613 +0000 UTC m=+0.558913191 container init ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + sudo -E kolla_set_configs
Jan 31 02:32:29 np0005603622 podman[139773]: 2026-01-31 07:32:29.043648728 +0000 UTC m=+0.581914256 container start ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 02:32:29 np0005603622 edpm-start-podman-container[139773]: ovn_metadata_agent
Jan 31 02:32:29 np0005603622 edpm-start-podman-container[139772]: Creating additional drop-in dependency for "ovn_metadata_agent" (ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4)
Jan 31 02:32:29 np0005603622 podman[139794]: 2026-01-31 07:32:29.129164909 +0000 UTC m=+0.069480788 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Validating config file
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Copying service configuration files
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Writing out command to execute
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: ++ cat /run_command
Jan 31 02:32:29 np0005603622 systemd[1]: Reloading.
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + CMD=neutron-ovn-metadata-agent
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + ARGS=
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + sudo kolla_copy_cacerts
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + [[ ! -n '' ]]
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + . kolla_extend_start
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: Running command: 'neutron-ovn-metadata-agent'
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + umask 0022
Jan 31 02:32:29 np0005603622 ovn_metadata_agent[139788]: + exec neutron-ovn-metadata-agent
Jan 31 02:32:29 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:29 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:29 np0005603622 systemd[1]: Started ovn_metadata_agent container.
Jan 31 02:32:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:30.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:30 np0005603622 python3.9[140023]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.783 139793 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.783 139793 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.783 139793 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.784 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.784 139793 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.784 139793 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.784 139793 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.784 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.784 139793 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.784 139793 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.785 139793 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.785 139793 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.785 139793 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.785 139793 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.785 139793 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.785 139793 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.785 139793 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.786 139793 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.787 139793 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.788 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.789 139793 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.790 139793 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.791 139793 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.792 139793 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.793 139793 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.793 139793 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.793 139793 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.793 139793 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.793 139793 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.793 139793 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.793 139793 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.794 139793 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.795 139793 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.796 139793 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.796 139793 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.796 139793 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.796 139793 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.796 139793 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.796 139793 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.796 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.797 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.797 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.797 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.797 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.797 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.797 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.798 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.798 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.798 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.798 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.798 139793 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.798 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.798 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.799 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.799 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.799 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.799 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.799 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.799 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.799 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.800 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.800 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.800 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.800 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.800 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.800 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.800 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.801 139793 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.801 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.801 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.801 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.801 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.801 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.801 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.802 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.802 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.802 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.802 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.802 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.802 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.802 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.803 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.803 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.803 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.803 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.803 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.803 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.803 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.804 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.804 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.804 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.804 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.804 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.804 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.804 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.805 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.805 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.805 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.805 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.805 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.805 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.805 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.806 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.807 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.807 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.807 139793 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.807 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.807 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.807 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.808 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.808 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.808 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.808 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.808 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.808 139793 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.808 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.809 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.810 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.810 139793 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.810 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.810 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.810 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.810 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.810 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.811 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.812 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.812 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.812 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.812 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.812 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.812 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.812 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.813 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.813 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.813 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.813 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.813 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.813 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.813 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.814 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.814 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.814 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.814 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.814 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.814 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.814 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.815 139793 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.815 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.815 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.815 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.815 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.815 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.815 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.816 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.817 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.817 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.817 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.817 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.817 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.817 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.818 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.818 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.818 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.818 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.818 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.818 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.818 139793 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.819 139793 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.819 139793 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.819 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.819 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.819 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.819 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.820 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.820 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.820 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.820 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.820 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.820 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.820 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.821 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.822 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.823 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.823 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.823 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.823 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.823 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.823 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.823 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.824 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.824 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.824 139793 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.824 139793 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.835 139793 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.836 139793 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.836 139793 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.836 139793 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.837 139793 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 31 02:32:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:30.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.851 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name bd097fed-e54b-4ed7-90f0-078b39b8b13a (UUID: bd097fed-e54b-4ed7-90f0-078b39b8b13a) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.887 139793 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.887 139793 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.887 139793 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.887 139793 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.892 139793 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.898 139793 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.907 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'bd097fed-e54b-4ed7-90f0-078b39b8b13a'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], external_ids={}, name=bd097fed-e54b-4ed7-90f0-078b39b8b13a, nb_cfg_timestamp=1769844687729, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.908 139793 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f018ea87f40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.908 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.908 139793 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.909 139793 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.909 139793 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.912 139793 DEBUG oslo_service.service [-] Started child 140048 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.914 139793 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpzqq8c2wu/privsep.sock']#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.916 140048 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-493020'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.940 140048 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.941 140048 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.941 140048 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.945 140048 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.950 140048 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 31 02:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:30.956 140048 INFO eventlet.wsgi.server [-] (140048) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 31 02:32:31 np0005603622 python3.9[140180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:32:31 np0005603622 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 31 02:32:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:31.540 139793 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:32:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:31.541 139793 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpzqq8c2wu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 31 02:32:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:31.421 140181 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:32:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:31.424 140181 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:32:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:31.428 140181 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 31 02:32:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:31.428 140181 INFO oslo.privsep.daemon [-] privsep daemon running as pid 140181#033[00m
Jan 31 02:32:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:31.544 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[27d51471-f9d0-46cb-9725-a5f812ffb40c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:32:31 np0005603622 python3.9[140310]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844750.9650736-1427-194856184614299/.source.yaml _original_basename=.9qg59wkq follow=False checksum=87ad539680adb8db4e1be011e7c446590196a675 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:32.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.090 140181 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.090 140181 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.090 140181 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:32:32 np0005603622 systemd[1]: session-47.scope: Deactivated successfully.
Jan 31 02:32:32 np0005603622 systemd[1]: session-47.scope: Consumed 49.017s CPU time.
Jan 31 02:32:32 np0005603622 systemd-logind[817]: Session 47 logged out. Waiting for processes to exit.
Jan 31 02:32:32 np0005603622 systemd-logind[817]: Removed session 47.
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.639 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4728aa-c390-40eb-ab26-dfd077787dd7]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.642 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, column=external_ids, values=({'neutron:ovn-metadata-id': 'ce8f9185-70c1-5ed0-b3d0-ebe122b8d913'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.653 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.661 139793 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.662 139793 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.662 139793 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.662 139793 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.662 139793 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.662 139793 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.663 139793 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.663 139793 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.663 139793 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.664 139793 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.664 139793 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.664 139793 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.665 139793 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.665 139793 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.666 139793 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.666 139793 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.666 139793 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.667 139793 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.667 139793 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.667 139793 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.668 139793 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.668 139793 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.668 139793 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.669 139793 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.669 139793 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.670 139793 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.670 139793 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.670 139793 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.670 139793 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.671 139793 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.671 139793 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.671 139793 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.672 139793 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.672 139793 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.673 139793 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.673 139793 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.673 139793 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.674 139793 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.674 139793 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.674 139793 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.675 139793 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.675 139793 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.675 139793 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.675 139793 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.676 139793 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.676 139793 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.676 139793 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.676 139793 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.677 139793 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.677 139793 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.677 139793 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.677 139793 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.678 139793 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.678 139793 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.678 139793 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.678 139793 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.678 139793 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.679 139793 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.679 139793 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.679 139793 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.679 139793 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.680 139793 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.680 139793 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.680 139793 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.681 139793 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.681 139793 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.681 139793 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.681 139793 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.681 139793 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.682 139793 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.682 139793 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.682 139793 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.682 139793 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.683 139793 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.683 139793 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.683 139793 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.683 139793 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.684 139793 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.684 139793 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.684 139793 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.684 139793 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.685 139793 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.685 139793 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.685 139793 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.686 139793 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.686 139793 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.686 139793 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.686 139793 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.686 139793 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.687 139793 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.687 139793 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.687 139793 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.687 139793 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.688 139793 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.688 139793 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.688 139793 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.688 139793 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.689 139793 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.689 139793 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.689 139793 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.689 139793 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.690 139793 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.690 139793 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.690 139793 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.690 139793 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.690 139793 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.691 139793 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.691 139793 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.691 139793 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.691 139793 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.692 139793 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.692 139793 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.692 139793 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.692 139793 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.693 139793 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.693 139793 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.693 139793 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.693 139793 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.694 139793 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.694 139793 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.694 139793 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.694 139793 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.695 139793 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.697 139793 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.697 139793 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.697 139793 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.697 139793 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.697 139793 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.697 139793 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.698 139793 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.698 139793 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.698 139793 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.698 139793 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.698 139793 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.698 139793 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.698 139793 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.699 139793 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.699 139793 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.699 139793 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.699 139793 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.699 139793 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.699 139793 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.700 139793 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.700 139793 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.700 139793 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.700 139793 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.700 139793 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.700 139793 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.700 139793 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.701 139793 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.702 139793 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.703 139793 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.703 139793 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.703 139793 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.703 139793 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.703 139793 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.703 139793 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.703 139793 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.704 139793 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.704 139793 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.704 139793 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.704 139793 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.704 139793 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.704 139793 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.704 139793 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.705 139793 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.705 139793 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.705 139793 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.705 139793 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.705 139793 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.705 139793 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.705 139793 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.706 139793 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.706 139793 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.706 139793 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.706 139793 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.706 139793 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.706 139793 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.707 139793 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.707 139793 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.707 139793 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.707 139793 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.707 139793 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.707 139793 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.708 139793 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.709 139793 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.710 139793 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.711 139793 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.712 139793 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.713 139793 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.714 139793 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.715 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.716 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.717 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.718 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.719 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.719 139793 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.719 139793 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.719 139793 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.719 139793 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.719 139793 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:32:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:32:32.719 139793 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:32:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:32.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:34.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:34.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:36.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:36.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:37 np0005603622 systemd-logind[817]: New session 48 of user zuul.
Jan 31 02:32:37 np0005603622 systemd[1]: Started Session 48 of User zuul.
Jan 31 02:32:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:38.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:38.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:38 np0005603622 python3.9[140488]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:32:40 np0005603622 python3.9[140644]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:32:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:40.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 31 02:32:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:40.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 31 02:32:41 np0005603622 python3.9[140809]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:32:41 np0005603622 systemd[1]: Reloading.
Jan 31 02:32:41 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:32:41 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:32:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:42.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:42 np0005603622 python3.9[140993]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:32:42 np0005603622 network[141010]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:32:42 np0005603622 network[141011]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:32:42 np0005603622 network[141012]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:32:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:44.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:44.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:46.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:46.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:47 np0005603622 python3.9[141274]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:47 np0005603622 python3.9[141427]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:48.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:48 np0005603622 python3.9[141580]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:48.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:49 np0005603622 python3.9[141733]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:49 np0005603622 python3.9[141886]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:50.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:50 np0005603622 python3.9[142039]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:50.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:51 np0005603622 python3.9[142192]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:32:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:52.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:52 np0005603622 python3.9[142345]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:52 np0005603622 python3.9[142497]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 31 02:32:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:52.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 31 02:32:53 np0005603622 python3.9[142649]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:53 np0005603622 podman[142773]: 2026-01-31 07:32:53.756328753 +0000 UTC m=+0.088183223 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 02:32:53 np0005603622 python3.9[142813]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:54.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:54 np0005603622 python3.9[142980]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:54.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:55 np0005603622 python3.9[143133]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:55 np0005603622 python3.9[143285]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:56.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:56 np0005603622 python3.9[143437]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:32:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:56.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:32:57 np0005603622 python3.9[143589]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:32:57 np0005603622 python3.9[143841]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:32:58.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:58 np0005603622 python3.9[144024]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:32:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:32:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:32:58.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:32:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:32:59 np0005603622 python3.9[144176]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:32:59 np0005603622 podman[144300]: 2026-01-31 07:32:59.452474688 +0000 UTC m=+0.076048341 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:32:59 np0005603622 python3.9[144344]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:33:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:00.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:00 np0005603622 python3.9[144499]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:33:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:00.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:02.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:02 np0005603622 python3.9[144651]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:02.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:02 np0005603622 python3.9[144803]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:33:03 np0005603622 python3.9[144955]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:33:03 np0005603622 systemd[1]: Reloading.
Jan 31 02:33:03 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:33:03 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:33:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:04.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:33:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:33:04 np0005603622 python3.9[145192]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:04.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:05 np0005603622 python3.9[145345]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:05 np0005603622 python3.9[145498]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:06.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:06 np0005603622 python3.9[145651]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:33:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:06.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:33:06 np0005603622 python3.9[145804]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:07 np0005603622 python3.9[145957]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:08.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:08 np0005603622 python3.9[146110]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:33:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:08.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:10.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:10 np0005603622 python3.9[146263]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 31 02:33:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:10.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:11 np0005603622 python3.9[146416]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:33:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:33:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:12.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:33:12 np0005603622 python3.9[146574]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:33:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:33:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:12.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:33:13 np0005603622 python3.9[146734]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:33:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:14.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:14 np0005603622 python3.9[146818]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:33:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:14.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:16.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:16.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000047s ======
Jan 31 02:33:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:18.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000047s
Jan 31 02:33:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:18.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:20.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000023s ======
Jan 31 02:33:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:20.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 31 02:33:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:22.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:22.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:24.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:24 np0005603622 podman[146829]: 2026-01-31 07:33:24.349390171 +0000 UTC m=+0.260805322 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 02:33:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:24.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:26.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:26.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:28.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:28.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:30 np0005603622 podman[146856]: 2026-01-31 07:33:30.165642348 +0000 UTC m=+0.091892405 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 02:33:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:30.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:33:30.825 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:33:30.826 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:33:30.826 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:33:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:30.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:32.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:32.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:34.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:34.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:36.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:36.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:38.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:38.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:40.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:40.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:42.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:42.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:44.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:44.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:46.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:46.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:48.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:48.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:50.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:50.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:52 np0005603622 kernel: SELinux:  Converting 2779 SID table entries...
Jan 31 02:33:52 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:33:52 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:33:52 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:33:52 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:33:52 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:33:52 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:33:52 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:33:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:52.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:52.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:54.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:54.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:33:55 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 31 02:33:55 np0005603622 podman[147062]: 2026-01-31 07:33:55.190614734 +0000 UTC m=+0.101981858 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Jan 31 02:33:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:56.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:56.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:33:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:33:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:33:58.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:33:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:33:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:33:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:33:58.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:34:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:00.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:00.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:01 np0005603622 podman[147089]: 2026-01-31 07:34:01.142722467 +0000 UTC m=+0.068306618 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:34:01 np0005603622 kernel: SELinux:  Converting 2779 SID table entries...
Jan 31 02:34:01 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:34:01 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:34:01 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:34:01 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:34:01 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:34:01 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:34:01 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:34:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:34:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:02.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:34:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:02.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:03 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 31 02:34:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:04.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:04.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:34:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:34:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:34:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:06.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:06.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:08.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:34:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:09.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:34:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:10.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:34:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:34:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:11.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:13.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:34:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:14.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:34:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:15.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:34:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:16.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:34:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:34:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:17.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:34:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:18.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:19.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:34:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:20.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:34:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:34:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:21.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:34:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:34:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:22.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:34:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:23.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:24.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:25.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:26 np0005603622 podman[158852]: 2026-01-31 07:34:26.183546463 +0000 UTC m=+0.098084243 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 02:34:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:34:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:26.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:34:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:34:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:27.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:34:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:28.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:29.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:30.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:34:30.826 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:34:30.828 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:34:30.828 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:34:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:31.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:32 np0005603622 podman[164175]: 2026-01-31 07:34:32.127210305 +0000 UTC m=+0.055758459 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:34:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:32.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:33.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:34.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:35.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:34:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:36.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:34:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:37.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:38.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:34:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:39.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:34:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:40.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:34:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:41.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:34:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:42.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:43.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:44.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:45.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:46 np0005603622 kernel: SELinux:  Converting 2780 SID table entries...
Jan 31 02:34:46 np0005603622 kernel: SELinux:  policy capability network_peer_controls=1
Jan 31 02:34:46 np0005603622 kernel: SELinux:  policy capability open_perms=1
Jan 31 02:34:46 np0005603622 kernel: SELinux:  policy capability extended_socket_class=1
Jan 31 02:34:46 np0005603622 kernel: SELinux:  policy capability always_check_network=0
Jan 31 02:34:46 np0005603622 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 31 02:34:46 np0005603622 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 31 02:34:46 np0005603622 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 31 02:34:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:46.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:47.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:47 np0005603622 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Jan 31 02:34:47 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 31 02:34:47 np0005603622 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Jan 31 02:34:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:34:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:48.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:34:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:49.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:50.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:51.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:52.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:53.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:34:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:54.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:34:54 np0005603622 systemd[1]: Stopping OpenSSH server daemon...
Jan 31 02:34:54 np0005603622 systemd[1]: sshd.service: Deactivated successfully.
Jan 31 02:34:54 np0005603622 systemd[1]: Stopped OpenSSH server daemon.
Jan 31 02:34:54 np0005603622 systemd[1]: sshd.service: Consumed 1.998s CPU time, read 32.0K from disk, written 0B to disk.
Jan 31 02:34:54 np0005603622 systemd[1]: Stopped target sshd-keygen.target.
Jan 31 02:34:54 np0005603622 systemd[1]: Stopping sshd-keygen.target...
Jan 31 02:34:54 np0005603622 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 02:34:54 np0005603622 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 02:34:54 np0005603622 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 31 02:34:54 np0005603622 systemd[1]: Reached target sshd-keygen.target.
Jan 31 02:34:54 np0005603622 systemd[1]: Starting OpenSSH server daemon...
Jan 31 02:34:54 np0005603622 systemd[1]: Started OpenSSH server daemon.
Jan 31 02:34:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:55.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:56 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:34:56 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:34:56 np0005603622 systemd[1]: Reloading.
Jan 31 02:34:56 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:34:56 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:34:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:56.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:34:56 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:34:56 np0005603622 podman[165406]: 2026-01-31 07:34:56.57309975 +0000 UTC m=+0.137412874 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller)
Jan 31 02:34:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:34:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:57.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:34:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:34:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:34:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:34:58.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:34:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:34:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:34:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:34:59.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:00.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:01.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:01 np0005603622 python3.9[172795]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:01 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:01 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:01 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:02 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:35:02 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:35:02 np0005603622 systemd[1]: man-db-cache-update.service: Consumed 7.224s CPU time.
Jan 31 02:35:02 np0005603622 systemd[1]: run-r4d912cd3a35442c59b37417573f959e7.service: Deactivated successfully.
Jan 31 02:35:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:35:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:02.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:35:02 np0005603622 python3.9[174066]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:02 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:02 np0005603622 podman[174069]: 2026-01-31 07:35:02.489091996 +0000 UTC m=+0.048652938 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:35:02 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:02 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:35:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:03.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:35:03 np0005603622 python3.9[174276]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:03 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:03 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:03 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:04.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:04 np0005603622 python3.9[174466]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:04 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:04 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:04 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:35:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:05.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:35:05 np0005603622 python3.9[174656]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:05 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:05 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:05 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:06.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:06 np0005603622 python3.9[174845]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:06 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:06 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:06 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:07.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:07 np0005603622 python3.9[175034]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:07 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:07 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:07 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:08.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:08 np0005603622 python3.9[175224]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:09.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:09 np0005603622 python3.9[175379]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:09 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:09 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:09 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:10.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:10 np0005603622 python3.9[175668]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 31 02:35:10 np0005603622 systemd[1]: Reloading.
Jan 31 02:35:10 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:35:10 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:35:10 np0005603622 podman[175740]: 2026-01-31 07:35:10.634887801 +0000 UTC m=+0.362252191 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 31 02:35:10 np0005603622 podman[175740]: 2026-01-31 07:35:10.752365832 +0000 UTC m=+0.479730162 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 31 02:35:10 np0005603622 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 31 02:35:10 np0005603622 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 31 02:35:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:11.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:11 np0005603622 python3.9[176022]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:12.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:12 np0005603622 python3.9[176341]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:35:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:35:13 np0005603622 python3.9[176496]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:13.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:13 np0005603622 python3.9[176651]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:14.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:14 np0005603622 python3.9[176806]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:15.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:15 np0005603622 python3.9[176961]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:16 np0005603622 python3.9[177116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:16.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:17 np0005603622 python3.9[177271]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:17.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:17 np0005603622 python3.9[177426]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:18.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:18 np0005603622 python3.9[177581]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 31 02:35:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:19.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 31 02:35:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:35:19 np0005603622 python3.9[177786]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:19 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 31 02:35:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:19.975321) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:35:19 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 31 02:35:19 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844919975483, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1944, "num_deletes": 251, "total_data_size": 4913174, "memory_usage": 4971800, "flush_reason": "Manual Compaction"}
Jan 31 02:35:19 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920139799, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 3225713, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12330, "largest_seqno": 14269, "table_properties": {"data_size": 3217646, "index_size": 4946, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 15085, "raw_average_key_size": 18, "raw_value_size": 3201662, "raw_average_value_size": 3962, "num_data_blocks": 221, "num_entries": 808, "num_filter_entries": 808, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844732, "oldest_key_time": 1769844732, "file_creation_time": 1769844919, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 164554 microseconds, and 9396 cpu microseconds.
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.139877) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 3225713 bytes OK
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.139900) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.141680) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.141701) EVENT_LOG_v1 {"time_micros": 1769844920141694, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.141724) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 4904463, prev total WAL file size 4904463, number of live WAL files 2.
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.142860) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323531' seq:0, type:0; will stop at (end)
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(3150KB)], [24(8046KB)]
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920143176, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 11465020, "oldest_snapshot_seqno": -1}
Jan 31 02:35:20 np0005603622 python3.9[177941]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:20.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4270 keys, 10961207 bytes, temperature: kUnknown
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920424998, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 10961207, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10927618, "index_size": 21831, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10693, "raw_key_size": 103980, "raw_average_key_size": 24, "raw_value_size": 10845577, "raw_average_value_size": 2539, "num_data_blocks": 929, "num_entries": 4270, "num_filter_entries": 4270, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769844920, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.425235) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 10961207 bytes
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.448817) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 40.7 rd, 38.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 7.9 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(7.0) write-amplify(3.4) OK, records in: 4790, records dropped: 520 output_compression: NoCompression
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.448845) EVENT_LOG_v1 {"time_micros": 1769844920448833, "job": 12, "event": "compaction_finished", "compaction_time_micros": 281898, "compaction_time_cpu_micros": 30483, "output_level": 6, "num_output_files": 1, "total_output_size": 10961207, "num_input_records": 4790, "num_output_records": 4270, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920449197, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769844920449932, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.142783) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.450041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.450048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.450051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.450053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:20 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:35:20.450055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:35:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:21.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:21 np0005603622 python3.9[178096]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:21 np0005603622 python3.9[178251]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 31 02:35:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:22.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:22 np0005603622 python3.9[178406]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:23.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:23 np0005603622 python3.9[178558]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:23 np0005603622 python3.9[178710]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:35:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:24.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:35:24 np0005603622 python3.9[178862]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:25 np0005603622 python3.9[179014]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:25.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:25 np0005603622 python3.9[179166]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:35:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:26.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:26 np0005603622 python3.9[179316]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:35:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 31 02:35:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:27.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 31 02:35:27 np0005603622 podman[179381]: 2026-01-31 07:35:27.161940073 +0000 UTC m=+0.089305504 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 02:35:27 np0005603622 python3.9[179495]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:28 np0005603622 python3.9[179620]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844927.0213106-1647-254994583221744/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:28.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:28 np0005603622 python3.9[179772]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:29.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:29 np0005603622 python3.9[179897]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844928.4166472-1647-264357561533683/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:30 np0005603622 python3.9[180049]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:30.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:30 np0005603622 python3.9[180174]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844929.5079243-1647-14073711820846/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:35:30.827 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:35:30.827 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:35:30.827 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:35:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:31.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:31 np0005603622 python3.9[180326]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:31 np0005603622 python3.9[180451]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844930.8085399-1647-138705517624375/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:32.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:32 np0005603622 python3.9[180603]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:32 np0005603622 podman[180700]: 2026-01-31 07:35:32.896067076 +0000 UTC m=+0.052921939 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:35:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:33 np0005603622 python3.9[180746]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844931.9813488-1647-183776037694196/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:33.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:33 np0005603622 python3.9[180900]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:34 np0005603622 python3.9[181025]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844933.2188094-1647-192243597927946/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:34.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:34 np0005603622 python3.9[181177]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:35.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:35 np0005603622 python3.9[181300]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844934.330304-1647-17507166447849/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:35 np0005603622 python3.9[181452]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:35:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:36.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:35:36 np0005603622 python3.9[181577]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769844935.4492273-1647-106161601571322/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:37 np0005603622 python3.9[181729]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 31 02:35:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:37.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:37 np0005603622 python3.9[181882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:38.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:38 np0005603622 python3.9[182034]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:39 np0005603622 python3.9[182186]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:35:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:39.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:35:39 np0005603622 python3.9[182338]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:40 np0005603622 python3.9[182490]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:40.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:40 np0005603622 python3.9[182642]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:41.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:41 np0005603622 python3.9[182794]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:41 np0005603622 python3.9[182946]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:42.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:42 np0005603622 python3.9[183098]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:43 np0005603622 python3.9[183250]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:35:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:43.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:35:43 np0005603622 python3.9[183402]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:44 np0005603622 python3.9[183554]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:44.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:44 np0005603622 python3.9[183706]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:45.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:45 np0005603622 python3.9[183858]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:46 np0005603622 python3.9[184010]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:46.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:46 np0005603622 python3.9[184133]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844945.7600617-2310-73395882585141/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:35:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:47.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:35:47 np0005603622 python3.9[184285]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:47 np0005603622 python3.9[184408]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844946.9457154-2310-63500672795177/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:48.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:48 np0005603622 python3.9[184560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:48 np0005603622 python3.9[184683]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844948.0190916-2310-123802059320439/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:49.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:49 np0005603622 python3.9[184835]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:49 np0005603622 python3.9[184958]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844949.0528896-2310-42327972504759/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:50.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:50 np0005603622 python3.9[185110]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:50 np0005603622 python3.9[185233]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844950.0868826-2310-141928640882291/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:35:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:51.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:35:51 np0005603622 python3.9[185385]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:52 np0005603622 python3.9[185508]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844951.143411-2310-237943157910679/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:52.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:52 np0005603622 python3.9[185660]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:53.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:53 np0005603622 python3.9[185783]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844952.2707772-2310-220104175596360/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:53 np0005603622 python3.9[185935]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:54.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:54 np0005603622 python3.9[186058]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844953.394442-2310-87858098489152/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:55 np0005603622 python3.9[186210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:35:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:55.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:35:55 np0005603622 python3.9[186333]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844954.529573-2310-220017091423903/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:56 np0005603622 python3.9[186485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:56.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:56 np0005603622 python3.9[186608]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844955.7492769-2310-74849871824666/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:57.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:35:57 np0005603622 podman[186760]: 2026-01-31 07:35:57.25746955 +0000 UTC m=+0.063717861 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 02:35:57 np0005603622 python3.9[186761]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:57 np0005603622 python3.9[186910]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844956.9267857-2310-267848896209649/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:35:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:35:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:35:58.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:35:58 np0005603622 python3.9[187062]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:35:59 np0005603622 python3.9[187185]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844957.9998245-2310-195067889938779/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:35:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:35:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:35:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:35:59.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:00.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:00 np0005603622 python3.9[187337]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:01 np0005603622 python3.9[187460]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844959.1431515-2310-187839455368253/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:01.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:01 np0005603622 python3.9[187612]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:02 np0005603622 python3.9[187735]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844961.193497-2310-65462689197686/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:02.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:03 np0005603622 python3.9[187885]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:03 np0005603622 podman[187886]: 2026-01-31 07:36:03.130933802 +0000 UTC m=+0.060030191 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:36:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:03.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:03 np0005603622 python3.9[188059]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 31 02:36:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:04.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:36:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:36:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:06.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:06 np0005603622 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 31 02:36:06 np0005603622 python3.9[188215]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:07.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:07 np0005603622 python3.9[188367]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:07 np0005603622 python3.9[188519]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:08.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:08 np0005603622 python3.9[188671]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:09 np0005603622 python3.9[188823]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:09.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:09 np0005603622 python3.9[188975]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:10.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:10 np0005603622 python3.9[189127]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:11 np0005603622 python3.9[189279]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:36:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:11.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:36:11 np0005603622 python3.9[189431]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:12 np0005603622 python3.9[189583]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:12.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:13.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:13 np0005603622 python3.9[189735]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:13 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:13 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:13 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:13 np0005603622 systemd[1]: Starting libvirt logging daemon socket...
Jan 31 02:36:13 np0005603622 systemd[1]: Listening on libvirt logging daemon socket.
Jan 31 02:36:13 np0005603622 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 31 02:36:13 np0005603622 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 31 02:36:13 np0005603622 systemd[1]: Starting libvirt logging daemon...
Jan 31 02:36:13 np0005603622 systemd[1]: Started libvirt logging daemon.
Jan 31 02:36:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:14.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:14 np0005603622 python3.9[189928]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:14 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:14 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:14 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:14 np0005603622 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 31 02:36:14 np0005603622 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 31 02:36:14 np0005603622 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 31 02:36:14 np0005603622 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 31 02:36:14 np0005603622 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 31 02:36:14 np0005603622 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 31 02:36:14 np0005603622 systemd[1]: Starting libvirt nodedev daemon...
Jan 31 02:36:14 np0005603622 systemd[1]: Started libvirt nodedev daemon.
Jan 31 02:36:15 np0005603622 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 31 02:36:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:15.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:15 np0005603622 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 31 02:36:15 np0005603622 python3.9[190145]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:15 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:15 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:15 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:15 np0005603622 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 31 02:36:15 np0005603622 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 31 02:36:15 np0005603622 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 31 02:36:15 np0005603622 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 31 02:36:15 np0005603622 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 31 02:36:15 np0005603622 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 31 02:36:15 np0005603622 systemd[1]: Starting libvirt proxy daemon...
Jan 31 02:36:15 np0005603622 systemd[1]: Started libvirt proxy daemon.
Jan 31 02:36:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:16.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:16 np0005603622 python3.9[190365]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:16 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:16 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:16 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:16 np0005603622 setroubleshoot[190040]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 21ead114-1335-480b-a576-774847d1035f
Jan 31 02:36:16 np0005603622 setroubleshoot[190040]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 31 02:36:16 np0005603622 setroubleshoot[190040]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 21ead114-1335-480b-a576-774847d1035f
Jan 31 02:36:16 np0005603622 setroubleshoot[190040]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 31 02:36:16 np0005603622 systemd[1]: Listening on libvirt locking daemon socket.
Jan 31 02:36:16 np0005603622 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 31 02:36:16 np0005603622 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 31 02:36:16 np0005603622 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 31 02:36:16 np0005603622 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 31 02:36:16 np0005603622 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 31 02:36:16 np0005603622 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 31 02:36:16 np0005603622 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 31 02:36:16 np0005603622 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 31 02:36:16 np0005603622 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 31 02:36:16 np0005603622 systemd[1]: Starting libvirt QEMU daemon...
Jan 31 02:36:16 np0005603622 systemd[1]: Started libvirt QEMU daemon.
Jan 31 02:36:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:17.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:17 np0005603622 python3.9[190581]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:36:17 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:17 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:17 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:18 np0005603622 systemd[1]: Starting libvirt secret daemon socket...
Jan 31 02:36:18 np0005603622 systemd[1]: Listening on libvirt secret daemon socket.
Jan 31 02:36:18 np0005603622 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 31 02:36:18 np0005603622 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 31 02:36:18 np0005603622 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 31 02:36:18 np0005603622 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 31 02:36:18 np0005603622 systemd[1]: Starting libvirt secret daemon...
Jan 31 02:36:18 np0005603622 systemd[1]: Started libvirt secret daemon.
Jan 31 02:36:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:18.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:19 np0005603622 python3.9[190891]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:19.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:19 np0005603622 python3.9[191077]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:36:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:20.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:20 np0005603622 python3.9[191229]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:21.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:21 np0005603622 python3.9[191383]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:36:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:36:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:36:22 np0005603622 python3.9[191533]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:22.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:22 np0005603622 python3.9[191654]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844981.691835-3384-10630939729912/.source.xml follow=False _original_basename=secret.xml.j2 checksum=450e5279e3f961806683176060af91f2a100b4e1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:23.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:23 np0005603622 python3.9[191806]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:24.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:24 np0005603622 python3.9[191968]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:25.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:36:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:26.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:36:26 np0005603622 python3.9[192431]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:26 np0005603622 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 31 02:36:26 np0005603622 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 31 02:36:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:27.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:27 np0005603622 podman[192555]: 2026-01-31 07:36:27.536080462 +0000 UTC m=+0.148282982 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 02:36:27 np0005603622 python3.9[192596]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:28 np0005603622 python3.9[192732]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769844987.1388931-3549-258896058427578/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:28.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:28 np0005603622 python3.9[192884]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:29.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:29 np0005603622 python3.9[193036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:30 np0005603622 python3.9[193114]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:30.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:30 np0005603622 python3.9[193266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:36:30.827 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:36:30.828 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:36:30.828 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:36:31 np0005603622 python3.9[193344]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.38g5zjcq recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:31.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:31 np0005603622 python3.9[193546]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:36:32 np0005603622 python3.9[193624]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:32.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:32 np0005603622 python3.9[193776]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:33.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:33 np0005603622 podman[193901]: 2026-01-31 07:36:33.492078213 +0000 UTC m=+0.076749233 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 31 02:36:33 np0005603622 python3[193943]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 31 02:36:34 np0005603622 python3.9[194100]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:34.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:34 np0005603622 python3.9[194178]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:35.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:35 np0005603622 python3.9[194330]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:35 np0005603622 python3.9[194455]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844994.8506172-3816-261500983043357/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:36:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:36.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:36:36 np0005603622 python3.9[194607]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:37 np0005603622 python3.9[194685]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:37.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:37 np0005603622 auditd[699]: Audit daemon rotating log files
Jan 31 02:36:37 np0005603622 python3.9[194837]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:38 np0005603622 python3.9[194915]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:38.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:38 np0005603622 python3.9[195067]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:39.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:39 np0005603622 python3.9[195192]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769844998.4043047-3933-16358644106247/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:40 np0005603622 python3.9[195344]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:40.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:40 np0005603622 python3.9[195496]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:41.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:41 np0005603622 python3.9[195651]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:42.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:42 np0005603622 python3.9[195803]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:43 np0005603622 python3.9[195956]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:36:44 np0005603622 python3.9[196110]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:36:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:44.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:44 np0005603622 python3.9[196265]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:45.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:45 np0005603622 python3.9[196417]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:46 np0005603622 python3.9[196540]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845005.261353-4149-241933689843101/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:46.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:46 np0005603622 python3.9[196692]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:36:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:47.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:36:47 np0005603622 python3.9[196815]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845006.4630847-4195-190273777005039/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:48 np0005603622 python3.9[196967]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:36:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:48.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:48 np0005603622 python3.9[197090]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845007.6398091-4240-181257021472904/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:36:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:49.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:49 np0005603622 python3.9[197242]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:36:49 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:49 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:49 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:49 np0005603622 systemd[1]: Reached target edpm_libvirt.target.
Jan 31 02:36:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:50.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:50 np0005603622 python3.9[197433]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 31 02:36:50 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:50 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:50 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:50 np0005603622 systemd[1]: Reloading.
Jan 31 02:36:51 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:36:51 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:36:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:51.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:51 np0005603622 systemd[1]: session-48.scope: Deactivated successfully.
Jan 31 02:36:51 np0005603622 systemd[1]: session-48.scope: Consumed 2min 56.925s CPU time.
Jan 31 02:36:51 np0005603622 systemd-logind[817]: Session 48 logged out. Waiting for processes to exit.
Jan 31 02:36:51 np0005603622 systemd-logind[817]: Removed session 48.
Jan 31 02:36:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:52.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:36:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:53.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:36:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:54.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:36:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:55.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:36:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:56.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:57 np0005603622 systemd-logind[817]: New session 49 of user zuul.
Jan 31 02:36:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:57.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:57 np0005603622 systemd[1]: Started Session 49 of User zuul.
Jan 31 02:36:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:36:58 np0005603622 podman[197657]: 2026-01-31 07:36:58.035186546 +0000 UTC m=+0.085404187 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 02:36:58 np0005603622 python3.9[197696]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:36:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:36:58.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:36:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:36:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:36:59.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:36:59 np0005603622 python3.9[197863]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:36:59 np0005603622 network[197880]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:36:59 np0005603622 network[197881]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:36:59 np0005603622 network[197882]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:37:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:00.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:01.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:02.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:03.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:03 np0005603622 podman[198126]: 2026-01-31 07:37:03.923275067 +0000 UTC m=+0.078068841 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 31 02:37:04 np0005603622 python3.9[198173]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 31 02:37:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:04.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:05 np0005603622 python3.9[198258]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:37:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:05.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:06.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:07.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:08.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:09.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:10.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:11 np0005603622 python3.9[198411]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:11.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:12 np0005603622 python3.9[198563]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:12.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:12 np0005603622 python3.9[198716]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:13.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:13 np0005603622 python3.9[198868]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:14 np0005603622 python3.9[199021]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:14.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:14 np0005603622 python3.9[199144]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845033.8212607-246-38272353088268/.source.iscsi _original_basename=.qidpc5t8 follow=False checksum=8b527706ea0636a5ebc18d31b6a60b607d85e6b6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:15.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:15 np0005603622 python3.9[199296]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:16.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:16 np0005603622 python3.9[199448]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:16 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:37:16 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:37:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:17.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:17 np0005603622 python3.9[199601]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:17 np0005603622 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 31 02:37:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:18.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:18 np0005603622 python3.9[199757]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:18 np0005603622 systemd[1]: Reloading.
Jan 31 02:37:18 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:18 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:19 np0005603622 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 31 02:37:19 np0005603622 systemd[1]: Starting Open-iSCSI...
Jan 31 02:37:19 np0005603622 kernel: Loading iSCSI transport class v2.0-870.
Jan 31 02:37:19 np0005603622 systemd[1]: Started Open-iSCSI.
Jan 31 02:37:19 np0005603622 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 31 02:37:19 np0005603622 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 31 02:37:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:19.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:20 np0005603622 python3.9[199956]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:37:20 np0005603622 network[199973]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:37:20 np0005603622 network[199974]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:37:20 np0005603622 network[199975]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:37:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:20.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:37:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:21.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:37:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:22.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:23.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:23 np0005603622 python3.9[200247]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:37:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:24.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:25.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:26 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:37:26 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:37:26 np0005603622 systemd[1]: Reloading.
Jan 31 02:37:26 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:26 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:26 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:37:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:26.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:26 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:37:26 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:37:26 np0005603622 systemd[1]: run-rba3b711eed664ea4a1aa95e8cb424c0a.service: Deactivated successfully.
Jan 31 02:37:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:27.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:28 np0005603622 python3.9[200562]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 31 02:37:28 np0005603622 podman[200566]: 2026-01-31 07:37:28.160515417 +0000 UTC m=+0.083827765 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 02:37:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:28.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:28 np0005603622 python3.9[200740]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 31 02:37:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:29.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:29 np0005603622 python3.9[200896]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:30 np0005603622 python3.9[201019]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845049.071158-510-50894956416552/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:30.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:30 np0005603622 python3.9[201171]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:37:30.828 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:37:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:37:30.829 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:37:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:37:30.830 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:37:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:31.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:32 np0005603622 python3.9[201441]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:37:32 np0005603622 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 31 02:37:32 np0005603622 systemd[1]: Stopped Load Kernel Modules.
Jan 31 02:37:32 np0005603622 systemd[1]: Stopping Load Kernel Modules...
Jan 31 02:37:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:37:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:37:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:37:32 np0005603622 systemd[1]: Starting Load Kernel Modules...
Jan 31 02:37:32 np0005603622 systemd[1]: Finished Load Kernel Modules.
Jan 31 02:37:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:32.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:32 np0005603622 python3.9[201612]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:33.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:33 np0005603622 python3.9[201765]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:34 np0005603622 podman[201842]: 2026-01-31 07:37:34.130838368 +0000 UTC m=+0.052592112 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 02:37:34 np0005603622 python3.9[201936]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:34.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:34 np0005603622 python3.9[202059]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845053.9864473-663-260394477797669/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:35.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:35 np0005603622 python3.9[202211]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:36 np0005603622 python3.9[202364]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:36.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:37 np0005603622 python3.9[202516]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:37:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:37.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:37:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:37:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:37:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:38.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:38 np0005603622 python3.9[202718]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:39.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:39 np0005603622 python3.9[202870]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:40.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:40 np0005603622 python3.9[203022]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:41 np0005603622 python3.9[203174]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:41.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:41 np0005603622 python3.9[203326]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:37:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:42.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:37:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:43 np0005603622 python3.9[203478]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:37:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:43.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:43 np0005603622 python3.9[203632]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:37:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:44.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:45 np0005603622 python3.9[203785]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:37:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:45.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:37:46 np0005603622 systemd[1]: Listening on multipathd control socket.
Jan 31 02:37:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:46.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:46 np0005603622 python3.9[203941]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:37:46 np0005603622 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 31 02:37:47 np0005603622 udevadm[203946]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 31 02:37:47 np0005603622 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 31 02:37:47 np0005603622 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 31 02:37:47 np0005603622 multipathd[203950]: --------start up--------
Jan 31 02:37:47 np0005603622 multipathd[203950]: read /etc/multipath.conf
Jan 31 02:37:47 np0005603622 multipathd[203950]: path checkers start up
Jan 31 02:37:47 np0005603622 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 31 02:37:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:47.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:48 np0005603622 python3.9[204109]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 31 02:37:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:37:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:48.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:37:49 np0005603622 python3.9[204261]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 31 02:37:49 np0005603622 kernel: Key type psk registered
Jan 31 02:37:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:49.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:50 np0005603622 python3.9[204424]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:37:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:50.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:50 np0005603622 python3.9[204547]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769845069.5206811-1053-169857312670669/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:51.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:51 np0005603622 python3.9[204699]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:37:52 np0005603622 python3.9[204851]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:37:52 np0005603622 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 31 02:37:52 np0005603622 systemd[1]: Stopped Load Kernel Modules.
Jan 31 02:37:52 np0005603622 systemd[1]: Stopping Load Kernel Modules...
Jan 31 02:37:52 np0005603622 systemd[1]: Starting Load Kernel Modules...
Jan 31 02:37:52 np0005603622 systemd[1]: Finished Load Kernel Modules.
Jan 31 02:37:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:37:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:52.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:37:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:37:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:53.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:37:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:54.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.850203) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074850449, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1847, "num_deletes": 501, "total_data_size": 4014913, "memory_usage": 4056200, "flush_reason": "Manual Compaction"}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074865312, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1530405, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14274, "largest_seqno": 16116, "table_properties": {"data_size": 1524843, "index_size": 2317, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 16629, "raw_average_key_size": 19, "raw_value_size": 1510835, "raw_average_value_size": 1744, "num_data_blocks": 107, "num_entries": 866, "num_filter_entries": 866, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844921, "oldest_key_time": 1769844921, "file_creation_time": 1769845074, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 15174 microseconds, and 5902 cpu microseconds.
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.865389) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1530405 bytes OK
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.865424) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.868078) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.868103) EVENT_LOG_v1 {"time_micros": 1769845074868095, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.868133) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 4005599, prev total WAL file size 4005599, number of live WAL files 2.
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.869270) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1494KB)], [27(10MB)]
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074869483, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 12491612, "oldest_snapshot_seqno": -1}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4192 keys, 8056917 bytes, temperature: kUnknown
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074943901, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 8056917, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8027506, "index_size": 17845, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10501, "raw_key_size": 103476, "raw_average_key_size": 24, "raw_value_size": 7950306, "raw_average_value_size": 1896, "num_data_blocks": 752, "num_entries": 4192, "num_filter_entries": 4192, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769845074, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.944119) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 8056917 bytes
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.945158) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.7 rd, 108.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 10.5 +0.0 blob) out(7.7 +0.0 blob), read-write-amplify(13.4) write-amplify(5.3) OK, records in: 5136, records dropped: 944 output_compression: NoCompression
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.945172) EVENT_LOG_v1 {"time_micros": 1769845074945165, "job": 14, "event": "compaction_finished", "compaction_time_micros": 74503, "compaction_time_cpu_micros": 29842, "output_level": 6, "num_output_files": 1, "total_output_size": 8056917, "num_input_records": 5136, "num_output_records": 4192, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074945452, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845074946184, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.869158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.946387) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.946393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.946395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.946397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:37:54.946399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:37:55 np0005603622 python3.9[205007]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 31 02:37:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:55.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:56.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:57 np0005603622 systemd[1]: Reloading.
Jan 31 02:37:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:57.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:57 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:57 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:57 np0005603622 systemd[1]: Reloading.
Jan 31 02:37:57 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:57 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:37:58 np0005603622 systemd-logind[817]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 31 02:37:58 np0005603622 systemd-logind[817]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 31 02:37:58 np0005603622 lvm[205124]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 02:37:58 np0005603622 lvm[205124]: VG ceph_vg0 finished
Jan 31 02:37:58 np0005603622 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 31 02:37:58 np0005603622 systemd[1]: Starting man-db-cache-update.service...
Jan 31 02:37:58 np0005603622 systemd[1]: Reloading.
Jan 31 02:37:58 np0005603622 podman[205142]: 2026-01-31 07:37:58.294340228 +0000 UTC m=+0.098918068 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 02:37:58 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:37:58 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:37:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:37:58.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:58 np0005603622 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 31 02:37:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:37:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:37:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:37:59.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:37:59 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 02:38:00 np0005603622 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 31 02:38:00 np0005603622 systemd[1]: Finished man-db-cache-update.service.
Jan 31 02:38:00 np0005603622 systemd[1]: man-db-cache-update.service: Consumed 1.202s CPU time.
Jan 31 02:38:00 np0005603622 systemd[1]: run-r543810cbc5a142d4b97960fd124c8171.service: Deactivated successfully.
Jan 31 02:38:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:00.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:00 np0005603622 python3.9[206503]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:38:00 np0005603622 systemd[1]: Stopping Open-iSCSI...
Jan 31 02:38:00 np0005603622 iscsid[199797]: iscsid shutting down.
Jan 31 02:38:00 np0005603622 systemd[1]: iscsid.service: Deactivated successfully.
Jan 31 02:38:00 np0005603622 systemd[1]: Stopped Open-iSCSI.
Jan 31 02:38:00 np0005603622 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 31 02:38:00 np0005603622 systemd[1]: Starting Open-iSCSI...
Jan 31 02:38:00 np0005603622 systemd[1]: Started Open-iSCSI.
Jan 31 02:38:01 np0005603622 python3.9[206659]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:38:01 np0005603622 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 31 02:38:01 np0005603622 multipathd[203950]: exit (signal)
Jan 31 02:38:01 np0005603622 multipathd[203950]: --------shut down-------
Jan 31 02:38:01 np0005603622 systemd[1]: multipathd.service: Deactivated successfully.
Jan 31 02:38:01 np0005603622 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 31 02:38:01 np0005603622 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 31 02:38:01 np0005603622 multipathd[206665]: --------start up--------
Jan 31 02:38:01 np0005603622 multipathd[206665]: read /etc/multipath.conf
Jan 31 02:38:01 np0005603622 multipathd[206665]: path checkers start up
Jan 31 02:38:01 np0005603622 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 31 02:38:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:01.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:02 np0005603622 python3.9[206822]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 31 02:38:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:02.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:03 np0005603622 python3.9[206978]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:03.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:04 np0005603622 podman[207102]: 2026-01-31 07:38:04.275098164 +0000 UTC m=+0.083380065 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 02:38:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:04.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:04 np0005603622 python3.9[207147]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:38:04 np0005603622 systemd[1]: Reloading.
Jan 31 02:38:04 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:38:04 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:38:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:05.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:05 np0005603622 python3.9[207333]: ansible-ansible.builtin.service_facts Invoked
Jan 31 02:38:05 np0005603622 network[207350]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 31 02:38:05 np0005603622 network[207351]: 'network-scripts' will be removed from distribution in near future.
Jan 31 02:38:05 np0005603622 network[207352]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 31 02:38:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:06.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:07.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:08.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:09.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:10 np0005603622 python3.9[207625]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:38:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:10.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:38:11 np0005603622 python3.9[207778]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:11.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:11 np0005603622 python3.9[207931]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:12.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:12 np0005603622 python3.9[208084]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:13 np0005603622 python3.9[208237]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:13.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:14 np0005603622 python3.9[208390]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:14.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:14 np0005603622 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 31 02:38:15 np0005603622 python3.9[208543]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:15.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:15 np0005603622 python3.9[208697]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:38:15 np0005603622 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 31 02:38:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:16.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:16 np0005603622 python3.9[208851]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:17 np0005603622 python3.9[209003]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:17.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:18 np0005603622 python3.9[209155]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:18.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:18 np0005603622 python3.9[209307]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:19 np0005603622 python3.9[209459]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:19.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:19 np0005603622 python3.9[209611]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:20.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:20 np0005603622 python3.9[209763]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:21 np0005603622 python3.9[209915]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:21.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:22 np0005603622 python3.9[210067]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:22.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:22 np0005603622 python3.9[210219]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:23 np0005603622 python3.9[210371]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:23.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:24 np0005603622 python3.9[210523]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:24.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:25 np0005603622 python3.9[210675]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:25.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:25 np0005603622 python3.9[210827]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:26 np0005603622 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 31 02:38:26 np0005603622 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 31 02:38:26 np0005603622 python3.9[210981]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:26.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:26 np0005603622 python3.9[211133]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:27.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:27 np0005603622 python3.9[211285]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:28.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:28 np0005603622 python3.9[211437]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 31 02:38:29 np0005603622 podman[211514]: 2026-01-31 07:38:29.192632473 +0000 UTC m=+0.105309757 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 02:38:29 np0005603622 python3.9[211615]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:38:29 np0005603622 systemd[1]: Reloading.
Jan 31 02:38:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:38:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:29.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:38:29 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:38:29 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:38:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.003000080s ======
Jan 31 02:38:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:30.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 31 02:38:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:38:30.829 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:38:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:38:30.831 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:38:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:38:30.831 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:38:31 np0005603622 python3.9[211803]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:31.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:31 np0005603622 python3.9[211956]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:32 np0005603622 python3.9[212109]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:32.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:32 np0005603622 python3.9[212262]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:33 np0005603622 python3.9[212415]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:38:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:33.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:38:34 np0005603622 python3.9[212568]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:34 np0005603622 podman[212693]: 2026-01-31 07:38:34.438568019 +0000 UTC m=+0.056081620 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 31 02:38:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:34.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:34 np0005603622 python3.9[212735]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:35 np0005603622 python3.9[212890]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 31 02:38:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:35.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:36.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:37 np0005603622 python3.9[213043]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:37.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:37 np0005603622 python3.9[213195]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:38 np0005603622 python3.9[213370]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:38.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:39 np0005603622 python3.9[213630]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:39.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:39 np0005603622 python3.9[213782]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:40 np0005603622 python3.9[213934]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:40.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:40 np0005603622 python3.9[214086]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:38:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:38:41 np0005603622 python3.9[214238]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:41.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:42 np0005603622 python3.9[214390]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:42.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:42 np0005603622 python3.9[214542]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:43.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:44.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:45.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:38:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:38:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:46.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:38:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:47.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:48.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:48 np0005603622 python3.9[214744]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 31 02:38:49 np0005603622 python3.9[214897]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 31 02:38:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:49.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:50.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:50 np0005603622 python3.9[215055]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 31 02:38:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:51.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:52 np0005603622 systemd-logind[817]: New session 50 of user zuul.
Jan 31 02:38:52 np0005603622 systemd[1]: Started Session 50 of User zuul.
Jan 31 02:38:52 np0005603622 systemd[1]: session-50.scope: Deactivated successfully.
Jan 31 02:38:52 np0005603622 systemd-logind[817]: Session 50 logged out. Waiting for processes to exit.
Jan 31 02:38:52 np0005603622 systemd-logind[817]: Removed session 50.
Jan 31 02:38:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:52.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:52 np0005603622 python3.9[215241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:53 np0005603622 python3.9[215362]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845132.4138532-2661-177881127693453/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:38:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:53.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:38:53 np0005603622 python3.9[215512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:54 np0005603622 python3.9[215588]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:54.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:54 np0005603622 python3.9[215738]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:55 np0005603622 python3.9[215859]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845134.527622-2661-34990131679157/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:55.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:56 np0005603622 python3.9[216009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:38:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:56.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:38:56 np0005603622 python3.9[216130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845135.660565-2661-125907537172194/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:57 np0005603622 python3.9[216280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:57.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:57 np0005603622 python3.9[216401]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845136.8985558-2661-175677206676209/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:38:58 np0005603622 python3.9[216551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:38:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:38:58.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:38:58 np0005603622 python3.9[216672]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845137.9468093-2661-222724227295258/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:38:59 np0005603622 podman[216796]: 2026-01-31 07:38:59.427977889 +0000 UTC m=+0.091732266 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 02:38:59 np0005603622 python3.9[216839]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:38:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:38:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:38:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:38:59.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:00 np0005603622 python3.9[217000]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:39:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.005000132s ======
Jan 31 02:39:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:00.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000132s
Jan 31 02:39:00 np0005603622 python3.9[217152]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:01 np0005603622 python3.9[217304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:39:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:01.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:02 np0005603622 python3.9[217427]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769845141.1032758-2981-194560923784775/.source _original_basename=.rez8wp4r follow=False checksum=37ee6433834c3a03110278c8884fda665e998feb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 31 02:39:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:02.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:02 np0005603622 python3.9[217579]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:03 np0005603622 python3.9[217731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:39:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:03.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:04 np0005603622 python3.9[217852]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845143.1548896-3059-279454465651725/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:39:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:04.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:04 np0005603622 podman[217976]: 2026-01-31 07:39:04.703272785 +0000 UTC m=+0.066531448 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 02:39:04 np0005603622 python3.9[218012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 31 02:39:05 np0005603622 python3.9[218143]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769845144.3066201-3104-58160840916991/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 31 02:39:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:05.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:06 np0005603622 python3.9[218295]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 31 02:39:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:06.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:07 np0005603622 python3.9[218447]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:39:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:07.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:08 np0005603622 python3[218599]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:39:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:08.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:09.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:10.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:11.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:12.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:13.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:14.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:15.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:16.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:17.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:18.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:19.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:20.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:21.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:22.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:23.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:24 np0005603622 podman[218613]: 2026-01-31 07:39:24.164496777 +0000 UTC m=+15.535300924 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 31 02:39:24 np0005603622 podman[218706]: 2026-01-31 07:39:24.288631593 +0000 UTC m=+0.047579984 container create 77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:39:24 np0005603622 podman[218706]: 2026-01-31 07:39:24.25951103 +0000 UTC m=+0.018459441 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 31 02:39:24 np0005603622 python3[218599]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 31 02:39:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:24.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:25.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:26 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:39:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:26.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:27.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:28.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).paxos(paxos updating c 1005..1710) lease_timeout -- calling new election
Jan 31 02:39:29 np0005603622 ceph-mon[81571]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 31 02:39:29 np0005603622 ceph-mon[81571]: paxos.2).electionLogic(14) init, last seen epoch 14
Jan 31 02:39:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:29.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:30 np0005603622 podman[218769]: 2026-01-31 07:39:30.177325096 +0000 UTC m=+0.091928562 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 02:39:30 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:39:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:30.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:39:30.830 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:39:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:39:30.831 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:39:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:39:30.832 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:39:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:31.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:32.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:39:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:33.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:39:34 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:39:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:39:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:34.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:39:35 np0005603622 podman[218796]: 2026-01-31 07:39:35.148127676 +0000 UTC m=+0.071538549 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:39:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:35.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:36.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:36 np0005603622 ceph-mon[81571]: paxos.2).electionLogic(15) init, last seen epoch 15, mid-election, bumping
Jan 31 02:39:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:37.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:38 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:39:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:38.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:39 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.101:3300/0
Jan 31 02:39:39 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.101:3300/0
Jan 31 02:39:39 np0005603622 ceph-mgr[81925]: ms_deliver_dispatch: unhandled message 0x559cac8291e0 mon_map magic: 0 v1 from mon.2 v2:192.168.122.101:3300/0
Jan 31 02:39:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:39.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:40.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:41.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 31 02:39:42 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:39:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:39:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:42.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: mon.compute-0 calling monitor election
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: mon.compute-1 calling monitor election
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: mon.compute-2 calling monitor election
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: mon.compute-0 calling monitor election
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:39:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:39:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:43.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:44 np0005603622 python3.9[219075]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:44.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:45 np0005603622 python3.9[219229]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 31 02:39:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:45.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:46 np0005603622 python3.9[219381]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 31 02:39:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:46.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:47 np0005603622 python3[219533]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 31 02:39:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:47.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:47 np0005603622 podman[219572]: 2026-01-31 07:39:47.795157012 +0000 UTC m=+0.024151913 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 31 02:39:48 np0005603622 podman[219572]: 2026-01-31 07:39:48.252356351 +0000 UTC m=+0.481351212 container create 967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 02:39:48 np0005603622 python3[219533]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 31 02:39:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:48.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:39:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:39:49 np0005603622 python3.9[219811]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:49.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:50 np0005603622 python3.9[219965]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:39:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:50.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:51 np0005603622 python3.9[220116]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769845190.340672-3392-23916335995419/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 31 02:39:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:51.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:51 np0005603622 python3.9[220193]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 31 02:39:51 np0005603622 systemd[1]: Reloading.
Jan 31 02:39:51 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:39:51 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:39:52 np0005603622 python3.9[220305]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 31 02:39:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:52.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:52 np0005603622 systemd[1]: Reloading.
Jan 31 02:39:52 np0005603622 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 31 02:39:52 np0005603622 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 31 02:39:53 np0005603622 systemd[1]: Starting nova_compute container...
Jan 31 02:39:53 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:39:53 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:53 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:53 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:53 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:53 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 31 02:39:53 np0005603622 podman[220345]: 2026-01-31 07:39:53.143391978 +0000 UTC m=+0.110157410 container init 967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 02:39:53 np0005603622 podman[220345]: 2026-01-31 07:39:53.152871722 +0000 UTC m=+0.119637174 container start 967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + sudo -E kolla_set_configs
Jan 31 02:39:53 np0005603622 podman[220345]: nova_compute
Jan 31 02:39:53 np0005603622 systemd[1]: Started nova_compute container.
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Validating config file
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying service configuration files
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Deleting /etc/ceph
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Creating directory /etc/ceph
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/ceph
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Writing out command to execute
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:39:53 np0005603622 nova_compute[220361]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:39:53 np0005603622 nova_compute[220361]: ++ cat /run_command
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + CMD=nova-compute
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + ARGS=
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + sudo kolla_copy_cacerts
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + [[ ! -n '' ]]
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + . kolla_extend_start
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + echo 'Running command: '\''nova-compute'\'''
Jan 31 02:39:53 np0005603622 nova_compute[220361]: Running command: 'nova-compute'
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + umask 0022
Jan 31 02:39:53 np0005603622 nova_compute[220361]: + exec nova-compute
Jan 31 02:39:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:53.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:54 np0005603622 python3.9[220523]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:54.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:55 np0005603622 python3.9[220673]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:55 np0005603622 nova_compute[220361]: 2026-01-31 07:39:55.444 220365 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:39:55 np0005603622 nova_compute[220361]: 2026-01-31 07:39:55.444 220365 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:39:55 np0005603622 nova_compute[220361]: 2026-01-31 07:39:55.445 220365 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:39:55 np0005603622 nova_compute[220361]: 2026-01-31 07:39:55.445 220365 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 31 02:39:55 np0005603622 nova_compute[220361]: 2026-01-31 07:39:55.592 220365 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:39:55 np0005603622 nova_compute[220361]: 2026-01-31 07:39:55.602 220365 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:39:55 np0005603622 nova_compute[220361]: 2026-01-31 07:39:55.602 220365 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 02:39:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:39:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:55.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:39:56 np0005603622 nova_compute[220361]: 2026-01-31 07:39:56.549 220365 INFO nova.virt.driver [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 31 02:39:56 np0005603622 python3.9[220827]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 31 02:39:56 np0005603622 nova_compute[220361]: 2026-01-31 07:39:56.642 220365 INFO nova.compute.provider_config [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 31 02:39:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:56.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.022 220365 DEBUG oslo_concurrency.lockutils [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.023 220365 DEBUG oslo_concurrency.lockutils [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.023 220365 DEBUG oslo_concurrency.lockutils [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.024 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.024 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.024 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.024 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.025 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.025 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.025 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.025 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.025 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.026 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.026 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.026 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.026 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.026 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.027 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.027 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.027 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.027 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.027 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.028 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.028 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.028 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.028 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.028 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.029 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.029 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.029 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.029 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.029 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.030 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.030 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.030 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.030 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.031 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.031 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.031 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.031 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.031 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.032 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.032 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.032 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.032 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.032 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.033 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.033 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.033 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.033 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.033 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.034 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.034 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.034 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.034 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.035 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.035 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.035 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.035 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.035 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.036 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.036 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.036 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.036 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.036 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.037 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.037 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.037 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.037 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.037 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.038 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.038 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.038 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.038 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.038 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.039 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.039 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.039 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.039 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.039 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.039 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.039 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.040 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.040 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.040 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.040 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.040 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.040 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.041 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.042 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.042 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.042 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.042 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.042 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.042 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.043 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.043 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.043 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.043 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.043 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.043 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.044 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.045 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.045 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.045 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.045 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.045 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.045 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.045 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.046 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.047 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.047 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.047 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.047 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.047 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.047 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.047 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.048 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.049 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.049 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.049 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.049 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.049 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.049 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.049 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.050 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.050 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.050 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.050 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.050 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.051 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.051 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.051 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.051 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.051 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.051 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.052 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.052 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.052 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.052 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.052 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.052 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.052 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.053 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.053 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.053 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.053 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.053 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.053 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.053 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.054 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.054 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.054 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.054 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.054 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.055 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.055 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.055 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.055 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.055 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.056 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.056 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.056 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.056 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.056 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.056 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.057 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.057 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.057 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.057 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.057 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.057 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.058 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.058 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.058 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.058 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.058 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.059 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.059 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.059 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.059 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.059 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.059 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.059 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.060 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.060 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.060 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.060 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.060 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.060 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.060 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.061 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.061 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.061 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.061 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.061 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.061 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.061 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.062 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.062 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.062 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.062 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.062 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.062 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.063 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.063 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.063 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.063 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.063 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.063 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.063 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.064 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.064 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.064 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.064 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.064 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.064 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.064 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.065 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.066 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.066 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.066 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.066 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.066 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.066 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.066 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.067 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.067 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.067 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.067 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.067 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.067 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.067 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.068 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.068 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.068 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.068 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.068 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.068 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.068 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.069 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.070 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.070 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.070 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.070 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.070 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.070 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.070 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.071 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.071 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.071 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.071 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.071 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.071 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.072 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.072 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.072 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.072 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.072 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.072 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.072 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.073 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.073 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.073 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.073 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.073 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.073 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.073 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.074 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.074 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.074 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.074 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.074 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.074 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.074 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.075 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.075 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.075 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.075 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.075 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.075 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.075 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.076 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.076 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.076 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.076 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.076 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.076 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.076 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.077 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.077 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.077 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.077 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.077 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.077 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.077 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.078 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.078 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.078 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.078 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.078 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.078 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.078 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.079 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.080 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.080 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.080 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.080 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.080 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.080 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.081 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.081 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.081 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.081 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.081 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.081 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.082 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.083 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.083 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.083 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.083 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.083 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.083 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.083 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.084 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.084 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.084 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.084 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.084 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.084 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.084 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.085 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.085 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.085 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.085 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.085 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.086 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.086 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.086 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.086 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.086 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.086 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.087 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.087 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.087 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.087 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.087 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.088 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.088 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.088 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.088 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.088 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.089 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.089 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.089 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.089 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.089 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.089 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.090 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.090 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.090 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.090 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.090 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.091 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.091 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.091 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.091 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.091 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.091 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.092 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.092 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.092 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.092 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.093 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.093 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.093 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.093 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.093 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.093 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.094 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.094 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.094 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.094 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.094 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.095 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.095 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.095 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.095 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.095 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.095 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.096 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.096 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.096 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.096 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.096 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.096 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.097 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.097 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.097 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.097 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.097 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.097 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.098 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.098 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.098 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.098 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.098 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.099 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.099 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.099 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.099 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.099 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.100 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.100 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.100 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.100 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.100 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.101 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.101 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.101 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.101 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.101 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.102 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.102 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.102 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.102 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.102 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.103 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.103 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.103 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.103 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.103 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.104 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.104 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.104 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.104 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.104 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.105 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.105 220365 WARNING oslo_config.cfg [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 31 02:39:57 np0005603622 nova_compute[220361]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 31 02:39:57 np0005603622 nova_compute[220361]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 31 02:39:57 np0005603622 nova_compute[220361]: and ``live_migration_inbound_addr`` respectively.
Jan 31 02:39:57 np0005603622 nova_compute[220361]: ).  Its value may be silently ignored in the future.#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.105 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.105 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.106 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.106 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.106 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.106 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.106 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.107 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.107 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.107 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.107 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.107 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.107 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.108 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.108 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.108 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.108 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.108 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.109 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rbd_secret_uuid        = 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.109 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.109 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.109 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.109 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.110 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.110 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.110 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.110 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.110 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.111 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.111 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.111 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.111 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.112 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.112 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.112 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.112 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.112 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.113 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.113 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.113 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.113 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.113 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.113 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.114 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.114 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.114 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.114 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.114 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.115 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.115 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.115 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.115 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.116 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.116 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.116 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.116 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.116 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.116 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.117 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.117 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.117 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.117 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.117 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.117 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.117 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.118 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.118 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.118 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.118 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.118 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.118 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.119 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.119 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.119 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.119 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.119 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.120 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.120 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.120 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.120 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.120 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.120 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.121 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.121 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.121 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.121 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.121 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.122 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.122 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.122 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.122 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.122 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.122 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.123 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.123 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.123 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.123 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.123 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.123 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.124 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.124 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.124 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.124 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.124 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.125 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.125 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.125 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.125 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.125 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.126 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.126 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.126 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.126 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.126 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.126 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.127 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.127 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.127 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.127 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.127 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.127 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.128 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.128 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.128 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.128 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.128 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.129 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.129 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.129 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.129 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.129 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.129 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.130 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.130 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.130 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.130 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.130 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.130 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.131 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.131 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.131 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.131 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.131 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.131 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.132 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.132 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.132 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.132 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.132 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.132 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.132 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.133 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.133 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.133 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.133 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.133 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.133 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.133 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.134 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.134 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.134 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.134 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.134 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.134 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.135 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.136 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.136 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.136 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.136 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.136 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.136 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.137 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.137 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.137 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.137 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.138 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.138 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.138 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.138 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.138 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.138 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.138 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.139 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.139 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.139 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.139 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.139 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.139 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.139 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.140 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.140 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.140 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.140 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.140 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.140 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.141 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.142 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.142 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.142 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.142 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.142 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.142 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.143 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.143 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.143 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.143 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.143 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.143 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.143 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.144 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.144 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.144 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.144 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.144 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.144 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.145 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.145 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.145 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.145 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.145 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.145 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.145 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.146 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.146 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.146 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.146 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.146 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.146 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.146 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.147 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.147 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.147 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.147 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.147 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.147 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.148 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.148 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.148 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.148 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.148 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.149 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.149 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.149 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.149 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.149 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.149 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.150 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.150 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.150 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.150 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.150 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.150 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.150 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.151 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.151 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.151 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.151 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.151 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.151 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.152 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.152 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.152 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.152 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.152 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.152 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.152 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.153 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.153 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.153 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.153 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.153 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.153 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.154 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.154 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.154 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.154 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.154 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.154 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.155 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.155 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.155 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.155 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.155 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.155 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.156 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.156 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.156 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.156 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.156 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.156 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.156 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.157 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.157 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.157 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.157 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.157 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.157 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.158 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.158 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.158 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.158 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.158 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.158 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.159 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.159 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.159 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.159 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.159 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.159 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.160 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.160 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.160 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.160 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.160 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.160 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.160 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.161 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.161 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.161 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.161 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.161 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.161 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.161 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.162 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.162 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.162 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.162 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.162 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.162 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.162 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.163 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.163 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.163 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.163 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.163 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.163 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.163 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.164 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.165 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.165 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.165 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.165 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.165 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.165 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.165 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.166 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.166 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.166 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.166 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.166 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.166 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.166 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.167 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.167 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.167 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.167 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.167 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.167 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.167 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.168 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.168 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.168 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.168 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.168 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.168 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.168 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.169 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.169 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.169 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.169 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.169 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.169 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.169 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.170 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.170 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.170 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.170 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.170 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.170 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.171 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.172 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.172 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.172 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.172 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.172 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.172 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.172 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.173 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.173 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.173 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.173 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.173 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.173 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.174 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.174 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.174 220365 DEBUG oslo_service.service [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.175 220365 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.221 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.222 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.222 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.222 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 31 02:39:57 np0005603622 systemd[1]: Starting libvirt QEMU daemon...
Jan 31 02:39:57 np0005603622 systemd[1]: Started libvirt QEMU daemon.
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.301 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f84d56d36d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.303 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f84d56d36d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.304 220365 INFO nova.virt.libvirt.driver [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.437 220365 WARNING nova.virt.libvirt.driver [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 31 02:39:57 np0005603622 nova_compute[220361]: 2026-01-31 07:39:57.438 220365 DEBUG nova.virt.libvirt.volume.mount [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 31 02:39:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:39:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:57.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:39:58 np0005603622 python3.9[221039]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 31 02:39:58 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.298 220365 INFO nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Libvirt host capabilities <capabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <host>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <uuid>ff96e0bf-56de-4f3e-ba96-02bb86eece1f</uuid>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <arch>x86_64</arch>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <microcode version='16777317'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <signature family='23' model='49' stepping='0'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='x2apic'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='tsc-deadline'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='osxsave'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='hypervisor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='tsc_adjust'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='spec-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='stibp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='arch-capabilities'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='cmp_legacy'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='topoext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='virt-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='lbrv'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='tsc-scale'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='vmcb-clean'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='pause-filter'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='pfthreshold'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='rdctl-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='skip-l1dfl-vmentry'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='mds-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature name='pschange-mc-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <pages unit='KiB' size='4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <pages unit='KiB' size='2048'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <pages unit='KiB' size='1048576'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <power_management>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <suspend_mem/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </power_management>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <iommu support='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <migration_features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <live/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <uri_transports>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <uri_transport>tcp</uri_transport>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <uri_transport>rdma</uri_transport>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </uri_transports>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </migration_features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <topology>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <cells num='1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <cell id='0'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          <memory unit='KiB'>7864300</memory>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          <pages unit='KiB' size='4'>1966075</pages>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          <pages unit='KiB' size='2048'>0</pages>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          <distances>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <sibling id='0' value='10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          </distances>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          <cpus num='8'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:          </cpus>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        </cell>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </cells>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </topology>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <cache>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </cache>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <secmodel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model>selinux</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <doi>0</doi>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </secmodel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <secmodel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model>dac</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <doi>0</doi>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </secmodel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </host>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <guest>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <os_type>hvm</os_type>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <arch name='i686'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <wordsize>32</wordsize>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <domain type='qemu'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <domain type='kvm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </arch>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <pae/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <nonpae/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <acpi default='on' toggle='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <apic default='on' toggle='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <cpuselection/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <deviceboot/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <externalSnapshot/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </guest>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <guest>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <os_type>hvm</os_type>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <arch name='x86_64'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <wordsize>64</wordsize>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <domain type='qemu'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <domain type='kvm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </arch>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <acpi default='on' toggle='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <apic default='on' toggle='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <cpuselection/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <deviceboot/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <externalSnapshot/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </guest>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 
Jan 31 02:39:58 np0005603622 nova_compute[220361]: </capabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: #033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.305 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.328 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 31 02:39:58 np0005603622 nova_compute[220361]: <domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <arch>i686</arch>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <vcpu max='240'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <os supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='firmware'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>rom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pflash</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>yes</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='secure'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </loader>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </os>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>memfd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </memoryBacking>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>disk</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>floppy</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>lun</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ide</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>fdc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>sata</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </disk>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vnc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </graphics>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <video supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vga</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>none</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>bochs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </video>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='mode'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>requisite</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>optional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pci</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hostdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>random</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </rng>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>path</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>handle</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </filesystem>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emulator</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>external</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>2.0</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </tpm>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </redirdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </channel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </crypto>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>passt</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </interface>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>isa</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </panic>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <console supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>null</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dev</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pipe</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stdio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>udp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tcp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </console>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='features'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vapic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>runtime</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>synic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stimer</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reset</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ipi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>avic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hyperv>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: </domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.343 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 31 02:39:58 np0005603622 nova_compute[220361]: <domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <arch>i686</arch>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <vcpu max='4096'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <os supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='firmware'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>rom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pflash</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>yes</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='secure'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </loader>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </os>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>memfd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </memoryBacking>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>disk</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>floppy</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>lun</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>fdc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>sata</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </disk>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vnc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </graphics>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <video supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vga</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>none</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>bochs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </video>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='mode'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>requisite</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>optional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pci</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hostdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>random</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </rng>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>path</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>handle</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </filesystem>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emulator</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>external</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>2.0</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </tpm>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </redirdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </channel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </crypto>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>passt</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </interface>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>isa</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </panic>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <console supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>null</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dev</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pipe</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stdio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>udp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tcp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </console>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='features'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vapic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>runtime</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>synic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stimer</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reset</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ipi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>avic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hyperv>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: </domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.400 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.405 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 31 02:39:58 np0005603622 nova_compute[220361]: <domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <arch>x86_64</arch>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <vcpu max='240'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <os supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='firmware'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>rom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pflash</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>yes</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='secure'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </loader>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </os>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>memfd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </memoryBacking>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>disk</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>floppy</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>lun</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ide</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>fdc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>sata</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </disk>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vnc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </graphics>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <video supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vga</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>none</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>bochs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </video>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='mode'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>requisite</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>optional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pci</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hostdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>random</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </rng>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>path</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>handle</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </filesystem>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emulator</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>external</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>2.0</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </tpm>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </redirdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </channel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </crypto>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>passt</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </interface>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>isa</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </panic>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <console supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>null</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dev</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pipe</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stdio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>udp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tcp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </console>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='features'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vapic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>runtime</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>synic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stimer</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reset</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ipi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>avic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hyperv>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: </domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.473 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 31 02:39:58 np0005603622 nova_compute[220361]: <domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <domain>kvm</domain>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <arch>x86_64</arch>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <vcpu max='4096'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <iothreads supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <os supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='firmware'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>efi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <loader supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>rom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pflash</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='readonly'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>yes</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='secure'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>yes</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>no</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </loader>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </os>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='maximum' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='maximumMigratable'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>on</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>off</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='host-model' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <vendor>AMD</vendor>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='x2apic'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='stibp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='succor'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lbrv'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <mode name='custom' supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Broadwell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ddpd-u'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sha512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm3'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sm4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Cooperlake-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Denverton-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Dhyana-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amd-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='auto-ibrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibpb-brtype'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='no-nested-data-bp'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='null-sel-clr-base'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='perfmon-v2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbpb'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='stibp-always-on'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='EPYC-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-128'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-256'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx10-512'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='prefetchiti'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Haswell-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='IvyBridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='KnightsMill-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4fmaps'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-4vnniw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512er'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512pf'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fma4'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tbm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xop'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='amx-tile'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-bf16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-fp16'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bitalg'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vbmi2'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrc'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fzrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='la57'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='taa-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='tsx-ldtrk'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='SierraForest-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ifma'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-ne-convert'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx-vnni-int8'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bhi-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='bus-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cmpccxadd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fbsdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='fsrs'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ibrs-all'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='intel-psfd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ipred-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='lam'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mcdt-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pbrsb-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='psdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rrsba-ctrl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='serialize'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vaes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='vpclmulqdq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='hle'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='rtm'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512bw'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512cd'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512dq'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512f'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='avx512vl'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='invpcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pcid'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='pku'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='mpx'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v2'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v3'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='core-capability'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='split-lock-detect'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='Snowridge-v4'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='cldemote'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='erms'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='gfni'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdir64b'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='movdiri'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='xsaves'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='athlon-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='core2duo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='coreduo-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='n270-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='ss'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <blockers model='phenom-v1'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnow'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <feature name='3dnowext'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </blockers>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </mode>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <memoryBacking supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <enum name='sourceType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>anonymous</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <value>memfd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </memoryBacking>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <disk supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='diskDevice'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>disk</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cdrom</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>floppy</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>lun</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>fdc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>sata</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </disk>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <graphics supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vnc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egl-headless</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </graphics>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <video supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='modelType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vga</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>cirrus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>none</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>bochs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ramfb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </video>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hostdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='mode'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>subsystem</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='startupPolicy'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>mandatory</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>requisite</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>optional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='subsysType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pci</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>scsi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='capsType'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='pciBackend'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hostdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <rng supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtio-non-transitional</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>random</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>egd</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </rng>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <filesystem supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='driverType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>path</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>handle</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>virtiofs</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </filesystem>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tpm supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-tis</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tpm-crb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emulator</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>external</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendVersion'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>2.0</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </tpm>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <redirdev supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='bus'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>usb</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </redirdev>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <channel supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </channel>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <crypto supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendModel'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>builtin</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </crypto>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <interface supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='backendType'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>default</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>passt</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </interface>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <panic supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='model'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>isa</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>hyperv</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </panic>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <console supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='type'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>null</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vc</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pty</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dev</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>file</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>pipe</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stdio</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>udp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tcp</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>unix</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>qemu-vdagent</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>dbus</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </console>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </devices>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <gic supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <vmcoreinfo supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <genid supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backingStoreInput supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <backup supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <async-teardown supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <s390-pv supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <ps2 supported='yes'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <tdx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sev supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <sgx supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <hyperv supported='yes'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <enum name='features'>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>relaxed</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vapic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>spinlocks</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vpindex</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>runtime</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>synic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>stimer</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reset</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>vendor_id</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>frequencies</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>reenlightenment</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>tlbflush</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>ipi</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>avic</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>emsr_bitmap</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <value>xmm_input</value>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </enum>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      <defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <spinlocks>4095</spinlocks>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <stimer_direct>on</stimer_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:      </defaults>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    </hyperv>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:    <launchSecurity supported='no'/>
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  </features>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: </domainCapabilities>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.534 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.535 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.535 220365 DEBUG nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.538 220365 INFO nova.virt.libvirt.host [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Secure Boot support detected#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.540 220365 INFO nova.virt.libvirt.driver [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.540 220365 INFO nova.virt.libvirt.driver [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.551 220365 DEBUG nova.virt.libvirt.driver [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 31 02:39:58 np0005603622 nova_compute[220361]:  <model>Nehalem</model>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: </cpu>
Jan 31 02:39:58 np0005603622 nova_compute[220361]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.555 220365 DEBUG nova.virt.libvirt.driver [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.610 220365 INFO nova.virt.node [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Determined node identity f7fd90d1-7583-42ff-b709-f5fc55f6e273 from /var/lib/nova/compute_id#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.631 220365 WARNING nova.compute.manager [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Compute nodes ['f7fd90d1-7583-42ff-b709-f5fc55f6e273'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.678 220365 INFO nova.compute.manager [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 31 02:39:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:39:58.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.789 220365 WARNING nova.compute.manager [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.790 220365 DEBUG oslo_concurrency.lockutils [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.790 220365 DEBUG oslo_concurrency.lockutils [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.790 220365 DEBUG oslo_concurrency.lockutils [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.790 220365 DEBUG nova.compute.resource_tracker [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:39:58 np0005603622 nova_compute[220361]: 2026-01-31 07:39:58.791 220365 DEBUG oslo_concurrency.processutils [None req-abf12933-7055-41c1-85e4-9a4b8ce688a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:39:59 np0005603622 python3.9[221217]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 31 02:39:59 np0005603622 systemd[1]: Stopping nova_compute container...
Jan 31 02:39:59 np0005603622 nova_compute[220361]: 2026-01-31 07:39:59.492 220365 DEBUG oslo_concurrency.lockutils [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:39:59 np0005603622 nova_compute[220361]: 2026-01-31 07:39:59.492 220365 DEBUG oslo_concurrency.lockutils [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:39:59 np0005603622 nova_compute[220361]: 2026-01-31 07:39:59.493 220365 DEBUG oslo_concurrency.lockutils [None req-453d791e-a4bb-4e3e-a140-2fd98556612d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:39:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:39:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:39:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:39:59.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:39:59 np0005603622 virtqemud[220925]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 31 02:39:59 np0005603622 virtqemud[220925]: hostname: compute-1
Jan 31 02:39:59 np0005603622 virtqemud[220925]: End of file while reading data: Input/output error
Jan 31 02:39:59 np0005603622 systemd[1]: libpod-967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c.scope: Deactivated successfully.
Jan 31 02:39:59 np0005603622 systemd[1]: libpod-967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c.scope: Consumed 3.418s CPU time.
Jan 31 02:39:59 np0005603622 podman[221241]: 2026-01-31 07:39:59.931850497 +0000 UTC m=+0.500787906 container died 967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.950575) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845199950620, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1215, "num_deletes": 255, "total_data_size": 2710364, "memory_usage": 2758464, "flush_reason": "Manual Compaction"}
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845199986370, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1768003, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16121, "largest_seqno": 17331, "table_properties": {"data_size": 1762790, "index_size": 2673, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11139, "raw_average_key_size": 18, "raw_value_size": 1752061, "raw_average_value_size": 2984, "num_data_blocks": 121, "num_entries": 587, "num_filter_entries": 587, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845075, "oldest_key_time": 1769845075, "file_creation_time": 1769845199, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 35838 microseconds, and 4053 cpu microseconds.
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:39:59 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c-userdata-shm.mount: Deactivated successfully.
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.986416) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1768003 bytes OK
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.986435) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.989148) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.989171) EVENT_LOG_v1 {"time_micros": 1769845199989160, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.989190) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 2704544, prev total WAL file size 2704544, number of live WAL files 2.
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.989822) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1726KB)], [30(7868KB)]
Jan 31 02:39:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845199989909, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 9824920, "oldest_snapshot_seqno": -1}
Jan 31 02:39:59 np0005603622 systemd[1]: var-lib-containers-storage-overlay-b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea-merged.mount: Deactivated successfully.
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4251 keys, 9463771 bytes, temperature: kUnknown
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845200073034, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 9463771, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9432833, "index_size": 19180, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10693, "raw_key_size": 106010, "raw_average_key_size": 24, "raw_value_size": 9353281, "raw_average_value_size": 2200, "num_data_blocks": 801, "num_entries": 4251, "num_filter_entries": 4251, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769845199, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.073383) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 9463771 bytes
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.080795) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 118.1 rd, 113.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 7.7 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(10.9) write-amplify(5.4) OK, records in: 4779, records dropped: 528 output_compression: NoCompression
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.080826) EVENT_LOG_v1 {"time_micros": 1769845200080813, "job": 16, "event": "compaction_finished", "compaction_time_micros": 83220, "compaction_time_cpu_micros": 16157, "output_level": 6, "num_output_files": 1, "total_output_size": 9463771, "num_input_records": 4779, "num_output_records": 4251, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845200081364, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845200082509, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:39:59.989692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.082569) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.082575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.082578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.082581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:40:00.082584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:40:00 np0005603622 podman[221241]: 2026-01-31 07:40:00.086845986 +0000 UTC m=+0.655783435 container cleanup 967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 02:40:00 np0005603622 podman[221241]: nova_compute
Jan 31 02:40:00 np0005603622 podman[221271]: nova_compute
Jan 31 02:40:00 np0005603622 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 31 02:40:00 np0005603622 systemd[1]: Stopped nova_compute container.
Jan 31 02:40:00 np0005603622 systemd[1]: Starting nova_compute container...
Jan 31 02:40:00 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:40:00 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ebb8d963e24e3e2e1b8918dd3bd6186e7a039ed28cbc133d9c1a5c1ad207ea/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:00 np0005603622 podman[221284]: 2026-01-31 07:40:00.297220796 +0000 UTC m=+0.116399806 container init 967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 02:40:00 np0005603622 podman[221284]: 2026-01-31 07:40:00.304490561 +0000 UTC m=+0.123669521 container start 967a8879bdc80206f12e5ed17b13c0e0d39fb95a560aaa683841898e3081111c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + sudo -E kolla_set_configs
Jan 31 02:40:00 np0005603622 podman[221284]: nova_compute
Jan 31 02:40:00 np0005603622 systemd[1]: Started nova_compute container.
Jan 31 02:40:00 np0005603622 podman[221297]: 2026-01-31 07:40:00.361392044 +0000 UTC m=+0.112960935 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Validating config file
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying service configuration files
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /etc/ceph
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Creating directory /etc/ceph
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/ceph
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Writing out command to execute
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:00 np0005603622 nova_compute[221301]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 31 02:40:00 np0005603622 nova_compute[221301]: ++ cat /run_command
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + CMD=nova-compute
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + ARGS=
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + sudo kolla_copy_cacerts
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + [[ ! -n '' ]]
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + . kolla_extend_start
Jan 31 02:40:00 np0005603622 nova_compute[221301]: Running command: 'nova-compute'
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + echo 'Running command: '\''nova-compute'\'''
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + umask 0022
Jan 31 02:40:00 np0005603622 nova_compute[221301]: + exec nova-compute
Jan 31 02:40:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:40:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:40:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:40:00.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:40:01 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 02:40:01 np0005603622 python3.9[221489]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 31 02:40:01 np0005603622 systemd[1]: Started libpod-conmon-77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57.scope.
Jan 31 02:40:01 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:40:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f852bd708226831330628b75bb2f5be3a1f300de0eeb4654266ef4e4ed0813c6/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f852bd708226831330628b75bb2f5be3a1f300de0eeb4654266ef4e4ed0813c6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f852bd708226831330628b75bb2f5be3a1f300de0eeb4654266ef4e4ed0813c6/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 31 02:40:01 np0005603622 podman[221515]: 2026-01-31 07:40:01.399576204 +0000 UTC m=+0.130185876 container init 77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm)
Jan 31 02:40:01 np0005603622 podman[221515]: 2026-01-31 07:40:01.408142033 +0000 UTC m=+0.138751685 container start 77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 02:40:01 np0005603622 python3.9[221489]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Applying nova statedir ownership
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 31 02:40:01 np0005603622 nova_compute_init[221536]: INFO:nova_statedir:Nova statedir ownership complete
Jan 31 02:40:01 np0005603622 systemd[1]: libpod-77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57.scope: Deactivated successfully.
Jan 31 02:40:01 np0005603622 podman[221555]: 2026-01-31 07:40:01.514694015 +0000 UTC m=+0.030859227 container died 77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 02:40:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57-userdata-shm.mount: Deactivated successfully.
Jan 31 02:40:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay-f852bd708226831330628b75bb2f5be3a1f300de0eeb4654266ef4e4ed0813c6-merged.mount: Deactivated successfully.
Jan 31 02:40:01 np0005603622 podman[221555]: 2026-01-31 07:40:01.560164702 +0000 UTC m=+0.076329854 container cleanup 77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 31 02:40:01 np0005603622 systemd[1]: libpod-conmon-77b3b2a7a2800f3089e698fd712913d406ec2c88fa3437187a42d15160735b57.scope: Deactivated successfully.
Jan 31 02:40:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:40:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:40:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:40:01.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.239 221324 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.240 221324 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.240 221324 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.240 221324 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.364 221324 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.386 221324 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.387 221324 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 02:40:02 np0005603622 systemd[1]: session-49.scope: Deactivated successfully.
Jan 31 02:40:02 np0005603622 systemd[1]: session-49.scope: Consumed 1min 48.955s CPU time.
Jan 31 02:40:02 np0005603622 systemd-logind[817]: Session 49 logged out. Waiting for processes to exit.
Jan 31 02:40:02 np0005603622 systemd-logind[817]: Removed session 49.
Jan 31 02:40:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:40:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:40:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:40:02.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.865 221324 INFO nova.virt.driver [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.958 221324 INFO nova.compute.provider_config [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.969 221324 DEBUG oslo_concurrency.lockutils [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.969 221324 DEBUG oslo_concurrency.lockutils [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.970 221324 DEBUG oslo_concurrency.lockutils [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.970 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.970 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.970 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.970 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.971 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.971 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.971 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.971 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.971 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.971 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.971 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.972 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.972 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.972 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.972 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.972 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.972 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.972 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.973 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.973 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.973 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.973 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.973 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.973 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.973 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.974 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.974 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.974 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.974 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.974 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.974 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.975 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.975 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.975 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.975 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.975 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.975 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.975 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.976 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.976 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.976 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.976 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.976 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.977 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.977 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.977 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.977 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.977 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.977 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.978 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.978 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.978 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.978 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.978 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.978 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.978 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.979 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.980 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.980 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.980 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.980 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.980 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.980 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.981 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.981 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.981 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.981 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.981 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.981 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.981 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.982 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.982 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.982 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.982 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.982 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.982 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.982 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.983 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.983 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.983 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.983 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.983 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.983 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.984 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.984 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.984 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.984 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.985 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.985 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.985 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.985 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.985 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.985 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.986 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.986 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.986 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.986 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.986 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.986 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.986 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.987 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.987 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.987 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.987 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.987 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.988 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.988 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.988 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.988 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.988 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.988 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.989 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.989 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.989 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.989 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.989 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.990 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.990 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.990 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.990 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.990 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.990 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.991 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.991 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.991 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.991 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.992 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.992 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.992 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.992 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.992 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.993 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.993 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.993 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.993 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.993 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.993 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.994 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.994 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.994 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.994 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.994 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.994 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.995 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.995 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.995 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.995 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.995 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.995 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.995 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.996 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.996 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.996 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.996 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.996 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.996 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.996 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.997 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.997 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.997 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.997 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.997 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.997 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.997 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.998 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.998 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.998 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.998 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.998 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.998 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.999 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.999 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.999 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.999 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.999 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:02 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.999 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:02.999 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.000 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.001 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.001 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.001 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.001 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.001 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.001 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.001 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.002 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.002 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.002 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.002 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.002 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.002 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.003 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.003 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.003 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.003 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.003 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.003 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.004 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.004 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.004 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.004 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.004 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.004 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.005 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.005 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.005 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.005 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.005 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.005 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.006 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.006 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.006 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.006 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.006 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.006 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.006 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.007 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.007 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.007 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.007 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.007 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.007 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.008 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.008 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.008 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.008 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.008 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.008 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.008 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.009 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.009 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.009 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.009 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.009 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.009 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.009 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.010 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.010 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.010 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.010 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.010 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.010 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.010 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.011 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.011 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.011 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.011 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.011 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.011 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.011 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.012 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.012 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.012 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.012 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.012 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.012 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.013 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.014 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.014 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.014 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.014 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.014 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.014 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.015 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.016 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.016 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.016 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.016 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.016 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.016 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.016 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.017 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.017 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.017 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.017 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.017 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.017 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.017 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.018 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.018 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.018 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.018 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.018 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.018 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.018 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.019 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.019 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.019 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.019 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.019 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.019 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.019 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.020 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.020 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.020 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.020 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.020 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.020 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.021 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.021 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.021 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.021 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.021 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.021 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.021 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.022 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.022 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.022 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.022 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.022 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.023 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.023 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.023 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.023 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.023 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.023 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.023 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.024 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.025 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.025 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.025 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.025 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.025 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.026 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.026 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.026 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.026 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.026 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.026 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.026 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.027 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.027 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.027 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.027 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.027 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.027 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.027 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.028 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.029 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.029 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.029 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.029 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.029 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.029 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.029 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.030 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.030 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.030 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.030 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.030 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.030 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.030 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.031 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.031 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.031 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.031 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.031 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.031 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.031 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.032 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.033 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.033 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.033 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.033 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.033 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.033 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.033 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.034 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.035 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.035 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.035 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.035 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.035 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.035 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.035 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.036 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.036 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.036 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.036 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.036 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.036 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.036 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.037 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.037 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.037 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.037 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.037 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.037 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.037 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.038 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.038 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.038 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.038 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.038 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.038 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.038 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.039 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.039 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.039 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.039 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.039 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.039 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.039 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.040 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.041 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.041 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.041 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.041 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.041 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.041 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.041 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.042 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.042 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.042 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.042 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.042 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.042 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.042 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.043 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.043 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.043 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.043 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.043 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.043 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.043 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.044 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.044 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.044 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.044 221324 WARNING oslo_config.cfg [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 31 02:40:03 np0005603622 nova_compute[221301]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 31 02:40:03 np0005603622 nova_compute[221301]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 31 02:40:03 np0005603622 nova_compute[221301]: and ``live_migration_inbound_addr`` respectively.
Jan 31 02:40:03 np0005603622 nova_compute[221301]: ).  Its value may be silently ignored in the future.#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.044 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.045 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.045 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.045 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.045 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.045 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.045 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.046 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.046 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.046 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.046 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.046 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.046 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.047 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.047 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.047 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.047 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.047 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.047 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rbd_secret_uuid        = 2f5ab832-5f2e-5a84-bd93-cf8bab960ee2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.048 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.048 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.048 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.048 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.048 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.048 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.048 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.049 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.049 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.049 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.049 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.049 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.049 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.049 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.050 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.050 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.050 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.050 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.050 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.050 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.050 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.051 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.052 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.052 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.052 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.052 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.052 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.052 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.052 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.053 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.053 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.053 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.053 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.053 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.053 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.053 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.054 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.055 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.055 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.055 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.055 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.055 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.055 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.055 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.056 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.056 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.056 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.056 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.056 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.056 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.056 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.057 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.057 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.057 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.057 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.057 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.057 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.057 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.058 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.058 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.058 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.058 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.058 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.058 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.058 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.059 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.060 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.060 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.060 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.060 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.060 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.060 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.060 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.061 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.061 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.061 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.061 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.061 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.061 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.061 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.062 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.063 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.063 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.063 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.063 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.063 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.063 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.063 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.064 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.064 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.064 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.064 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.064 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.065 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.065 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.065 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.065 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.065 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.065 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.065 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.066 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.066 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.066 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.066 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.066 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.066 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.066 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.067 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.067 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.067 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.067 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.067 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.067 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.067 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.068 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.069 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.069 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.069 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.069 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.069 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.069 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.069 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.070 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.070 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.070 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.070 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.070 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.070 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.070 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.071 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.071 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.071 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.071 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.071 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.071 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.071 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.072 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.072 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.072 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.072 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.072 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.072 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.072 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.073 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.073 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.073 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.073 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.073 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.073 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.073 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.074 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.074 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.074 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.074 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.074 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.074 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.074 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.075 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.076 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.076 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.076 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.076 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.076 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.076 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.076 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.077 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.077 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.077 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.077 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.077 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.077 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.078 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.078 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.078 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.078 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.078 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.078 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.078 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.079 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.079 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.079 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.079 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.079 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.079 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.079 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.080 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.080 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.080 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.080 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.080 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.080 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.081 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.081 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.081 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.081 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.081 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.081 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.082 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.082 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.082 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.082 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.082 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.082 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.082 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.083 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.083 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.083 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.083 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.083 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.083 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.084 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.084 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.084 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.084 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.084 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.084 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.084 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.085 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.085 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.085 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.085 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.085 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.085 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.086 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.086 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.086 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.086 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.086 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.086 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.087 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.087 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.087 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.087 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.087 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.088 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.088 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.088 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.088 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.088 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.088 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.088 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.089 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.089 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.089 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.089 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.089 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.089 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.090 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.090 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.090 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.090 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.090 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.090 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.091 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.091 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.091 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.091 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.091 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.091 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.091 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.092 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.092 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.092 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.092 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.092 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.093 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.093 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.093 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.093 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.093 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.093 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.094 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.094 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.094 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.094 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.094 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.094 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.095 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.095 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.095 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.095 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.095 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.095 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.096 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.096 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.096 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.096 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.096 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.096 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.097 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.097 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.097 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.097 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.097 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.097 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.097 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.098 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.099 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.099 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.099 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.099 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.099 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.099 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.099 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.100 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.101 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.101 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.101 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.101 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.101 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.101 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.101 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.102 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.102 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.102 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.102 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.102 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.102 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.102 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.103 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.103 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.103 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.103 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.103 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.104 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.104 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.104 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.104 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.104 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.104 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.104 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.105 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.105 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.105 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.105 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.105 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.105 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.106 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.106 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.106 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.106 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.106 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.107 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.107 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.107 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.107 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.107 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.107 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.107 221324 DEBUG oslo_service.service [None req-75d8f316-5f5a-4a30-8070-38d1a764c214 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.109 221324 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.132 221324 INFO nova.virt.node [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Determined node identity f7fd90d1-7583-42ff-b709-f5fc55f6e273 from /var/lib/nova/compute_id#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.133 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.134 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.134 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.134 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.145 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f306a9fcd00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.149 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f306a9fcd00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.150 221324 INFO nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.155 221324 INFO nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Libvirt host capabilities <capabilities>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <host>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <uuid>ff96e0bf-56de-4f3e-ba96-02bb86eece1f</uuid>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <cpu>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <arch>x86_64</arch>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model>EPYC-Rome-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <microcode version='16777317'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <signature family='23' model='49' stepping='0'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='x2apic'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='tsc-deadline'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='osxsave'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='hypervisor'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='tsc_adjust'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='spec-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='stibp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='arch-capabilities'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='cmp_legacy'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='topoext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='virt-ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='lbrv'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='tsc-scale'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='vmcb-clean'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='pause-filter'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='pfthreshold'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='rdctl-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='skip-l1dfl-vmentry'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='mds-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature name='pschange-mc-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <pages unit='KiB' size='4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <pages unit='KiB' size='2048'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <pages unit='KiB' size='1048576'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </cpu>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <power_management>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <suspend_mem/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </power_management>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <iommu support='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <migration_features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <live/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <uri_transports>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <uri_transport>tcp</uri_transport>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <uri_transport>rdma</uri_transport>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </uri_transports>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </migration_features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <topology>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <cells num='1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <cell id='0'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          <memory unit='KiB'>7864300</memory>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          <pages unit='KiB' size='4'>1966075</pages>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          <pages unit='KiB' size='2048'>0</pages>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          <distances>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <sibling id='0' value='10'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          </distances>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          <cpus num='8'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:          </cpus>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        </cell>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </cells>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </topology>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <cache>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </cache>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <secmodel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model>selinux</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <doi>0</doi>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </secmodel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <secmodel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model>dac</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <doi>0</doi>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </secmodel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </host>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <guest>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <os_type>hvm</os_type>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <arch name='i686'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <wordsize>32</wordsize>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <domain type='qemu'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <domain type='kvm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </arch>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <pae/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <nonpae/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <acpi default='on' toggle='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <apic default='on' toggle='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <cpuselection/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <deviceboot/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <externalSnapshot/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </guest>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <guest>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <os_type>hvm</os_type>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <arch name='x86_64'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <wordsize>64</wordsize>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <domain type='qemu'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <domain type='kvm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </arch>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <acpi default='on' toggle='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <apic default='on' toggle='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <cpuselection/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <deviceboot/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <disksnapshot default='on' toggle='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <externalSnapshot/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </guest>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 
Jan 31 02:40:03 np0005603622 nova_compute[221301]: </capabilities>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.162 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.167 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 31 02:40:03 np0005603622 nova_compute[221301]: <domainCapabilities>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <domain>kvm</domain>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <arch>i686</arch>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <vcpu max='4096'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <iothreads supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <os supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <enum name='firmware'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <loader supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>rom</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pflash</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='readonly'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>yes</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>no</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='secure'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>no</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </loader>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <cpu>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>on</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>off</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='maximum' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='maximumMigratable'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>on</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>off</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='host-model' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='x2apic'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='stibp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='succor'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='lbrv'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='custom' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='ClearwaterForest'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Dhyana-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Turin'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='KnightsMill'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='KnightsMill-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='athlon'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='athlon-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='core2duo'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='core2duo-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='coreduo'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='coreduo-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='n270'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='n270-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='phenom'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='phenom-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <memoryBacking supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <enum name='sourceType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>file</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>anonymous</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>memfd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </memoryBacking>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <disk supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='diskDevice'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>disk</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>cdrom</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>floppy</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>lun</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='bus'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>fdc</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>scsi</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>usb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>sata</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <graphics supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vnc</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>egl-headless</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>dbus</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <video supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='modelType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vga</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>cirrus</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>none</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>bochs</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>ramfb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <hostdev supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='mode'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>subsystem</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='startupPolicy'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>default</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>mandatory</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>requisite</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>optional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='subsysType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>usb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pci</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>scsi</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='capsType'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='pciBackend'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </hostdev>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <rng supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>random</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>egd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>builtin</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <filesystem supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='driverType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>path</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>handle</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtiofs</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </filesystem>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <tpm supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tpm-tis</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tpm-crb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>emulator</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>external</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendVersion'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>2.0</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </tpm>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <redirdev supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='bus'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>usb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </redirdev>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <channel supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pty</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>unix</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </channel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <crypto supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>qemu</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>builtin</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </crypto>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <interface supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>default</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>passt</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <panic supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>isa</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>hyperv</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </panic>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <console supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>null</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vc</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pty</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>dev</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>file</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pipe</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>stdio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>udp</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tcp</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>unix</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>qemu-vdagent</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>dbus</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </console>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <gic supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <vmcoreinfo supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <genid supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <backingStoreInput supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <backup supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <async-teardown supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <s390-pv supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <ps2 supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <tdx supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <sev supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <sgx supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <hyperv supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='features'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>relaxed</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vapic</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>spinlocks</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vpindex</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>runtime</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>synic</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>stimer</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>reset</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vendor_id</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>frequencies</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>reenlightenment</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tlbflush</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>ipi</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>avic</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>emsr_bitmap</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>xmm_input</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <defaults>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <spinlocks>4095</spinlocks>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <stimer_direct>on</stimer_direct>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </defaults>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </hyperv>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <launchSecurity supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: </domainCapabilities>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.173 221324 DEBUG nova.virt.libvirt.volume.mount [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.176 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 31 02:40:03 np0005603622 nova_compute[221301]: <domainCapabilities>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <domain>kvm</domain>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <arch>i686</arch>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <vcpu max='240'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <iothreads supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <os supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <enum name='firmware'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <loader supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>rom</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pflash</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='readonly'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>yes</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>no</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='secure'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>no</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </loader>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <cpu>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>on</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>off</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='maximum' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='maximumMigratable'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>on</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>off</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='host-model' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='x2apic'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='stibp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='succor'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='lbrv'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='custom' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='ClearwaterForest'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Dhyana-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Turin'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Turin-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibpb-brtype'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbpb'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='srso-user-kernel-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='GraniteRapids-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-128'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-256'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx10-512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Haswell-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v6'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Icelake-Server-v7'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='IvyBridge-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='KnightsMill'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='KnightsMill-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4fmaps'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-4vnniw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512er'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512pf'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G4-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Opteron_G5-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fma4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tbm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xop'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SapphireRapids-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amx-tile'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-fp16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrc'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fzrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='tsx-ldtrk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='SierraForest-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Client-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Skylake-Server-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='core-capability'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='split-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Snowridge-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='athlon'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='athlon-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='core2duo'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='core2duo-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='coreduo'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='coreduo-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='n270'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='n270-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='phenom'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='phenom-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnow'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='3dnowext'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <memoryBacking supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <enum name='sourceType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>file</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>anonymous</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>memfd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </memoryBacking>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <disk supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='diskDevice'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>disk</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>cdrom</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>floppy</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>lun</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='bus'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>ide</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>fdc</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>scsi</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>usb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>sata</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <graphics supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vnc</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>egl-headless</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>dbus</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <video supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='modelType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vga</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>cirrus</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>none</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>bochs</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>ramfb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <hostdev supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='mode'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>subsystem</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='startupPolicy'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>default</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>mandatory</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>requisite</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>optional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='subsysType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>usb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pci</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>scsi</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='capsType'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='pciBackend'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </hostdev>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <rng supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtio-non-transitional</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>random</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>egd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>builtin</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <filesystem supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='driverType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>path</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>handle</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>virtiofs</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </filesystem>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <tpm supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tpm-tis</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tpm-crb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>emulator</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>external</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendVersion'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>2.0</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </tpm>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <redirdev supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='bus'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>usb</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </redirdev>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <channel supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pty</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>unix</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </channel>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <crypto supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>qemu</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendModel'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>builtin</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </crypto>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <interface supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='backendType'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>default</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>passt</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <panic supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='model'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>isa</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>hyperv</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </panic>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <console supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>null</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vc</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pty</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>dev</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>file</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pipe</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>stdio</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>udp</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tcp</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>unix</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>qemu-vdagent</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>dbus</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </console>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <gic supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <vmcoreinfo supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <genid supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <backingStoreInput supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <backup supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <async-teardown supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <s390-pv supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <ps2 supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <tdx supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <sev supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <sgx supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <hyperv supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='features'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>relaxed</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vapic</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>spinlocks</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vpindex</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>runtime</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>synic</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>stimer</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>reset</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>vendor_id</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>frequencies</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>reenlightenment</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>tlbflush</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>ipi</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>avic</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>emsr_bitmap</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>xmm_input</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <defaults>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <spinlocks>4095</spinlocks>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <stimer_direct>on</stimer_direct>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <tlbflush_direct>on</tlbflush_direct>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <tlbflush_extended>on</tlbflush_extended>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </defaults>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </hyperv>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <launchSecurity supported='no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: </domainCapabilities>
Jan 31 02:40:03 np0005603622 nova_compute[221301]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.232 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 31 02:40:03 np0005603622 nova_compute[221301]: 2026-01-31 07:40:03.240 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 31 02:40:03 np0005603622 nova_compute[221301]: <domainCapabilities>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <path>/usr/libexec/qemu-kvm</path>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <domain>kvm</domain>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <arch>x86_64</arch>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <vcpu max='4096'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <iothreads supported='yes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <os supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <enum name='firmware'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>efi</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <loader supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='type'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>rom</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>pflash</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='readonly'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>yes</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>no</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='secure'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>yes</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>no</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </loader>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:  <cpu>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='host-passthrough' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='hostPassthroughMigratable'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>on</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>off</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='maximum' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <enum name='maximumMigratable'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>on</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <value>off</value>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </enum>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='host-model' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <vendor>AMD</vendor>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='x2apic'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc-deadline'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='hypervisor'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc_adjust'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='spec-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='stibp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='cmp_legacy'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='overflow-recov'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='succor'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='amd-ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='virt-ssbd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='lbrv'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='tsc-scale'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='vmcb-clean'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='flushbyasid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='pause-filter'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='pfthreshold'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='svme-addr-chk'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <feature policy='disable' name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    </mode>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:    <mode name='custom' supported='yes'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Broadwell-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v4'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cascadelake-Server-v5'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='ClearwaterForest'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='ClearwaterForest-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-ne-convert'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni-int8'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bhi-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='bus-lock-detect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cldemote'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='cmpccxadd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ddpd-u'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fbsdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='intel-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ipred-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='lam'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mcdt-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdir64b'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='movdiri'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pbrsb-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='prefetchiti'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='psdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rrsba-ctrl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sbdr-ssdp-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='serialize'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sha512'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm3'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='sm4'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ss'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Cooperlake-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='hle'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='ibrs-all'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='rtm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='taa-no'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='mpx'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Denverton-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='Dhyana-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Genoa-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vl'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fs-gs-base-ns'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='gfni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='la57'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='perfmon-v2'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Milan-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='erms'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='fsrm'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='invpcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='no-nested-data-bp'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='null-sel-clr-base'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pcid'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='pku'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='stibp-always-on'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vaes'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='vpclmulqdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v1'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v2'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Rome-v3'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='xsaves'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      </blockers>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:      <blockers model='EPYC-Turin'>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='amd-psfd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='auto-ibrs'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx-vnni'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-bf16'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vp2intersect'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512-vpopcntdq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bitalg'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512bw'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512cd'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512dq'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512f'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512ifma'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi'/>
Jan 31 02:40:03 np0005603622 nova_compute[221301]:        <feature name='avx512vbmi2'/>
Jan 31 02:41:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:21.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:22 np0005603622 rsyslogd[1003]: imjournal: 3922 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 31 02:41:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:22.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:23.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:24.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:25.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:26.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:27.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:28.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:41:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:29.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:41:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:41:30.833 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:41:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:41:30.833 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:41:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:41:30.834 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:41:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:30.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:31.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:32.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:33 np0005603622 podman[222030]: 2026-01-31 07:41:33.242028014 +0000 UTC m=+0.152339438 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 02:41:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:41:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:33.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:41:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:34.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:35.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:41:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:36.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:41:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:37.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:41:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:38.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:41:39 np0005603622 podman[222056]: 2026-01-31 07:41:39.160404312 +0000 UTC m=+0.088705816 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 02:41:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:39.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:40.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:41.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:41:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:42.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:41:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:43.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:41:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:44.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:41:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:45.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:46.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:47.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:48.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:49.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:50.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:51.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:52.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:41:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:53.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:41:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:41:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:54.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:41:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:55.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:56.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:41:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:41:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:41:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:57.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:41:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:41:58.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:41:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:41:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:41:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:41:59.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:00.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:42:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 6173 writes, 25K keys, 6173 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 6173 writes, 1096 syncs, 5.63 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 429 writes, 694 keys, 429 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s#012Interval WAL: 429 writes, 191 syncs, 2.25 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Jan 31 02:42:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:02.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:42:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:42:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:03.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.133 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.150 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.151 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.151 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.165 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.166 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.166 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.167 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.167 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.168 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.168 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.169 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:04 np0005603622 podman[222376]: 2026-01-31 07:42:04.174675721 +0000 UTC m=+0.102035063 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.213 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.213 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.213 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.214 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.214 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:42:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:42:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/916684519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.627 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.767 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.769 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5304MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.769 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.769 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.838 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.839 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:42:04 np0005603622 nova_compute[221301]: 2026-01-31 07:42:04.860 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:42:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:04.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:42:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/766741625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:42:05 np0005603622 nova_compute[221301]: 2026-01-31 07:42:05.267 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:42:05 np0005603622 nova_compute[221301]: 2026-01-31 07:42:05.275 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:42:05 np0005603622 nova_compute[221301]: 2026-01-31 07:42:05.304 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:42:05 np0005603622 nova_compute[221301]: 2026-01-31 07:42:05.307 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:42:05 np0005603622 nova_compute[221301]: 2026-01-31 07:42:05.307 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:42:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:42:05.446 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:42:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:42:05.448 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:42:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:42:05.449 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:42:05 np0005603622 nova_compute[221301]: 2026-01-31 07:42:05.537 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:05 np0005603622 nova_compute[221301]: 2026-01-31 07:42:05.538 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:42:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:42:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:06.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:42:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:07.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:08.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:42:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:09.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:42:10 np0005603622 podman[222446]: 2026-01-31 07:42:10.157460556 +0000 UTC m=+0.074287668 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 02:42:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:10.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:12.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:12.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:14.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:42:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2554818493' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:42:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:42:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2554818493' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:42:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:14.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:42:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:16.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:42:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:16.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:18.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:18.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:20.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:20.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:22.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:22.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:24.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:24.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:26.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:26.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:42:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3493 writes, 18K keys, 3493 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s#012Cumulative WAL: 3493 writes, 3493 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1331 writes, 6480 keys, 1331 commit groups, 1.0 writes per commit group, ingest: 14.37 MB, 0.02 MB/s#012Interval WAL: 1331 writes, 1331 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     67.4      0.30              0.05         9    0.034       0      0       0.0       0.0#012  L6      1/0    7.41 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.2    109.9     91.9      0.72              0.19         8    0.090     35K   4282       0.0       0.0#012 Sum      1/0    7.41 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.2     77.4     84.6      1.03              0.24        17    0.060     35K   4282       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.3     57.7     57.1      0.72              0.12         8    0.090     19K   2502       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    109.9     91.9      0.72              0.19         8    0.090     35K   4282       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     67.8      0.30              0.05         8    0.038       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.020, interval 0.006#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.07 MB/s write, 0.08 GB read, 0.07 MB/s read, 1.0 seconds#012Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 4.63 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 6.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(252,4.32 MB,1.41985%) FilterBlock(17,106.98 KB,0.0343674%) IndexBlock(17,211.83 KB,0.0680472%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 02:42:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:28.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:28.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:30.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:42:30.834 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:42:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:42:30.835 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:42:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:42:30.835 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:42:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:30.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:32.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:32.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:34.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:34.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:35 np0005603622 podman[222466]: 2026-01-31 07:42:35.17952681 +0000 UTC m=+0.104682702 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:42:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:36.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:36.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:38.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:38.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:40.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:40.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:41 np0005603622 podman[222492]: 2026-01-31 07:42:41.140676284 +0000 UTC m=+0.064816647 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 02:42:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:42.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:42.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:44.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:44.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:46.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:46.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:48.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:48.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:50.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:50.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:52.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:52.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:42:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:54.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:42:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:54.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:56.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:57.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:42:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:42:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:42:58.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:42:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:42:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:42:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:42:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:42:59.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:00.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:43:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:01.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:43:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:02.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:02 np0005603622 nova_compute[221301]: 2026-01-31 07:43:02.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:02 np0005603622 nova_compute[221301]: 2026-01-31 07:43:02.698 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:43:02 np0005603622 nova_compute[221301]: 2026-01-31 07:43:02.699 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:43:02 np0005603622 nova_compute[221301]: 2026-01-31 07:43:02.699 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:43:02 np0005603622 nova_compute[221301]: 2026-01-31 07:43:02.699 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:43:02 np0005603622 nova_compute[221301]: 2026-01-31 07:43:02.700 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:43:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:03.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:43:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/840896816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:43:03 np0005603622 nova_compute[221301]: 2026-01-31 07:43:03.133 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:43:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:03 np0005603622 nova_compute[221301]: 2026-01-31 07:43:03.260 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:43:03 np0005603622 nova_compute[221301]: 2026-01-31 07:43:03.261 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5319MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:43:03 np0005603622 nova_compute[221301]: 2026-01-31 07:43:03.262 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:43:03 np0005603622 nova_compute[221301]: 2026-01-31 07:43:03.262 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:43:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:04.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:04 np0005603622 nova_compute[221301]: 2026-01-31 07:43:04.116 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:43:04 np0005603622 nova_compute[221301]: 2026-01-31 07:43:04.117 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:43:04 np0005603622 nova_compute[221301]: 2026-01-31 07:43:04.226 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:43:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:05.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:06.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:06 np0005603622 podman[222678]: 2026-01-31 07:43:06.185433238 +0000 UTC m=+0.115706244 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:43:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:43:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/526084406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:43:06 np0005603622 nova_compute[221301]: 2026-01-31 07:43:06.732 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:43:06 np0005603622 nova_compute[221301]: 2026-01-31 07:43:06.739 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:43:06 np0005603622 nova_compute[221301]: 2026-01-31 07:43:06.840 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:43:06 np0005603622 nova_compute[221301]: 2026-01-31 07:43:06.842 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:43:06 np0005603622 nova_compute[221301]: 2026-01-31 07:43:06.842 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:43:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:07.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:43:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:43:07 np0005603622 nova_compute[221301]: 2026-01-31 07:43:07.842 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:07 np0005603622 nova_compute[221301]: 2026-01-31 07:43:07.842 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:07 np0005603622 nova_compute[221301]: 2026-01-31 07:43:07.843 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:43:07 np0005603622 nova_compute[221301]: 2026-01-31 07:43:07.843 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.060 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.060 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.061 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.062 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.062 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.062 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.063 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:43:08 np0005603622 nova_compute[221301]: 2026-01-31 07:43:08.063 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:43:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:08.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:09.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:43:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:43:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:43:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:10.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:11.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:12.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:12 np0005603622 podman[222715]: 2026-01-31 07:43:12.131507212 +0000 UTC m=+0.058784597 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 02:43:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:13.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:14.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:15.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:16.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:43:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:17.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:43:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:18.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:18 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:43:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:19.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).paxos(paxos updating c 1256..1939) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 3.141343594s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 02:43:19 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-1[81564]: 2026-01-31T07:43:19.719+0000 7f65ea1de640 -1 mon.compute-1@2(peon).paxos(paxos updating c 1256..1939) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 3.141343594s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 02:43:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:20.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:43:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:21.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:22.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:23.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:23 np0005603622 nova_compute[221301]: 2026-01-31 07:43:23.074 221324 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 1.12 sec#033[00m
Jan 31 02:43:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:25.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:43:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:26.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:43:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:27.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:28.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:29.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:30.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:43:30.835 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:43:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:43:30.836 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:43:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:43:30.836 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:43:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:31.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:32.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:33.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:34.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - - [31/Jan/2026:07:43:34.134 +0000] "GET /swift/info HTTP/1.1" 200 509 - "python-urllib3/1.26.5" - latency=0.000000000s
Jan 31 02:43:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:35.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:36.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:37.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:37 np0005603622 podman[222786]: 2026-01-31 07:43:37.148956415 +0000 UTC m=+0.077956994 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 31 02:43:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:38.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:39.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:40.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:41.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 31 02:43:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:43:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:42.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:43:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:43.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:43 np0005603622 podman[222812]: 2026-01-31 07:43:43.106730117 +0000 UTC m=+0.037927986 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 31 02:43:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 31 02:43:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 31 02:43:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:44.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:45.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:46.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:43:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:47.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:43:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 31 02:43:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:48.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:49.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:43:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:50.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:43:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 31 02:43:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:51.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:52.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:43:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:53.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:43:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:54.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:43:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:55.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:56.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:57.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:43:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:43:58.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:43:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:43:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:43:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:43:59.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:43:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:00.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:01.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:02.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:03.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:04.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.415 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.416 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.416 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.454 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.455 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.455 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.456 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.456 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.495 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.496 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.496 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.496 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.497 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3446588789' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:04 np0005603622 nova_compute[221301]: 2026-01-31 07:44:04.963 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:05.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.142 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.143 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5339MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.144 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.144 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.307 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.308 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.337 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2016038652' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.723 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.386s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.729 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.770 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.773 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:44:05 np0005603622 nova_compute[221301]: 2026-01-31 07:44:05.774 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:06.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:06 np0005603622 nova_compute[221301]: 2026-01-31 07:44:06.714 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:06 np0005603622 nova_compute[221301]: 2026-01-31 07:44:06.715 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:06 np0005603622 nova_compute[221301]: 2026-01-31 07:44:06.716 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:07.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:07 np0005603622 nova_compute[221301]: 2026-01-31 07:44:07.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:08.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:08 np0005603622 podman[222876]: 2026-01-31 07:44:08.203873979 +0000 UTC m=+0.125436662 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 02:44:08 np0005603622 nova_compute[221301]: 2026-01-31 07:44:08.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:44:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:44:08.734 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:44:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:44:08.736 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:44:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:09.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:10.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:11.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:12.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.584 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "bc2609cd-2682-4f66-91db-7b152e03fc32" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.585 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "bc2609cd-2682-4f66-91db-7b152e03fc32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.615 221324 DEBUG nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.768 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.769 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.775 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.776 221324 INFO nova.compute.claims [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:44:12 np0005603622 nova_compute[221301]: 2026-01-31 07:44:12.898 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:13.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:13 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3319030497' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.357 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.363 221324 DEBUG nova.compute.provider_tree [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.376 221324 DEBUG nova.scheduler.client.report [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.407 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.408 221324 DEBUG nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.484 221324 DEBUG nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.505 221324 INFO nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.526 221324 DEBUG nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.641 221324 DEBUG nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.644 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.644 221324 INFO nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Creating image(s)#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.683 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.720 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.761 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.766 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:13 np0005603622 nova_compute[221301]: 2026-01-31 07:44:13.767 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:14 np0005603622 podman[222978]: 2026-01-31 07:44:14.157440023 +0000 UTC m=+0.086256709 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 02:44:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:14.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:44:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4285731726' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:44:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:44:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4285731726' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:44:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:14 np0005603622 nova_compute[221301]: 2026-01-31 07:44:14.932 221324 DEBUG nova.virt.libvirt.imagebackend [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.104073) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455104165, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2407, "num_deletes": 251, "total_data_size": 6166781, "memory_usage": 6238832, "flush_reason": "Manual Compaction"}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 31 02:44:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:15.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455136145, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 4037929, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17762, "largest_seqno": 20164, "table_properties": {"data_size": 4028036, "index_size": 6323, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19870, "raw_average_key_size": 20, "raw_value_size": 4008304, "raw_average_value_size": 4094, "num_data_blocks": 281, "num_entries": 979, "num_filter_entries": 979, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845218, "oldest_key_time": 1769845218, "file_creation_time": 1769845455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 32125 microseconds, and 10674 cpu microseconds.
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.136206) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 4037929 bytes OK
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.136230) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.139929) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.139951) EVENT_LOG_v1 {"time_micros": 1769845455139944, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.139973) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 6156305, prev total WAL file size 6156305, number of live WAL files 2.
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.141200) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3943KB)], [36(7587KB)]
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455141262, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11807234, "oldest_snapshot_seqno": -1}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4513 keys, 9756130 bytes, temperature: kUnknown
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455213392, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 9756130, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9723382, "index_size": 20350, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11333, "raw_key_size": 112753, "raw_average_key_size": 24, "raw_value_size": 9639001, "raw_average_value_size": 2135, "num_data_blocks": 844, "num_entries": 4513, "num_filter_entries": 4513, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769845455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.214007) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 9756130 bytes
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.215577) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.5 rd, 135.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 7.4 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(5.3) write-amplify(2.4) OK, records in: 5036, records dropped: 523 output_compression: NoCompression
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.215607) EVENT_LOG_v1 {"time_micros": 1769845455215592, "job": 20, "event": "compaction_finished", "compaction_time_micros": 72197, "compaction_time_cpu_micros": 31002, "output_level": 6, "num_output_files": 1, "total_output_size": 9756130, "num_input_records": 5036, "num_output_records": 4513, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455216748, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845455218487, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.141105) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.218614) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.218629) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.218637) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.218648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:44:15.218652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:44:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:16.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:16 np0005603622 nova_compute[221301]: 2026-01-31 07:44:16.901 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:16 np0005603622 nova_compute[221301]: 2026-01-31 07:44:16.954 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:16 np0005603622 nova_compute[221301]: 2026-01-31 07:44:16.956 221324 DEBUG nova.virt.images [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 31 02:44:16 np0005603622 nova_compute[221301]: 2026-01-31 07:44:16.958 221324 DEBUG nova.privsep.utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 02:44:16 np0005603622 nova_compute[221301]: 2026-01-31 07:44:16.959 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:17.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:17 np0005603622 nova_compute[221301]: 2026-01-31 07:44:17.164 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.part /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:17 np0005603622 nova_compute[221301]: 2026-01-31 07:44:17.168 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:17 np0005603622 nova_compute[221301]: 2026-01-31 07:44:17.241 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6.converted --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:17 np0005603622 nova_compute[221301]: 2026-01-31 07:44:17.243 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:17 np0005603622 nova_compute[221301]: 2026-01-31 07:44:17.274 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:17 np0005603622 nova_compute[221301]: 2026-01-31 07:44:17.279 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bc2609cd-2682-4f66-91db-7b152e03fc32_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 31 02:44:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:18.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:44:18.739 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:44:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:19.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.425 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 bc2609cd-2682-4f66-91db-7b152e03fc32_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.503 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] resizing rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.652 221324 DEBUG nova.objects.instance [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'migration_context' on Instance uuid bc2609cd-2682-4f66-91db-7b152e03fc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.666 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.666 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Ensure instance console log exists: /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.667 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.667 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.668 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.670 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.674 221324 WARNING nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.680 221324 DEBUG nova.virt.libvirt.host [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.680 221324 DEBUG nova.virt.libvirt.host [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.697 221324 DEBUG nova.virt.libvirt.host [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.698 221324 DEBUG nova.virt.libvirt.host [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.699 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.700 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.700 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.700 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.701 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.701 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.701 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.701 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.702 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.702 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.703 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.703 221324 DEBUG nova.virt.hardware [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.707 221324 DEBUG nova.privsep.utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 02:44:19 np0005603622 nova_compute[221301]: 2026-01-31 07:44:19.707 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:44:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2518385861' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:44:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:20.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.193 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.235 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.242 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:44:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/118359385' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.684 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.688 221324 DEBUG nova.objects.instance [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'pci_devices' on Instance uuid bc2609cd-2682-4f66-91db-7b152e03fc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.711 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <uuid>bc2609cd-2682-4f66-91db-7b152e03fc32</uuid>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <name>instance-00000001</name>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <nova:name>tempest-AutoAllocateNetworkTest-server-289774986</nova:name>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:44:19</nova:creationTime>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <nova:user uuid="0eb58e8663574849b17616075ce5c43e">tempest-AutoAllocateNetworkTest-2135409609-project-member</nova:user>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <nova:project uuid="dc2f6584d8b64364b13683f53c58617f">tempest-AutoAllocateNetworkTest-2135409609</nova:project>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <entry name="serial">bc2609cd-2682-4f66-91db-7b152e03fc32</entry>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <entry name="uuid">bc2609cd-2682-4f66-91db-7b152e03fc32</entry>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/bc2609cd-2682-4f66-91db-7b152e03fc32_disk">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/bc2609cd-2682-4f66-91db-7b152e03fc32_disk.config">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/console.log" append="off"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:44:20 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:44:20 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:44:20 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:44:20 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.778 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.780 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.781 221324 INFO nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Using config drive#033[00m
Jan 31 02:44:20 np0005603622 nova_compute[221301]: 2026-01-31 07:44:20.819 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:21.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:44:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:44:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:44:21 np0005603622 nova_compute[221301]: 2026-01-31 07:44:21.566 221324 INFO nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Creating config drive at /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/disk.config#033[00m
Jan 31 02:44:21 np0005603622 nova_compute[221301]: 2026-01-31 07:44:21.573 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpez0zatv2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:21 np0005603622 nova_compute[221301]: 2026-01-31 07:44:21.710 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpez0zatv2" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:21 np0005603622 nova_compute[221301]: 2026-01-31 07:44:21.747 221324 DEBUG nova.storage.rbd_utils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image bc2609cd-2682-4f66-91db-7b152e03fc32_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:21 np0005603622 nova_compute[221301]: 2026-01-31 07:44:21.752 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/disk.config bc2609cd-2682-4f66-91db-7b152e03fc32_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:21 np0005603622 nova_compute[221301]: 2026-01-31 07:44:21.968 221324 DEBUG oslo_concurrency.processutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/disk.config bc2609cd-2682-4f66-91db-7b152e03fc32_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.216s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:21 np0005603622 nova_compute[221301]: 2026-01-31 07:44:21.970 221324 INFO nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Deleting local config drive /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32/disk.config because it was imported into RBD.#033[00m
Jan 31 02:44:22 np0005603622 systemd[1]: Starting libvirt secret daemon...
Jan 31 02:44:22 np0005603622 systemd[1]: Started libvirt secret daemon.
Jan 31 02:44:22 np0005603622 systemd-machined[190406]: New machine qemu-1-instance-00000001.
Jan 31 02:44:22 np0005603622 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 31 02:44:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:22.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.650 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845462.6493697, bc2609cd-2682-4f66-91db-7b152e03fc32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.651 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] VM Resumed (Lifecycle Event)
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.654 221324 DEBUG nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.655 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.661 221324 INFO nova.virt.libvirt.driver [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Instance spawned successfully.
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.661 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.711 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.714 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.715 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.715 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.716 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.716 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.717 221324 DEBUG nova.virt.libvirt.driver [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.720 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.763 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.764 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845462.6507614, bc2609cd-2682-4f66-91db-7b152e03fc32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.764 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] VM Started (Lifecycle Event)
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.815 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.819 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.824 221324 INFO nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Took 9.18 seconds to spawn the instance on the hypervisor.
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.825 221324 DEBUG nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.837 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.892 221324 INFO nova.compute.manager [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Took 10.17 seconds to build instance.
Jan 31 02:44:22 np0005603622 nova_compute[221301]: 2026-01-31 07:44:22.914 221324 DEBUG oslo_concurrency.lockutils [None req-2b14b8e0-5fbc-4458-8096-606d2aabbcb1 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "bc2609cd-2682-4f66-91db-7b152e03fc32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:44:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:23.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:24.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:25.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:26.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:44:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:44:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:27.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:28.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:29.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.124 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.126 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.171 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:44:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:30.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.324 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.326 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.335 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.336 221324 INFO nova.compute.claims [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Claim successful on node compute-1.ctlplane.example.com
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.506 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:44:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:44:30.836 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:44:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:44:30.837 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:44:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:44:30.837 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:44:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/88307960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.990 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:44:30 np0005603622 nova_compute[221301]: 2026-01-31 07:44:30.998 221324 DEBUG nova.compute.provider_tree [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.083 221324 ERROR nova.scheduler.client.report [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [req-0da944cf-3d5e-4542-a492-7cb7f7fd0142] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID f7fd90d1-7583-42ff-b709-f5fc55f6e273.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-0da944cf-3d5e-4542-a492-7cb7f7fd0142"}]}
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.124 221324 DEBUG nova.scheduler.client.report [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 31 02:44:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:31.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.205 221324 DEBUG nova.scheduler.client.report [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.206 221324 DEBUG nova.compute.provider_tree [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.234 221324 DEBUG nova.scheduler.client.report [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.266 221324 DEBUG nova.scheduler.client.report [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.375 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:44:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:44:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2012512258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.821 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:44:31 np0005603622 nova_compute[221301]: 2026-01-31 07:44:31.826 221324 DEBUG nova.compute.provider_tree [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.162 221324 DEBUG nova.scheduler.client.report [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Updated inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with generation 8 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.165 221324 DEBUG nova.compute.provider_tree [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Updating resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 generation from 8 to 9 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.166 221324 DEBUG nova.compute.provider_tree [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 02:44:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:32.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.235 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.236 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.333 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.334 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.381 221324 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.525 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.739 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.742 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.743 221324 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Creating image(s)
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.783 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.819 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.864 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.868 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.927 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.928 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.929 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.929 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.956 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:44:32 np0005603622 nova_compute[221301]: 2026-01-31 07:44:32.960 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:33.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.360 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.454 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] resizing rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.600 221324 DEBUG nova.objects.instance [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'migration_context' on Instance uuid 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.666 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.668 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Ensure instance console log exists: /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.669 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.670 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.670 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:44:33 np0005603622 nova_compute[221301]: 2026-01-31 07:44:33.908 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Automatically allocating a network for project dc2f6584d8b64364b13683f53c58617f. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Jan 31 02:44:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:34.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:35 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 31 02:44:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:44:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:35.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:44:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:36.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:37.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:38.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:39.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:39 np0005603622 podman[223712]: 2026-01-31 07:44:39.186886093 +0000 UTC m=+0.104825866 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:44:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:40.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:41.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:42.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:43.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:44.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 31 02:44:45 np0005603622 podman[223738]: 2026-01-31 07:44:45.167870453 +0000 UTC m=+0.090089622 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 31 02:44:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:45.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 31 02:44:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:46.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:47.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:48.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:44:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:49.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:44:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:50.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:51.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:52.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:53.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:53 np0005603622 nova_compute[221301]: 2026-01-31 07:44:53.394 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Automatically allocated network: {'id': '992dcec1-3019-47a1-a14c-defd99a80f3d', 'name': 'auto_allocated_network', 'tenant_id': 'dc2f6584d8b64364b13683f53c58617f', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['84a7580e-2eb6-465f-aef5-8fe041bdab03', 'c97780ea-8676-4982-b5b9-dfb934b09fd9'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-31T07:44:32Z', 'updated_at': '2026-01-31T07:44:43Z', 'revision_number': 4, 'project_id': 'dc2f6584d8b64364b13683f53c58617f'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Jan 31 02:44:53 np0005603622 nova_compute[221301]: 2026-01-31 07:44:53.415 221324 WARNING oslo_policy.policy [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 31 02:44:53 np0005603622 nova_compute[221301]: 2026-01-31 07:44:53.416 221324 WARNING oslo_policy.policy [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 31 02:44:53 np0005603622 nova_compute[221301]: 2026-01-31 07:44:53.420 221324 DEBUG nova.policy [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0eb58e8663574849b17616075ce5c43e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dc2f6584d8b64364b13683f53c58617f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 02:44:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:54.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:54 np0005603622 nova_compute[221301]: 2026-01-31 07:44:54.439 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Successfully created port: 631c8d69-c396-429f-a74e-ba51262add3a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 02:44:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:44:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:55.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.386 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Successfully updated port: 631c8d69-c396-429f-a74e-ba51262add3a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.403 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "refresh_cache-3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.404 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquired lock "refresh_cache-3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.404 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:44:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.651 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.878 221324 DEBUG nova.compute.manager [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received event network-changed-631c8d69-c396-429f-a74e-ba51262add3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.878 221324 DEBUG nova.compute.manager [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Refreshing instance network info cache due to event network-changed-631c8d69-c396-429f-a74e-ba51262add3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 02:44:55 np0005603622 nova_compute[221301]: 2026-01-31 07:44:55.878 221324 DEBUG oslo_concurrency.lockutils [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:44:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:56.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:57.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.663 221324 DEBUG nova.network.neutron [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Updating instance_info_cache with network_info: [{"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.685 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Releasing lock "refresh_cache-3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.685 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Instance network_info: |[{"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.686 221324 DEBUG oslo_concurrency.lockutils [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.686 221324 DEBUG nova.network.neutron [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Refreshing network info cache for port 631c8d69-c396-429f-a74e-ba51262add3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.690 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Start _get_guest_xml network_info=[{"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.697 221324 WARNING nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.705 221324 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.706 221324 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.711 221324 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.711 221324 DEBUG nova.virt.libvirt.host [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.713 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.713 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.714 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.714 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.715 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.715 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.715 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.715 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.716 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.716 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.716 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.717 221324 DEBUG nova.virt.hardware [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:44:57 np0005603622 nova_compute[221301]: 2026-01-31 07:44:57.720 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:44:58 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/492145327' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:44:58 np0005603622 nova_compute[221301]: 2026-01-31 07:44:58.156 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:44:58 np0005603622 nova_compute[221301]: 2026-01-31 07:44:58.194 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:44:58 np0005603622 nova_compute[221301]: 2026-01-31 07:44:58.199 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:44:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:44:58.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:44:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:44:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:44:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:44:59.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.112 221324 DEBUG nova.network.neutron [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Updated VIF entry in instance network info cache for port 631c8d69-c396-429f-a74e-ba51262add3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.113 221324 DEBUG nova.network.neutron [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Updating instance_info_cache with network_info: [{"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.131 221324 DEBUG oslo_concurrency.lockutils [req-4aa2c4aa-a9be-4469-b0d9-1b49ffc436e3 req-91899993-d113-44a5-9689-50f3151ae574 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:00.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:45:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/21208082' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.507 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.308s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.508 221324 DEBUG nova.virt.libvirt.vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1079069239-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1079069239-3',id=4,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc2f6584d8b64364b13683f53c58617f',ramdisk_id='',reservation_id='r-jd4rofhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-2135409609',owner_user_name='tempest-AutoAllocateNetworkTest-2135409609-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:44:32Z,user_data=None,user_id='0eb58e8663574849b17616075ce5c43e',uuid=3cbcda3b-911f-4b3b-8fb1-ce0bad66b087,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.508 221324 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converting VIF {"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.509 221324 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:3d,bridge_name='br-int',has_traffic_filtering=True,id=631c8d69-c396-429f-a74e-ba51262add3a,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap631c8d69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.511 221324 DEBUG nova.objects.instance [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'pci_devices' on Instance uuid 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.527 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <uuid>3cbcda3b-911f-4b3b-8fb1-ce0bad66b087</uuid>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <name>instance-00000004</name>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <nova:name>tempest-tempest.common.compute-instance-1079069239-3</nova:name>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:44:57</nova:creationTime>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:user uuid="0eb58e8663574849b17616075ce5c43e">tempest-AutoAllocateNetworkTest-2135409609-project-member</nova:user>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:project uuid="dc2f6584d8b64364b13683f53c58617f">tempest-AutoAllocateNetworkTest-2135409609</nova:project>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <nova:port uuid="631c8d69-c396-429f-a74e-ba51262add3a">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.1.0.94" ipVersion="4"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="fdfe:381f:8400::9b" ipVersion="6"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <entry name="serial">3cbcda3b-911f-4b3b-8fb1-ce0bad66b087</entry>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <entry name="uuid">3cbcda3b-911f-4b3b-8fb1-ce0bad66b087</entry>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk.config">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:51:a5:3d"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <target dev="tap631c8d69-c3"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/console.log" append="off"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:45:00 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:45:00 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:45:00 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:45:00 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.528 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Preparing to wait for external event network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.528 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.528 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.529 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.529 221324 DEBUG nova.virt.libvirt.vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1079069239-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1079069239-3',id=4,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc2f6584d8b64364b13683f53c58617f',ramdisk_id='',reservation_id='r-jd4rofhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-2135409609',owner_user_name='tempest-AutoAllocateNetworkTest-2135409609-project-
member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:44:32Z,user_data=None,user_id='0eb58e8663574849b17616075ce5c43e',uuid=3cbcda3b-911f-4b3b-8fb1-ce0bad66b087,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.530 221324 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converting VIF {"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.530 221324 DEBUG nova.network.os_vif_util [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:3d,bridge_name='br-int',has_traffic_filtering=True,id=631c8d69-c396-429f-a74e-ba51262add3a,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap631c8d69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.531 221324 DEBUG os_vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:3d,bridge_name='br-int',has_traffic_filtering=True,id=631c8d69-c396-429f-a74e-ba51262add3a,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap631c8d69-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.572 221324 DEBUG ovsdbapp.backend.ovs_idl [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.572 221324 DEBUG ovsdbapp.backend.ovs_idl [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.572 221324 DEBUG ovsdbapp.backend.ovs_idl [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.573 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.574 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.574 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.575 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.576 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.593 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.593 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.594 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:45:00 np0005603622 nova_compute[221301]: 2026-01-31 07:45:00.595 221324 INFO oslo.privsep.daemon [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpob0mfae0/privsep.sock']#033[00m
Jan 31 02:45:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:01.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.348 221324 INFO oslo.privsep.daemon [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.155 223824 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.163 223824 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.165 223824 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.166 223824 INFO oslo.privsep.daemon [-] privsep daemon running as pid 223824#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.453 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.756 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap631c8d69-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.756 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap631c8d69-c3, col_values=(('external_ids', {'iface-id': '631c8d69-c396-429f-a74e-ba51262add3a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:a5:3d', 'vm-uuid': '3cbcda3b-911f-4b3b-8fb1-ce0bad66b087'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.760 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:45:01 np0005603622 NetworkManager[49080]: <info>  [1769845501.7643] manager: (tap631c8d69-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.773 221324 INFO os_vif [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:3d,bridge_name='br-int',has_traffic_filtering=True,id=631c8d69-c396-429f-a74e-ba51262add3a,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap631c8d69-c3')#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.844 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.845 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.845 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] No VIF found with MAC fa:16:3e:51:a5:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.845 221324 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Using config drive#033[00m
Jan 31 02:45:01 np0005603622 nova_compute[221301]: 2026-01-31 07:45:01.869 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:02.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.400 221324 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Creating config drive at /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/disk.config#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.403 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsdru10ug execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.431 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.432 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.432 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.445 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.523 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsdru10ug" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.556 221324 DEBUG nova.storage.rbd_utils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] rbd image 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:02 np0005603622 nova_compute[221301]: 2026-01-31 07:45:02.561 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/disk.config 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:03.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.035 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.036 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.054 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.144 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.145 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.155 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.155 221324 INFO nova.compute.claims [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Claim successful on node compute-1.ctlplane.example.com
Jan 31 02:45:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:04.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.321 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.557 221324 DEBUG oslo_concurrency.processutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/disk.config 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.997s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.558 221324 INFO nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Deleting local config drive /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087/disk.config because it was imported into RBD.
Jan 31 02:45:04 np0005603622 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 31 02:45:04 np0005603622 kernel: tap631c8d69-c3: entered promiscuous mode
Jan 31 02:45:04 np0005603622 NetworkManager[49080]: <info>  [1769845504.6141] manager: (tap631c8d69-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 31 02:45:04 np0005603622 systemd-udevd[223922]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:45:04 np0005603622 NetworkManager[49080]: <info>  [1769845504.6545] device (tap631c8d69-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:45:04 np0005603622 NetworkManager[49080]: <info>  [1769845504.6552] device (tap631c8d69-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:04 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:04Z|00027|binding|INFO|Claiming lport 631c8d69-c396-429f-a74e-ba51262add3a for this chassis.
Jan 31 02:45:04 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:04Z|00028|binding|INFO|631c8d69-c396-429f-a74e-ba51262add3a: Claiming fa:16:3e:51:a5:3d 10.1.0.94 fdfe:381f:8400::9b
Jan 31 02:45:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:04.681 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a5:3d 10.1.0.94 fdfe:381f:8400::9b'], port_security=['fa:16:3e:51:a5:3d 10.1.0.94 fdfe:381f:8400::9b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.94/26 fdfe:381f:8400::9b/64', 'neutron:device_id': '3cbcda3b-911f-4b3b-8fb1-ce0bad66b087', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-992dcec1-3019-47a1-a14c-defd99a80f3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc2f6584d8b64364b13683f53c58617f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f48a740a-df16-488d-83ce-01edcece1d5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d5eabbe-dd4d-4e48-a2ac-b48c29338142, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=631c8d69-c396-429f-a74e-ba51262add3a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:45:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:04.682 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 631c8d69-c396-429f-a74e-ba51262add3a in datapath 992dcec1-3019-47a1-a14c-defd99a80f3d bound to our chassis
Jan 31 02:45:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:04.684 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 992dcec1-3019-47a1-a14c-defd99a80f3d
Jan 31 02:45:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:04.687 139793 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmph059cndw/privsep.sock']
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.710 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:04 np0005603622 systemd-machined[190406]: New machine qemu-2-instance-00000004.
Jan 31 02:45:04 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:04Z|00029|binding|INFO|Setting lport 631c8d69-c396-429f-a74e-ba51262add3a ovn-installed in OVS
Jan 31 02:45:04 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:04Z|00030|binding|INFO|Setting lport 631c8d69-c396-429f-a74e-ba51262add3a up in Southbound
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.719 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:04 np0005603622 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 31 02:45:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/856683639' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.783 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.791 221324 DEBUG nova.compute.provider_tree [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.805 221324 DEBUG nova.scheduler.client.report [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.835 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.835 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.885 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.886 221324 DEBUG nova.network.neutron [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.905 221324 INFO nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 02:45:04 np0005603622 nova_compute[221301]: 2026-01-31 07:45:04.920 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.003 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.004 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.004 221324 INFO nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Creating image(s)
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.033 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.079 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.154 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.159 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:45:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:05.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.209 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.210 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.211 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.211 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.277 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.281 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:45:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:05.298 139793 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 31 02:45:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:05.299 139793 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmph059cndw/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 31 02:45:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:05.173 223997 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 31 02:45:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:05.178 223997 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 31 02:45:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:05.182 223997 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 31 02:45:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:05.182 223997 INFO oslo.privsep.daemon [-] privsep daemon running as pid 223997
Jan 31 02:45:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:05.301 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39a685ec-5ff7-4b64-9e7d-1430405c6509]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.365 221324 DEBUG nova.network.neutron [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.366 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.453 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.454 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:05 np0005603622 nova_compute[221301]: 2026-01-31 07:45:05.455 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:06.181 223997 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:06.181 223997 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:06.181 223997 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:06.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:06 np0005603622 nova_compute[221301]: 2026-01-31 07:45:06.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:45:06 np0005603622 nova_compute[221301]: 2026-01-31 07:45:06.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 02:45:06 np0005603622 nova_compute[221301]: 2026-01-31 07:45:06.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 02:45:06 np0005603622 nova_compute[221301]: 2026-01-31 07:45:06.431 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 31 02:45:06 np0005603622 nova_compute[221301]: 2026-01-31 07:45:06.431 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 31 02:45:06 np0005603622 nova_compute[221301]: 2026-01-31 07:45:06.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:06 np0005603622 nova_compute[221301]: 2026-01-31 07:45:06.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.010 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845507.0100048, 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.010 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] VM Started (Lifecycle Event)
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.026 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.030 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845507.0101273, 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.030 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] VM Paused (Lifecycle Event)
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.043 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.048 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.066 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.116 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.835s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.167 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-bc2609cd-2682-4f66-91db-7b152e03fc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.167 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-bc2609cd-2682-4f66-91db-7b152e03fc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.167 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.167 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bc2609cd-2682-4f66-91db-7b152e03fc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:45:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:07.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:07 np0005603622 nova_compute[221301]: 2026-01-31 07:45:07.209 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] resizing rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.001 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2447d331-ceec-491d-9d20-896adf581fa5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.002 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap992dcec1-31 in ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.004 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap992dcec1-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.005 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[54b29bcd-b7d3-406c-8d22-5d8a41ac8aa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.011 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f25332d1-b3a7-40f8-a88c-56694ffcbc11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.033 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[db9ec3a0-7615-4c83-89db-f4035881900d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.047 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0d9065-31f8-490c-9075-87d9b8077fdd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.049 139793 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmplronvzw3/privsep.sock']
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.154 221324 DEBUG nova.objects.instance [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lazy-loading 'migration_context' on Instance uuid 35cd3508-d6a0-4daf-8bd1-af07b852cabf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.158 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.183 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.183 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Ensure instance console log exists: /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.184 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.184 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.185 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.187 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.192 221324 WARNING nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.199 221324 DEBUG nova.virt.libvirt.host [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.200 221324 DEBUG nova.virt.libvirt.host [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.206 221324 DEBUG nova.virt.libvirt.host [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.206 221324 DEBUG nova.virt.libvirt.host [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.208 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.208 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.209 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.209 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.209 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.210 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.210 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.210 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.211 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.211 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.211 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.212 221324 DEBUG nova.virt.hardware [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.216 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:08.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.280 221324 DEBUG nova.compute.manager [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received event network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.280 221324 DEBUG oslo_concurrency.lockutils [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.282 221324 DEBUG oslo_concurrency.lockutils [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.282 221324 DEBUG oslo_concurrency.lockutils [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.282 221324 DEBUG nova.compute.manager [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Processing event network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.282 221324 DEBUG nova.compute.manager [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received event network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.283 221324 DEBUG oslo_concurrency.lockutils [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.283 221324 DEBUG oslo_concurrency.lockutils [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.283 221324 DEBUG oslo_concurrency.lockutils [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.283 221324 DEBUG nova.compute.manager [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] No waiting events found dispatching network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.284 221324 WARNING nova.compute.manager [req-f21c2ebe-dfe5-4f41-b3d5-e5e1f5defe64 req-222014df-2773-4df3-9014-36b6f0520ef3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received unexpected event network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a for instance with vm_state building and task_state spawning.#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.284 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.287 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845508.287675, 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.288 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.290 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.312 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.319 221324 INFO nova.virt.libvirt.driver [-] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Instance spawned successfully.#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.320 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.321 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.344 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.349 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.349 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.350 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.350 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.351 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.351 221324 DEBUG nova.virt.libvirt.driver [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.403 221324 INFO nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Took 35.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.404 221324 DEBUG nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.462 221324 INFO nova.compute.manager [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Took 38.22 seconds to build instance.#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.477 221324 DEBUG oslo_concurrency.lockutils [None req-36ed46a6-e1c2-4d7b-8a82-f14719996b90 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 38.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:45:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3054798852' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.664 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.707 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.715 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.735 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.755 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-bc2609cd-2682-4f66-91db-7b152e03fc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.755 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.756 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.756 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.756 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.757 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.760 139793 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.761 139793 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplronvzw3/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.607 224184 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.611 224184 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.613 224184 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.613 224184 INFO oslo.privsep.daemon [-] privsep daemon running as pid 224184#033[00m
Jan 31 02:45:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:08.767 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2924f6b2-587f-4079-888f-e57f9b7b2061]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.775 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.776 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.776 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.776 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:45:08 np0005603622 nova_compute[221301]: 2026-01-31 07:45:08.776 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:45:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1025414350' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.169 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.172 221324 DEBUG nova.objects.instance [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lazy-loading 'pci_devices' on Instance uuid 35cd3508-d6a0-4daf-8bd1-af07b852cabf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.198 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <uuid>35cd3508-d6a0-4daf-8bd1-af07b852cabf</uuid>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <name>instance-00000008</name>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <nova:name>tempest-LiveMigrationNegativeTest-server-2023908642</nova:name>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:45:08</nova:creationTime>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <nova:user uuid="5dac6d92165448b3a1c60bea57f8e48d">tempest-LiveMigrationNegativeTest-402716808-project-member</nova:user>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <nova:project uuid="5047d468cca049c2891d27def49df57f">tempest-LiveMigrationNegativeTest-402716808</nova:project>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <entry name="serial">35cd3508-d6a0-4daf-8bd1-af07b852cabf</entry>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <entry name="uuid">35cd3508-d6a0-4daf-8bd1-af07b852cabf</entry>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk.config">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/console.log" append="off"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:45:09 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:45:09 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:45:09 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:45:09 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:45:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:09.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3488977813' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.277 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.277 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.277 221324 INFO nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Using config drive#033[00m
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.282 224184 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.282 224184 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.282 224184 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.305 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.311 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:09 np0005603622 podman[224251]: 2026-01-31 07:45:09.325727352 +0000 UTC m=+0.086196098 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.389 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.389 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.392 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.392 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.395 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.395 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.511 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.512 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4832MB free_disk=20.778430938720703GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.512 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.513 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.601 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance bc2609cd-2682-4f66-91db-7b152e03fc32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.601 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.602 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 35cd3508-d6a0-4daf-8bd1-af07b852cabf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.602 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.602 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.703 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.875 221324 INFO nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Creating config drive at /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/disk.config#033[00m
Jan 31 02:45:09 np0005603622 nova_compute[221301]: 2026-01-31 07:45:09.882 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgx4ynfvb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.904 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c5fdf2-beaa-4a59-8d8a-6311862abba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:09 np0005603622 NetworkManager[49080]: <info>  [1769845509.9240] manager: (tap992dcec1-30): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.922 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4b173411-47b3-4f7c-bc9b-75fd216ad17d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:09 np0005603622 systemd-udevd[224329]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.949 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0d59d742-90af-4a53-b762-60c24d741618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.956 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[62d694c0-49b1-4e51-a2b1-2c39f105ea44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:09 np0005603622 NetworkManager[49080]: <info>  [1769845509.9895] device (tap992dcec1-30): carrier: link connected
Jan 31 02:45:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:09.996 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bdaf0952-64e1-4663-b790-58bd0b18fadc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.008 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgx4ynfvb" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.016 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d788646a-4a76-4391-abfc-dd64af058b3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap992dcec1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:60:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476132, 'reachable_time': 40000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224347, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.032 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3c7ff3-53cc-4282-bd38-b04e885dbc3a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:6054'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 476132, 'tstamp': 476132}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224348, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.050 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[363f69ed-fab2-4112-a54c-e10c29866333]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap992dcec1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:60:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476132, 'reachable_time': 40000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224363, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.059 221324 DEBUG nova.storage.rbd_utils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] rbd image 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.064 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/disk.config 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.086 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[35567fc6-cfc8-41a7-afd0-e49d01bcdcdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.140 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b32b7621-fb87-4cb8-8beb-0681530f21a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.143 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap992dcec1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.144 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.144 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap992dcec1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.146 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:10 np0005603622 kernel: tap992dcec1-30: entered promiscuous mode
Jan 31 02:45:10 np0005603622 NetworkManager[49080]: <info>  [1769845510.1475] manager: (tap992dcec1-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.152 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap992dcec1-30, col_values=(('external_ids', {'iface-id': '1443ed6f-926c-4e3e-8e41-280e0ddc0f64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:10Z|00031|binding|INFO|Releasing lport 1443ed6f-926c-4e3e-8e41-280e0ddc0f64 from this chassis (sb_readonly=0)
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.162 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.163 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/992dcec1-3019-47a1-a14c-defd99a80f3d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/992dcec1-3019-47a1-a14c-defd99a80f3d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.164 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cb743404-3ff1-47ee-beae-4c47cf3d5095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.167 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-992dcec1-3019-47a1-a14c-defd99a80f3d
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/992dcec1-3019-47a1-a14c-defd99a80f3d.pid.haproxy
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 992dcec1-3019-47a1-a14c-defd99a80f3d
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:45:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/87932381' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.169 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'env', 'PROCESS_TAG=haproxy-992dcec1-3019-47a1-a14c-defd99a80f3d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/992dcec1-3019-47a1-a14c-defd99a80f3d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.185 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.194 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.224 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:45:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:10.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.264 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.268 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:45:10 np0005603622 nova_compute[221301]: 2026-01-31 07:45:10.268 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.755s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.266 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:45:10 np0005603622 podman[224418]: 2026-01-31 07:45:10.595230652 +0000 UTC m=+0.067054685 container create 9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 02:45:10 np0005603622 podman[224418]: 2026-01-31 07:45:10.553822494 +0000 UTC m=+0.025646487 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:45:10 np0005603622 systemd[1]: Started libpod-conmon-9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a.scope.
Jan 31 02:45:10 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:45:10 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0dd7588e7ab2d553395aea35eaaa22178f1eb86326a557ce29df9d54017eff36/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:45:10 np0005603622 podman[224418]: 2026-01-31 07:45:10.715468879 +0000 UTC m=+0.187292912 container init 9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 02:45:10 np0005603622 podman[224418]: 2026-01-31 07:45:10.722687852 +0000 UTC m=+0.194511865 container start 9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 02:45:10 np0005603622 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[224434]: [NOTICE]   (224438) : New worker (224443) forked
Jan 31 02:45:10 np0005603622 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[224434]: [NOTICE]   (224438) : Loading success.
Jan 31 02:45:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:10.791 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:45:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:11.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:11 np0005603622 nova_compute[221301]: 2026-01-31 07:45:11.331 221324 DEBUG oslo_concurrency.processutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/disk.config 35cd3508-d6a0-4daf-8bd1-af07b852cabf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:11 np0005603622 nova_compute[221301]: 2026-01-31 07:45:11.332 221324 INFO nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Deleting local config drive /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf/disk.config because it was imported into RBD.#033[00m
Jan 31 02:45:11 np0005603622 systemd-machined[190406]: New machine qemu-3-instance-00000008.
Jan 31 02:45:11 np0005603622 systemd[1]: Started Virtual Machine qemu-3-instance-00000008.
Jan 31 02:45:11 np0005603622 nova_compute[221301]: 2026-01-31 07:45:11.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:11 np0005603622 nova_compute[221301]: 2026-01-31 07:45:11.760 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:11 np0005603622 nova_compute[221301]: 2026-01-31 07:45:11.907 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:11 np0005603622 nova_compute[221301]: 2026-01-31 07:45:11.908 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:45:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:12.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.482 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845512.482097, 35cd3508-d6a0-4daf-8bd1-af07b852cabf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.483 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] VM Resumed (Lifecycle Event)
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.487 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.488 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.498 221324 INFO nova.virt.libvirt.driver [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Instance spawned successfully.
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.499 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.512 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.521 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.529 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.530 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.530 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.532 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.532 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.533 221324 DEBUG nova.virt.libvirt.driver [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.544 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.544 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845512.4835165, 35cd3508-d6a0-4daf-8bd1-af07b852cabf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.544 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] VM Started (Lifecycle Event)
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.567 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.571 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.587 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.603 221324 INFO nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Took 7.60 seconds to spawn the instance on the hypervisor.
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.603 221324 DEBUG nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.659 221324 INFO nova.compute.manager [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Took 8.55 seconds to build instance.
Jan 31 02:45:12 np0005603622 nova_compute[221301]: 2026-01-31 07:45:12.679 221324 DEBUG oslo_concurrency.lockutils [None req-5f56f9e2-dff1-40de-a128-61f4ed5542c3 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:13.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:14.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:14 np0005603622 nova_compute[221301]: 2026-01-31 07:45:14.646 221324 DEBUG nova.objects.instance [None req-02ace33a-b433-4c99-96e8-3d7c03e25320 6da31ddd1eaa43c682822d7c0d227b11 478ac60a644d4037bc645c3ea159d0db - - default default] Lazy-loading 'pci_devices' on Instance uuid 35cd3508-d6a0-4daf-8bd1-af07b852cabf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:45:14 np0005603622 nova_compute[221301]: 2026-01-31 07:45:14.666 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845514.6660779, 35cd3508-d6a0-4daf-8bd1-af07b852cabf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:45:14 np0005603622 nova_compute[221301]: 2026-01-31 07:45:14.666 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] VM Paused (Lifecycle Event)
Jan 31 02:45:14 np0005603622 nova_compute[221301]: 2026-01-31 07:45:14.684 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:14 np0005603622 nova_compute[221301]: 2026-01-31 07:45:14.690 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:45:14 np0005603622 nova_compute[221301]: 2026-01-31 07:45:14.711 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 31 02:45:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:15.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:15 np0005603622 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 31 02:45:15 np0005603622 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Consumed 3.293s CPU time.
Jan 31 02:45:15 np0005603622 systemd-machined[190406]: Machine qemu-3-instance-00000008 terminated.
Jan 31 02:45:15 np0005603622 podman[224512]: 2026-01-31 07:45:15.39632444 +0000 UTC m=+0.060827889 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 02:45:15 np0005603622 nova_compute[221301]: 2026-01-31 07:45:15.436 221324 DEBUG nova.compute.manager [None req-02ace33a-b433-4c99-96e8-3d7c03e25320 6da31ddd1eaa43c682822d7c0d227b11 478ac60a644d4037bc645c3ea159d0db - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:45:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:16.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:16 np0005603622 nova_compute[221301]: 2026-01-31 07:45:16.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:16 np0005603622 nova_compute[221301]: 2026-01-31 07:45:16.762 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:16.794 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:45:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:17.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:18.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.529 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.530 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.530 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.531 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.531 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.533 221324 INFO nova.compute.manager [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Terminating instance
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.534 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "refresh_cache-35cd3508-d6a0-4daf-8bd1-af07b852cabf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.534 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquired lock "refresh_cache-35cd3508-d6a0-4daf-8bd1-af07b852cabf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.534 221324 DEBUG nova.network.neutron [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:45:18 np0005603622 nova_compute[221301]: 2026-01-31 07:45:18.703 221324 DEBUG nova.network.neutron [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:45:19 np0005603622 nova_compute[221301]: 2026-01-31 07:45:19.098 221324 DEBUG nova.network.neutron [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:45:19 np0005603622 nova_compute[221301]: 2026-01-31 07:45:19.144 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Releasing lock "refresh_cache-35cd3508-d6a0-4daf-8bd1-af07b852cabf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:45:19 np0005603622 nova_compute[221301]: 2026-01-31 07:45:19.144 221324 DEBUG nova.compute.manager [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 02:45:19 np0005603622 nova_compute[221301]: 2026-01-31 07:45:19.156 221324 INFO nova.virt.libvirt.driver [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Instance destroyed successfully.
Jan 31 02:45:19 np0005603622 nova_compute[221301]: 2026-01-31 07:45:19.157 221324 DEBUG nova.objects.instance [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lazy-loading 'resources' on Instance uuid 35cd3508-d6a0-4daf-8bd1-af07b852cabf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:45:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:19.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:20.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:21.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:21 np0005603622 nova_compute[221301]: 2026-01-31 07:45:21.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:21 np0005603622 nova_compute[221301]: 2026-01-31 07:45:21.763 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:45:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:22.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:23.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.259 221324 INFO nova.virt.libvirt.driver [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Deleting instance files /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf_del
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.261 221324 INFO nova.virt.libvirt.driver [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Deletion of /var/lib/nova/instances/35cd3508-d6a0-4daf-8bd1-af07b852cabf_del complete
Jan 31 02:45:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:24.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:24 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:24Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:51:a5:3d 10.1.0.94
Jan 31 02:45:24 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:24Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:51:a5:3d 10.1.0.94
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.448 221324 DEBUG nova.virt.libvirt.host [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.449 221324 INFO nova.virt.libvirt.host [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] UEFI support detected
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.463 221324 INFO nova.compute.manager [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Took 5.32 seconds to destroy the instance on the hypervisor.
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.465 221324 DEBUG oslo.service.loopingcall [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.466 221324 DEBUG nova.compute.manager [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.466 221324 DEBUG nova.network.neutron [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.600 221324 DEBUG nova.network.neutron [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.761 221324 DEBUG nova.network.neutron [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.789 221324 INFO nova.compute.manager [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Took 0.32 seconds to deallocate network for instance.
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.909 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:45:24 np0005603622 nova_compute[221301]: 2026-01-31 07:45:24.910 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:45:25 np0005603622 nova_compute[221301]: 2026-01-31 07:45:25.007 221324 DEBUG oslo_concurrency.processutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:45:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:25.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/954137829' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:25 np0005603622 nova_compute[221301]: 2026-01-31 07:45:25.547 221324 DEBUG oslo_concurrency.processutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:45:25 np0005603622 nova_compute[221301]: 2026-01-31 07:45:25.556 221324 DEBUG nova.compute.provider_tree [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:45:25 np0005603622 nova_compute[221301]: 2026-01-31 07:45:25.574 221324 DEBUG nova.scheduler.client.report [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:45:25 np0005603622 nova_compute[221301]: 2026-01-31 07:45:25.601 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:25 np0005603622 nova_compute[221301]: 2026-01-31 07:45:25.636 221324 INFO nova.scheduler.client.report [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Deleted allocations for instance 35cd3508-d6a0-4daf-8bd1-af07b852cabf
Jan 31 02:45:25 np0005603622 nova_compute[221301]: 2026-01-31 07:45:25.707 221324 DEBUG oslo_concurrency.lockutils [None req-11ba901d-8de9-492e-b0fc-a6f7f672d71c 5dac6d92165448b3a1c60bea57f8e48d 5047d468cca049c2891d27def49df57f - - default default] Lock "35cd3508-d6a0-4daf-8bd1-af07b852cabf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:45:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:26.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:26 np0005603622 nova_compute[221301]: 2026-01-31 07:45:26.469 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:26 np0005603622 nova_compute[221301]: 2026-01-31 07:45:26.764 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:27.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:27 np0005603622 podman[224748]: 2026-01-31 07:45:27.537019634 +0000 UTC m=+0.140300675 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3)
Jan 31 02:45:27 np0005603622 podman[224748]: 2026-01-31 07:45:27.78163658 +0000 UTC m=+0.384917611 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:45:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:28.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:29.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.057 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.059 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.059 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.059 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.060 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.061 221324 INFO nova.compute.manager [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Terminating instance#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.063 221324 DEBUG nova.compute.manager [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:45:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:30.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.437 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845515.435621, 35cd3508-d6a0-4daf-8bd1-af07b852cabf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.437 221324 INFO nova.compute.manager [-] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.464 221324 DEBUG nova.compute.manager [None req-f141fb2a-5445-468c-9bc9-0be801ce0cef - - - - - -] [instance: 35cd3508-d6a0-4daf-8bd1-af07b852cabf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.838 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.839 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.840 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:30 np0005603622 kernel: tap631c8d69-c3 (unregistering): left promiscuous mode
Jan 31 02:45:30 np0005603622 NetworkManager[49080]: <info>  [1769845530.9116] device (tap631c8d69-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:45:30 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:30Z|00032|binding|INFO|Releasing lport 631c8d69-c396-429f-a74e-ba51262add3a from this chassis (sb_readonly=0)
Jan 31 02:45:30 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:30Z|00033|binding|INFO|Setting lport 631c8d69-c396-429f-a74e-ba51262add3a down in Southbound
Jan 31 02:45:30 np0005603622 ovn_controller[130109]: 2026-01-31T07:45:30Z|00034|binding|INFO|Removing iface tap631c8d69-c3 ovn-installed in OVS
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.923 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.934 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a5:3d 10.1.0.94 fdfe:381f:8400::9b'], port_security=['fa:16:3e:51:a5:3d 10.1.0.94 fdfe:381f:8400::9b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.94/26 fdfe:381f:8400::9b/64', 'neutron:device_id': '3cbcda3b-911f-4b3b-8fb1-ce0bad66b087', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-992dcec1-3019-47a1-a14c-defd99a80f3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc2f6584d8b64364b13683f53c58617f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f48a740a-df16-488d-83ce-01edcece1d5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d5eabbe-dd4d-4e48-a2ac-b48c29338142, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=631c8d69-c396-429f-a74e-ba51262add3a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.937 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 631c8d69-c396-429f-a74e-ba51262add3a in datapath 992dcec1-3019-47a1-a14c-defd99a80f3d unbound from our chassis#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.940 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 992dcec1-3019-47a1-a14c-defd99a80f3d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:45:30 np0005603622 nova_compute[221301]: 2026-01-31 07:45:30.941 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.943 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a04ec534-0f9d-4868-8ab5-99ce3196d652]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:30.945 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d namespace which is not needed anymore#033[00m
Jan 31 02:45:30 np0005603622 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 31 02:45:30 np0005603622 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 15.702s CPU time.
Jan 31 02:45:30 np0005603622 systemd-machined[190406]: Machine qemu-2-instance-00000004 terminated.
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.093 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:31 np0005603622 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[224434]: [NOTICE]   (224438) : haproxy version is 2.8.14-c23fe91
Jan 31 02:45:31 np0005603622 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[224434]: [NOTICE]   (224438) : path to executable is /usr/sbin/haproxy
Jan 31 02:45:31 np0005603622 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[224434]: [WARNING]  (224438) : Exiting Master process...
Jan 31 02:45:31 np0005603622 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[224434]: [ALERT]    (224438) : Current worker (224443) exited with code 143 (Terminated)
Jan 31 02:45:31 np0005603622 neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d[224434]: [WARNING]  (224438) : All workers exited. Exiting... (0)
Jan 31 02:45:31 np0005603622 systemd[1]: libpod-9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a.scope: Deactivated successfully.
Jan 31 02:45:31 np0005603622 podman[225027]: 2026-01-31 07:45:31.108274405 +0000 UTC m=+0.062051392 container died 9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.114 221324 INFO nova.virt.libvirt.driver [-] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Instance destroyed successfully.#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.115 221324 DEBUG nova.objects.instance [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'resources' on Instance uuid 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.136 221324 DEBUG nova.virt.libvirt.vif [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1079069239-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1079069239-3',id=4,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2026-01-31T07:45:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dc2f6584d8b64364b13683f53c58617f',ramdisk_id='',reservation_id='r-jd4rofhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-2135409609',owner_user_name='tempest-AutoAllocateNetworkTest-2135409609-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:45:08Z,user_data=None,user_id='0eb58e8663574849b17616075ce5c43e',uuid=3cbcda3b-911f-4b3b-8fb1-ce0bad66b087,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.137 221324 DEBUG nova.network.os_vif_util [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converting VIF {"id": "631c8d69-c396-429f-a74e-ba51262add3a", "address": "fa:16:3e:51:a5:3d", "network": {"id": "992dcec1-3019-47a1-a14c-defd99a80f3d", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.64/26", "dns": [], "gateway": {"address": "10.1.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::9b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc2f6584d8b64364b13683f53c58617f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap631c8d69-c3", "ovs_interfaceid": "631c8d69-c396-429f-a74e-ba51262add3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:45:31 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a-userdata-shm.mount: Deactivated successfully.
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.140 221324 DEBUG nova.network.os_vif_util [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:3d,bridge_name='br-int',has_traffic_filtering=True,id=631c8d69-c396-429f-a74e-ba51262add3a,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap631c8d69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.141 221324 DEBUG os_vif [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:3d,bridge_name='br-int',has_traffic_filtering=True,id=631c8d69-c396-429f-a74e-ba51262add3a,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap631c8d69-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:45:31 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0dd7588e7ab2d553395aea35eaaa22178f1eb86326a557ce29df9d54017eff36-merged.mount: Deactivated successfully.
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.145 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.145 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap631c8d69-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.147 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:31 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:45:31 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 02:45:31 np0005603622 podman[225027]: 2026-01-31 07:45:31.152057386 +0000 UTC m=+0.105834373 container cleanup 9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.151 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.155 221324 INFO os_vif [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:3d,bridge_name='br-int',has_traffic_filtering=True,id=631c8d69-c396-429f-a74e-ba51262add3a,network=Network(992dcec1-3019-47a1-a14c-defd99a80f3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap631c8d69-c3')#033[00m
Jan 31 02:45:31 np0005603622 systemd[1]: libpod-conmon-9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a.scope: Deactivated successfully.
Jan 31 02:45:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:31 np0005603622 podman[225069]: 2026-01-31 07:45:31.216024758 +0000 UTC m=+0.039851308 container remove 9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.222 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa8490d-54f7-487c-bed9-c6d9b50d5b43]: (4, ('Sat Jan 31 07:45:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d (9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a)\n9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a\nSat Jan 31 07:45:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d (9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a)\n9b075525c245ab345fededa7627efcb613c6b6f1235ae9fca2afab0bea42637a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.223 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e774b8d9-9157-4dd6-8c57-8f7155d9b793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.224 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap992dcec1-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.226 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:31 np0005603622 kernel: tap992dcec1-30: left promiscuous mode
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.230 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e42a0b9d-c528-4083-a719-fbd27ead9ea1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:31.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.254 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dccec6-f586-4596-bea2-e308c759bdd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.255 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[05510483-62ce-4caf-b387-54701c48709d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.265 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e370ba37-c1ce-4b9d-98dd-ffa943c4abe3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476123, 'reachable_time': 33224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225102, 'error': None, 'target': 'ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.269 221324 DEBUG nova.compute.manager [req-1797cabd-e410-41c6-859e-ff2c97a27ca7 req-73a34427-716c-4f23-89bb-414c80815ebf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received event network-vif-unplugged-631c8d69-c396-429f-a74e-ba51262add3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.269 221324 DEBUG oslo_concurrency.lockutils [req-1797cabd-e410-41c6-859e-ff2c97a27ca7 req-73a34427-716c-4f23-89bb-414c80815ebf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.269 221324 DEBUG oslo_concurrency.lockutils [req-1797cabd-e410-41c6-859e-ff2c97a27ca7 req-73a34427-716c-4f23-89bb-414c80815ebf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.269 221324 DEBUG oslo_concurrency.lockutils [req-1797cabd-e410-41c6-859e-ff2c97a27ca7 req-73a34427-716c-4f23-89bb-414c80815ebf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.269 221324 DEBUG nova.compute.manager [req-1797cabd-e410-41c6-859e-ff2c97a27ca7 req-73a34427-716c-4f23-89bb-414c80815ebf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] No waiting events found dispatching network-vif-unplugged-631c8d69-c396-429f-a74e-ba51262add3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.270 221324 DEBUG nova.compute.manager [req-1797cabd-e410-41c6-859e-ff2c97a27ca7 req-73a34427-716c-4f23-89bb-414c80815ebf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received event network-vif-unplugged-631c8d69-c396-429f-a74e-ba51262add3a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:45:31 np0005603622 systemd[1]: run-netns-ovnmeta\x2d992dcec1\x2d3019\x2d47a1\x2da14c\x2ddefd99a80f3d.mount: Deactivated successfully.
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.273 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-992dcec1-3019-47a1-a14c-defd99a80f3d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:45:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:45:31.273 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6a12a6-0a72-43c9-9c44-bccc8797478d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:45:31 np0005603622 nova_compute[221301]: 2026-01-31 07:45:31.470 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:32.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:45:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:45:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:33.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:33 np0005603622 nova_compute[221301]: 2026-01-31 07:45:33.342 221324 DEBUG nova.compute.manager [req-60e23869-df8e-4159-9abd-33ce79506846 req-de83f9e1-8ce6-435f-b987-09223ac263f2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received event network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:33 np0005603622 nova_compute[221301]: 2026-01-31 07:45:33.342 221324 DEBUG oslo_concurrency.lockutils [req-60e23869-df8e-4159-9abd-33ce79506846 req-de83f9e1-8ce6-435f-b987-09223ac263f2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:33 np0005603622 nova_compute[221301]: 2026-01-31 07:45:33.343 221324 DEBUG oslo_concurrency.lockutils [req-60e23869-df8e-4159-9abd-33ce79506846 req-de83f9e1-8ce6-435f-b987-09223ac263f2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:33 np0005603622 nova_compute[221301]: 2026-01-31 07:45:33.343 221324 DEBUG oslo_concurrency.lockutils [req-60e23869-df8e-4159-9abd-33ce79506846 req-de83f9e1-8ce6-435f-b987-09223ac263f2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:33 np0005603622 nova_compute[221301]: 2026-01-31 07:45:33.344 221324 DEBUG nova.compute.manager [req-60e23869-df8e-4159-9abd-33ce79506846 req-de83f9e1-8ce6-435f-b987-09223ac263f2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] No waiting events found dispatching network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:45:33 np0005603622 nova_compute[221301]: 2026-01-31 07:45:33.344 221324 WARNING nova.compute.manager [req-60e23869-df8e-4159-9abd-33ce79506846 req-de83f9e1-8ce6-435f-b987-09223ac263f2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received unexpected event network-vif-plugged-631c8d69-c396-429f-a74e-ba51262add3a for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:45:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:34.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:35.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:36 np0005603622 nova_compute[221301]: 2026-01-31 07:45:36.150 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:36.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:36 np0005603622 nova_compute[221301]: 2026-01-31 07:45:36.502 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:37.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:38.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:39.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:40 np0005603622 podman[225106]: 2026-01-31 07:45:40.226162101 +0000 UTC m=+0.144654261 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 02:45:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:40.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:41 np0005603622 nova_compute[221301]: 2026-01-31 07:45:41.152 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:41.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:41 np0005603622 nova_compute[221301]: 2026-01-31 07:45:41.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:42.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:43.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:44 np0005603622 nova_compute[221301]: 2026-01-31 07:45:44.075 221324 INFO nova.virt.libvirt.driver [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Deleting instance files /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_del#033[00m
Jan 31 02:45:44 np0005603622 nova_compute[221301]: 2026-01-31 07:45:44.076 221324 INFO nova.virt.libvirt.driver [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Deletion of /var/lib/nova/instances/3cbcda3b-911f-4b3b-8fb1-ce0bad66b087_del complete#033[00m
Jan 31 02:45:44 np0005603622 nova_compute[221301]: 2026-01-31 07:45:44.133 221324 INFO nova.compute.manager [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Took 14.07 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:45:44 np0005603622 nova_compute[221301]: 2026-01-31 07:45:44.134 221324 DEBUG oslo.service.loopingcall [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:45:44 np0005603622 nova_compute[221301]: 2026-01-31 07:45:44.134 221324 DEBUG nova.compute.manager [-] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:45:44 np0005603622 nova_compute[221301]: 2026-01-31 07:45:44.135 221324 DEBUG nova.network.neutron [-] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:45:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:45:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:44.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:45:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:45.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.111 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845531.1100357, 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.112 221324 INFO nova.compute.manager [-] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.136 221324 DEBUG nova.network.neutron [-] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:46 np0005603622 podman[225133]: 2026-01-31 07:45:46.148715398 +0000 UTC m=+0.076507389 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.148 221324 DEBUG nova.compute.manager [None req-1f3c8087-6bac-4221-bc20-08d7e2ba5bbe - - - - - -] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.153 221324 INFO nova.compute.manager [-] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Took 2.02 seconds to deallocate network for instance.#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.157 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.209 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.210 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.297 221324 DEBUG nova.compute.manager [req-5d11d99c-7e40-4a0c-a8c1-3f85515631a0 req-da24db95-ad3b-4e0e-8704-0bcf7a209a6e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087] Received event network-vif-deleted-631c8d69-c396-429f-a74e-ba51262add3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.306 221324 DEBUG oslo_concurrency.processutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:45:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:46.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:45:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3551869189' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.928 221324 DEBUG oslo_concurrency.processutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.622s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.936 221324 DEBUG nova.compute.provider_tree [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:45:46 np0005603622 nova_compute[221301]: 2026-01-31 07:45:46.988 221324 DEBUG nova.scheduler.client.report [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:45:47 np0005603622 nova_compute[221301]: 2026-01-31 07:45:47.225 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:47.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:45:47 np0005603622 nova_compute[221301]: 2026-01-31 07:45:47.397 221324 INFO nova.scheduler.client.report [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Deleted allocations for instance 3cbcda3b-911f-4b3b-8fb1-ce0bad66b087#033[00m
Jan 31 02:45:47 np0005603622 nova_compute[221301]: 2026-01-31 07:45:47.661 221324 DEBUG oslo_concurrency.lockutils [None req-a7711ac8-ba7b-4029-bfb6-3acba0c3e882 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "3cbcda3b-911f-4b3b-8fb1-ce0bad66b087" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 17.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:48.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:49.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:45:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:50.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:45:51 np0005603622 nova_compute[221301]: 2026-01-31 07:45:51.158 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:51.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:51 np0005603622 nova_compute[221301]: 2026-01-31 07:45:51.510 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:52.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:52 np0005603622 nova_compute[221301]: 2026-01-31 07:45:52.886 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:53.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:54.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:45:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:45:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:55.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.162 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:56.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.372 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "bc2609cd-2682-4f66-91db-7b152e03fc32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.373 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "bc2609cd-2682-4f66-91db-7b152e03fc32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.373 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "bc2609cd-2682-4f66-91db-7b152e03fc32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.374 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "bc2609cd-2682-4f66-91db-7b152e03fc32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.374 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "bc2609cd-2682-4f66-91db-7b152e03fc32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.376 221324 INFO nova.compute.manager [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Terminating instance#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.377 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "refresh_cache-bc2609cd-2682-4f66-91db-7b152e03fc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.378 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquired lock "refresh_cache-bc2609cd-2682-4f66-91db-7b152e03fc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.378 221324 DEBUG nova.network.neutron [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.559 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:45:56 np0005603622 nova_compute[221301]: 2026-01-31 07:45:56.719 221324 DEBUG nova.network.neutron [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:45:57 np0005603622 nova_compute[221301]: 2026-01-31 07:45:57.278 221324 DEBUG nova.network.neutron [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:45:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:57.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:57 np0005603622 nova_compute[221301]: 2026-01-31 07:45:57.296 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Releasing lock "refresh_cache-bc2609cd-2682-4f66-91db-7b152e03fc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:45:57 np0005603622 nova_compute[221301]: 2026-01-31 07:45:57.297 221324 DEBUG nova.compute.manager [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:45:57 np0005603622 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 31 02:45:57 np0005603622 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 17.047s CPU time.
Jan 31 02:45:57 np0005603622 systemd-machined[190406]: Machine qemu-1-instance-00000001 terminated.
Jan 31 02:45:57 np0005603622 nova_compute[221301]: 2026-01-31 07:45:57.519 221324 INFO nova.virt.libvirt.driver [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Instance destroyed successfully.#033[00m
Jan 31 02:45:57 np0005603622 nova_compute[221301]: 2026-01-31 07:45:57.520 221324 DEBUG nova.objects.instance [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lazy-loading 'resources' on Instance uuid bc2609cd-2682-4f66-91db-7b152e03fc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:45:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:45:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:45:58.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:45:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:45:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:45:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:45:59.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:00.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.207 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:01.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.561 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.754 221324 INFO nova.virt.libvirt.driver [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Deleting instance files /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32_del#033[00m
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.755 221324 INFO nova.virt.libvirt.driver [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Deletion of /var/lib/nova/instances/bc2609cd-2682-4f66-91db-7b152e03fc32_del complete#033[00m
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.811 221324 INFO nova.compute.manager [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Took 4.51 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.812 221324 DEBUG oslo.service.loopingcall [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.812 221324 DEBUG nova.compute.manager [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:46:01 np0005603622 nova_compute[221301]: 2026-01-31 07:46:01.812 221324 DEBUG nova.network.neutron [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.031 221324 DEBUG nova.network.neutron [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.045 221324 DEBUG nova.network.neutron [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.070 221324 INFO nova.compute.manager [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Took 0.26 seconds to deallocate network for instance.#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.126 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.126 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.182 221324 DEBUG oslo_concurrency.processutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:02.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3148990516' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.626 221324 DEBUG oslo_concurrency.processutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.633 221324 DEBUG nova.compute.provider_tree [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.648 221324 DEBUG nova.scheduler.client.report [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.687 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.722 221324 INFO nova.scheduler.client.report [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Deleted allocations for instance bc2609cd-2682-4f66-91db-7b152e03fc32#033[00m
Jan 31 02:46:02 np0005603622 nova_compute[221301]: 2026-01-31 07:46:02.788 221324 DEBUG oslo_concurrency.lockutils [None req-4c56727e-76d3-40ff-82d4-a44c0bf99725 0eb58e8663574849b17616075ce5c43e dc2f6584d8b64364b13683f53c58617f - - default default] Lock "bc2609cd-2682-4f66-91db-7b152e03fc32" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:03.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:04.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:05.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:05 np0005603622 nova_compute[221301]: 2026-01-31 07:46:05.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:06 np0005603622 nova_compute[221301]: 2026-01-31 07:46:06.211 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:06.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:06 np0005603622 nova_compute[221301]: 2026-01-31 07:46:06.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:06 np0005603622 nova_compute[221301]: 2026-01-31 07:46:06.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:06 np0005603622 nova_compute[221301]: 2026-01-31 07:46:06.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:46:06 np0005603622 nova_compute[221301]: 2026-01-31 07:46:06.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:07.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.417 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.418 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.418 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.418 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.418 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:07 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3954704212' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.815 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.972 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.974 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5008MB free_disk=20.943607330322266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.974 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:07 np0005603622 nova_compute[221301]: 2026-01-31 07:46:07.974 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.068 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.068 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.088 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:08.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1981217270' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.517 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.522 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.546 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.570 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:46:08 np0005603622 nova_compute[221301]: 2026-01-31 07:46:08.571 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:09.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:09 np0005603622 nova_compute[221301]: 2026-01-31 07:46:09.571 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:09 np0005603622 nova_compute[221301]: 2026-01-31 07:46:09.607 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:09 np0005603622 nova_compute[221301]: 2026-01-31 07:46:09.607 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:46:09 np0005603622 nova_compute[221301]: 2026-01-31 07:46:09.607 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:46:09 np0005603622 nova_compute[221301]: 2026-01-31 07:46:09.619 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:46:09 np0005603622 nova_compute[221301]: 2026-01-31 07:46:09.620 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:10.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:10 np0005603622 nova_compute[221301]: 2026-01-31 07:46:10.439 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:10.557 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:46:10 np0005603622 nova_compute[221301]: 2026-01-31 07:46:10.557 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:10.560 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:46:11 np0005603622 podman[225316]: 2026-01-31 07:46:11.188130016 +0000 UTC m=+0.103627694 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:46:11 np0005603622 nova_compute[221301]: 2026-01-31 07:46:11.213 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:11.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:11 np0005603622 nova_compute[221301]: 2026-01-31 07:46:11.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:46:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:11.563 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:46:11 np0005603622 nova_compute[221301]: 2026-01-31 07:46:11.616 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:12.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:12 np0005603622 nova_compute[221301]: 2026-01-31 07:46:12.518 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845557.516304, bc2609cd-2682-4f66-91db-7b152e03fc32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:46:12 np0005603622 nova_compute[221301]: 2026-01-31 07:46:12.518 221324 INFO nova.compute.manager [-] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:46:12 np0005603622 nova_compute[221301]: 2026-01-31 07:46:12.540 221324 DEBUG nova.compute.manager [None req-46fec208-502c-4d08-b888-2009190ef0db - - - - - -] [instance: bc2609cd-2682-4f66-91db-7b152e03fc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:46:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:13.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.719 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "3202b44e-1e10-4f51-811d-9daaad2f656e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.720 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "3202b44e-1e10-4f51-811d-9daaad2f656e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.754 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.868 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.869 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.878 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.879 221324 INFO nova.compute.claims [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:46:13 np0005603622 nova_compute[221301]: 2026-01-31 07:46:13.973 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:14.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1655633124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.453 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.458 221324 DEBUG nova.compute.provider_tree [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.476 221324 DEBUG nova.scheduler.client.report [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.501 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.502 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.552 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.553 221324 DEBUG nova.network.neutron [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.599 221324 INFO nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.627 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.704 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.706 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.706 221324 INFO nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Creating image(s)#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.737 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.776 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.814 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.819 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.895 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.896 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.897 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.898 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.936 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:46:14 np0005603622 nova_compute[221301]: 2026-01-31 07:46:14.941 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3202b44e-1e10-4f51-811d-9daaad2f656e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:15 np0005603622 nova_compute[221301]: 2026-01-31 07:46:15.034 221324 DEBUG nova.network.neutron [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:46:15 np0005603622 nova_compute[221301]: 2026-01-31 07:46:15.034 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:46:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:15.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:15 np0005603622 nova_compute[221301]: 2026-01-31 07:46:15.891 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3202b44e-1e10-4f51-811d-9daaad2f656e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.950s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.001 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] resizing rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.218 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:16.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.639 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.830 221324 DEBUG nova.objects.instance [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lazy-loading 'migration_context' on Instance uuid 3202b44e-1e10-4f51-811d-9daaad2f656e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.851 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.853 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Ensure instance console log exists: /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.853 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.854 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.854 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.856 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.861 221324 WARNING nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.866 221324 DEBUG nova.virt.libvirt.host [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.867 221324 DEBUG nova.virt.libvirt.host [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.871 221324 DEBUG nova.virt.libvirt.host [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.872 221324 DEBUG nova.virt.libvirt.host [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.873 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.874 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.874 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.875 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.875 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.875 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.875 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.876 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.876 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.876 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.877 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.877 221324 DEBUG nova.virt.hardware [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:46:16 np0005603622 nova_compute[221301]: 2026-01-31 07:46:16.880 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:17 np0005603622 podman[225551]: 2026-01-31 07:46:17.130219716 +0000 UTC m=+0.052749033 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 02:46:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:46:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1668275055' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.288 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.313 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:46:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:17.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.318 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:46:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1465139941' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.726 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.729 221324 DEBUG nova.objects.instance [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3202b44e-1e10-4f51-811d-9daaad2f656e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.760 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <uuid>3202b44e-1e10-4f51-811d-9daaad2f656e</uuid>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <name>instance-0000000a</name>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerDiagnosticsNegativeTest-server-315885465</nova:name>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:46:16</nova:creationTime>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <nova:user uuid="8c57ca3c55a94e9f93a8302208fee47e">tempest-ServerDiagnosticsNegativeTest-657742464-project-member</nova:user>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <nova:project uuid="d837f81a3fdc4cc88fc59cba1c280567">tempest-ServerDiagnosticsNegativeTest-657742464</nova:project>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <entry name="serial">3202b44e-1e10-4f51-811d-9daaad2f656e</entry>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <entry name="uuid">3202b44e-1e10-4f51-811d-9daaad2f656e</entry>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/3202b44e-1e10-4f51-811d-9daaad2f656e_disk">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/3202b44e-1e10-4f51-811d-9daaad2f656e_disk.config">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/console.log" append="off"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:46:17 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:46:17 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:46:17 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:46:17 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.830 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.831 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.832 221324 INFO nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Using config drive#033[00m
Jan 31 02:46:17 np0005603622 nova_compute[221301]: 2026-01-31 07:46:17.890 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:46:18 np0005603622 nova_compute[221301]: 2026-01-31 07:46:18.184 221324 INFO nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Creating config drive at /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/disk.config#033[00m
Jan 31 02:46:18 np0005603622 nova_compute[221301]: 2026-01-31 07:46:18.191 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqz31mqno execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:18 np0005603622 nova_compute[221301]: 2026-01-31 07:46:18.320 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqz31mqno" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:46:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:18.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:18 np0005603622 nova_compute[221301]: 2026-01-31 07:46:18.362 221324 DEBUG nova.storage.rbd_utils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] rbd image 3202b44e-1e10-4f51-811d-9daaad2f656e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:18 np0005603622 nova_compute[221301]: 2026-01-31 07:46:18.367 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/disk.config 3202b44e-1e10-4f51-811d-9daaad2f656e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:19.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:19 np0005603622 nova_compute[221301]: 2026-01-31 07:46:19.824 221324 DEBUG oslo_concurrency.processutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/disk.config 3202b44e-1e10-4f51-811d-9daaad2f656e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:19 np0005603622 nova_compute[221301]: 2026-01-31 07:46:19.826 221324 INFO nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Deleting local config drive /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e/disk.config because it was imported into RBD.
Jan 31 02:46:19 np0005603622 systemd-machined[190406]: New machine qemu-4-instance-0000000a.
Jan 31 02:46:19 np0005603622 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Jan 31 02:46:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:20.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.511 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845580.5104296, 3202b44e-1e10-4f51-811d-9daaad2f656e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.512 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] VM Resumed (Lifecycle Event)
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.516 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.516 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.521 221324 INFO nova.virt.libvirt.driver [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Instance spawned successfully.
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.522 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.540 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.549 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.555 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.555 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.556 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.557 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.558 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.559 221324 DEBUG nova.virt.libvirt.driver [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.570 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.570 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845580.511722, 3202b44e-1e10-4f51-811d-9daaad2f656e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.571 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] VM Started (Lifecycle Event)
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.614 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.619 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.663 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.679 221324 INFO nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Took 5.97 seconds to spawn the instance on the hypervisor.
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.680 221324 DEBUG nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.775 221324 INFO nova.compute.manager [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Took 6.96 seconds to build instance.
Jan 31 02:46:20 np0005603622 nova_compute[221301]: 2026-01-31 07:46:20.812 221324 DEBUG oslo_concurrency.lockutils [None req-60c93719-59ae-4ea8-bd16-5a6c382a6ec0 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "3202b44e-1e10-4f51-811d-9daaad2f656e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:21 np0005603622 nova_compute[221301]: 2026-01-31 07:46:21.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:21.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:21 np0005603622 nova_compute[221301]: 2026-01-31 07:46:21.641 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:22.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.077 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "3202b44e-1e10-4f51-811d-9daaad2f656e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.078 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "3202b44e-1e10-4f51-811d-9daaad2f656e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.079 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "3202b44e-1e10-4f51-811d-9daaad2f656e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.079 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "3202b44e-1e10-4f51-811d-9daaad2f656e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.079 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "3202b44e-1e10-4f51-811d-9daaad2f656e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.080 221324 INFO nova.compute.manager [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Terminating instance
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.081 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "refresh_cache-3202b44e-1e10-4f51-811d-9daaad2f656e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.081 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquired lock "refresh_cache-3202b44e-1e10-4f51-811d-9daaad2f656e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.081 221324 DEBUG nova.network.neutron [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.227 221324 DEBUG nova.network.neutron [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:46:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:23.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.471 221324 DEBUG nova.network.neutron [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.489 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Releasing lock "refresh_cache-3202b44e-1e10-4f51-811d-9daaad2f656e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:46:23 np0005603622 nova_compute[221301]: 2026-01-31 07:46:23.491 221324 DEBUG nova.compute.manager [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 02:46:24 np0005603622 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 31 02:46:24 np0005603622 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 3.543s CPU time.
Jan 31 02:46:24 np0005603622 systemd-machined[190406]: Machine qemu-4-instance-0000000a terminated.
Jan 31 02:46:24 np0005603622 nova_compute[221301]: 2026-01-31 07:46:24.316 221324 INFO nova.virt.libvirt.driver [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Instance destroyed successfully.
Jan 31 02:46:24 np0005603622 nova_compute[221301]: 2026-01-31 07:46:24.316 221324 DEBUG nova.objects.instance [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lazy-loading 'resources' on Instance uuid 3202b44e-1e10-4f51-811d-9daaad2f656e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:46:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:24.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:25.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:26 np0005603622 nova_compute[221301]: 2026-01-31 07:46:26.240 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:26.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:26 np0005603622 nova_compute[221301]: 2026-01-31 07:46:26.645 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:27.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:28.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:29.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:30.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:30.839 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:30.840 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:30.840 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:31 np0005603622 nova_compute[221301]: 2026-01-31 07:46:31.282 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:46:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:31.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:46:31 np0005603622 ovn_controller[130109]: 2026-01-31T07:46:31Z|00035|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 31 02:46:31 np0005603622 nova_compute[221301]: 2026-01-31 07:46:31.647 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:46:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:32.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:46:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:33.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:34.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:35.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:36 np0005603622 nova_compute[221301]: 2026-01-31 07:46:36.285 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:46:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:36.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:46:36 np0005603622 nova_compute[221301]: 2026-01-31 07:46:36.648 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.330 221324 INFO nova.virt.libvirt.driver [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Deleting instance files /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e_del
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.333 221324 INFO nova.virt.libvirt.driver [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Deletion of /var/lib/nova/instances/3202b44e-1e10-4f51-811d-9daaad2f656e_del complete
Jan 31 02:46:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:38.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.392 221324 INFO nova.compute.manager [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Took 14.90 seconds to destroy the instance on the hypervisor.
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.392 221324 DEBUG oslo.service.loopingcall [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.393 221324 DEBUG nova.compute.manager [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.393 221324 DEBUG nova.network.neutron [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.551 221324 DEBUG nova.network.neutron [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.569 221324 DEBUG nova.network.neutron [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.591 221324 INFO nova.compute.manager [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Took 0.20 seconds to deallocate network for instance.
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.648 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.650 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:38 np0005603622 nova_compute[221301]: 2026-01-31 07:46:38.709 221324 DEBUG oslo_concurrency.processutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/48464231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.172 221324 DEBUG oslo_concurrency.processutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.180 221324 DEBUG nova.compute.provider_tree [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.229 221324 DEBUG nova.scheduler.client.report [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.250 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.277 221324 INFO nova.scheduler.client.report [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Deleted allocations for instance 3202b44e-1e10-4f51-811d-9daaad2f656e
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.314 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845584.3131382, 3202b44e-1e10-4f51-811d-9daaad2f656e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.314 221324 INFO nova.compute.manager [-] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] VM Stopped (Lifecycle Event)
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.350 221324 DEBUG nova.compute.manager [None req-6eee02fc-3c78-47bf-b95d-157b5eaafe03 - - - - - -] [instance: 3202b44e-1e10-4f51-811d-9daaad2f656e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:46:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:39 np0005603622 nova_compute[221301]: 2026-01-31 07:46:39.363 221324 DEBUG oslo_concurrency.lockutils [None req-44450fbc-47be-4799-aa24-3597a9f0388d 8c57ca3c55a94e9f93a8302208fee47e d837f81a3fdc4cc88fc59cba1c280567 - - default default] Lock "3202b44e-1e10-4f51-811d-9daaad2f656e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 16.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:46:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:40.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:46:41 np0005603622 nova_compute[221301]: 2026-01-31 07:46:41.288 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:41 np0005603622 nova_compute[221301]: 2026-01-31 07:46:41.650 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:42 np0005603622 podman[225771]: 2026-01-31 07:46:42.166653689 +0000 UTC m=+0.093418251 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:46:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:42.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:43.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:44.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:45.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:46 np0005603622 nova_compute[221301]: 2026-01-31 07:46:46.291 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:46.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:46 np0005603622 nova_compute[221301]: 2026-01-31 07:46:46.652 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:47.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:48 np0005603622 podman[225928]: 2026-01-31 07:46:48.135240703 +0000 UTC m=+0.062798022 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 31 02:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:48.237 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:46:48 np0005603622 nova_compute[221301]: 2026-01-31 07:46:48.237 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:48.239 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 02:46:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:48.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:49.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:46:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:46:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:46:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:46:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:46:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:46:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:50.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:46:51 np0005603622 nova_compute[221301]: 2026-01-31 07:46:51.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:51.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:51 np0005603622 nova_compute[221301]: 2026-01-31 07:46:51.654 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:46:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:52.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:53.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:54.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:54 np0005603622 nova_compute[221301]: 2026-01-31 07:46:54.804 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "37069dd7-a48f-42ca-8238-bf5baa1fa605" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:54 np0005603622 nova_compute[221301]: 2026-01-31 07:46:54.804 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "37069dd7-a48f-42ca-8238-bf5baa1fa605" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:54 np0005603622 nova_compute[221301]: 2026-01-31 07:46:54.833 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:46:54 np0005603622 nova_compute[221301]: 2026-01-31 07:46:54.908 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:54 np0005603622 nova_compute[221301]: 2026-01-31 07:46:54.909 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:54 np0005603622 nova_compute[221301]: 2026-01-31 07:46:54.916 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:46:54 np0005603622 nova_compute[221301]: 2026-01-31 07:46:54.917 221324 INFO nova.compute.claims [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Claim successful on node compute-1.ctlplane.example.com
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.024 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:46:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:55.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:46:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/243647529' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.460 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.467 221324 DEBUG nova.compute.provider_tree [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.484 221324 DEBUG nova.scheduler.client.report [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.508 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.510 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.554 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.555 221324 DEBUG nova.network.neutron [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.573 221324 INFO nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.591 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.697 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.698 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.698 221324 INFO nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Creating image(s)
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.723 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.747 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.773 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.776 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.826 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.828 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.829 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.829 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.850 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:46:55 np0005603622 nova_compute[221301]: 2026-01-31 07:46:55.853 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:46:56 np0005603622 nova_compute[221301]: 2026-01-31 07:46:56.275 221324 DEBUG nova.network.neutron [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:46:56 np0005603622 nova_compute[221301]: 2026-01-31 07:46:56.275 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:46:56 np0005603622 nova_compute[221301]: 2026-01-31 07:46:56.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:56.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:56 np0005603622 nova_compute[221301]: 2026-01-31 07:46:56.656 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.762755) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616762817, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1978, "num_deletes": 250, "total_data_size": 4544917, "memory_usage": 4588184, "flush_reason": "Manual Compaction"}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616772830, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1817482, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20169, "largest_seqno": 22142, "table_properties": {"data_size": 1811257, "index_size": 3172, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 16364, "raw_average_key_size": 20, "raw_value_size": 1797426, "raw_average_value_size": 2289, "num_data_blocks": 140, "num_entries": 785, "num_filter_entries": 785, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845456, "oldest_key_time": 1769845456, "file_creation_time": 1769845616, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 10125 microseconds, and 3632 cpu microseconds.
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.772884) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1817482 bytes OK
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.772903) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.774194) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.774213) EVENT_LOG_v1 {"time_micros": 1769845616774206, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.774233) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 4536029, prev total WAL file size 4536029, number of live WAL files 2.
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.775101) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373531' seq:0, type:0; will stop at (end)
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1774KB)], [39(9527KB)]
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616775201, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 11573612, "oldest_snapshot_seqno": -1}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4853 keys, 8804556 bytes, temperature: kUnknown
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616843659, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 8804556, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8771874, "index_size": 19452, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12165, "raw_key_size": 120379, "raw_average_key_size": 24, "raw_value_size": 8683907, "raw_average_value_size": 1789, "num_data_blocks": 804, "num_entries": 4853, "num_filter_entries": 4853, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769845616, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.844099) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 8804556 bytes
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.845917) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 169.1 rd, 128.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.3 +0.0 blob) out(8.4 +0.0 blob), read-write-amplify(11.2) write-amplify(4.8) OK, records in: 5298, records dropped: 445 output_compression: NoCompression
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.845951) EVENT_LOG_v1 {"time_micros": 1769845616845936, "job": 22, "event": "compaction_finished", "compaction_time_micros": 68454, "compaction_time_cpu_micros": 29295, "output_level": 6, "num_output_files": 1, "total_output_size": 8804556, "num_input_records": 5298, "num_output_records": 4853, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616846472, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845616848190, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.774926) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.848241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.848249) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.848252) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.848255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:46:56.848257) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:46:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:57.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:46:58.242 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:46:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:46:58.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:46:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:46:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:46:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:46:59.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:00.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:01 np0005603622 nova_compute[221301]: 2026-01-31 07:47:01.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:01.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:01 np0005603622 nova_compute[221301]: 2026-01-31 07:47:01.658 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:02.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:02 np0005603622 nova_compute[221301]: 2026-01-31 07:47:02.805 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.951s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:02 np0005603622 nova_compute[221301]: 2026-01-31 07:47:02.892 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] resizing rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 02:47:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:03.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:04.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:05.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.618 221324 DEBUG nova.objects.instance [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'migration_context' on Instance uuid 37069dd7-a48f-42ca-8238-bf5baa1fa605 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.635 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.635 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Ensure instance console log exists: /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.636 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.637 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.637 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.639 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.645 221324 WARNING nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.652 221324 DEBUG nova.virt.libvirt.host [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.652 221324 DEBUG nova.virt.libvirt.host [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.657 221324 DEBUG nova.virt.libvirt.host [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.658 221324 DEBUG nova.virt.libvirt.host [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.660 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.661 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.662 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.662 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.662 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.663 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.663 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.664 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.664 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.664 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.665 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.665 221324 DEBUG nova.virt.hardware [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 02:47:05 np0005603622 nova_compute[221301]: 2026-01-31 07:47:05.669 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:47:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/329396357' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.219 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.246 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.249 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:47:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:47:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:06.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:47:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:47:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3285920579' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.713 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.715 221324 DEBUG nova.objects.instance [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 37069dd7-a48f-42ca-8238-bf5baa1fa605 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.735 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <uuid>37069dd7-a48f-42ca-8238-bf5baa1fa605</uuid>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <name>instance-0000000e</name>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <nova:name>tempest-MigrationsAdminTest-server-1143047912</nova:name>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:47:05</nova:creationTime>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <nova:user uuid="8a59efd78e244f44a1c70650f82a2c50">tempest-MigrationsAdminTest-1820348317-project-member</nova:user>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <nova:project uuid="1627a71b855b4032b51e234e44a9d570">tempest-MigrationsAdminTest-1820348317</nova:project>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <entry name="serial">37069dd7-a48f-42ca-8238-bf5baa1fa605</entry>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <entry name="uuid">37069dd7-a48f-42ca-8238-bf5baa1fa605</entry>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/37069dd7-a48f-42ca-8238-bf5baa1fa605_disk">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/37069dd7-a48f-42ca-8238-bf5baa1fa605_disk.config">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/console.log" append="off"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:47:06 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:47:06 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:47:06 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:47:06 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.819 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.820 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.821 221324 INFO nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Using config drive
Jan 31 02:47:06 np0005603622 nova_compute[221301]: 2026-01-31 07:47:06.857 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.198 221324 INFO nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Creating config drive at /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/disk.config
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.206 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf2id4pow execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.326 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpf2id4pow" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:07.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.420 221324 DEBUG nova.storage.rbd_utils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] rbd image 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.424 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/disk.config 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.438 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.462 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.463 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.464 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.465 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.465 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:47:07 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/805778429' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:47:07 np0005603622 nova_compute[221301]: 2026-01-31 07:47:07.937 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.086 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.087 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.239 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.240 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4960MB free_disk=20.926021575927734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.241 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.241 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.366 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 37069dd7-a48f-42ca-8238-bf5baa1fa605 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.367 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.367 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.405 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:47:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:08.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:47:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/861061921' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.893 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.898 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.916 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.942 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 02:47:08 np0005603622 nova_compute[221301]: 2026-01-31 07:47:08.943 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:47:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:09.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:09 np0005603622 nova_compute[221301]: 2026-01-31 07:47:09.901 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:09 np0005603622 nova_compute[221301]: 2026-01-31 07:47:09.902 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:09 np0005603622 nova_compute[221301]: 2026-01-31 07:47:09.903 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:47:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.399 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.400 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.421 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.421 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.422 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:10.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.474 221324 DEBUG oslo_concurrency.processutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/disk.config 37069dd7-a48f-42ca-8238-bf5baa1fa605_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:10 np0005603622 nova_compute[221301]: 2026-01-31 07:47:10.474 221324 INFO nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Deleting local config drive /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/disk.config because it was imported into RBD.#033[00m
Jan 31 02:47:10 np0005603622 systemd-machined[190406]: New machine qemu-5-instance-0000000e.
Jan 31 02:47:10 np0005603622 systemd[1]: Started Virtual Machine qemu-5-instance-0000000e.
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.330 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:47:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:11.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.663 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.737 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845631.7373881, 37069dd7-a48f-42ca-8238-bf5baa1fa605 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.738 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.740 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.740 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.744 221324 INFO nova.virt.libvirt.driver [-] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance spawned successfully.#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.744 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.779 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.786 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.786 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.787 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.788 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.789 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.789 221324 DEBUG nova.virt.libvirt.driver [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.796 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.841 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.841 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845631.7405078, 37069dd7-a48f-42ca-8238-bf5baa1fa605 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.842 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] VM Started (Lifecycle Event)#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.859 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.863 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.866 221324 INFO nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Took 16.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.867 221324 DEBUG nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.892 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.922 221324 INFO nova.compute.manager [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Took 17.04 seconds to build instance.#033[00m
Jan 31 02:47:11 np0005603622 nova_compute[221301]: 2026-01-31 07:47:11.938 221324 DEBUG oslo_concurrency.lockutils [None req-cc5fd58a-d344-482e-b9e6-a6cc41dde490 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "37069dd7-a48f-42ca-8238-bf5baa1fa605" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:12.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:12 np0005603622 podman[226383]: 2026-01-31 07:47:12.690952434 +0000 UTC m=+0.119970902 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:47:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:47:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:47:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:13.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:14.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:14 np0005603622 ovn_controller[130109]: 2026-01-31T07:47:14Z|00036|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Jan 31 02:47:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:15.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:16 np0005603622 nova_compute[221301]: 2026-01-31 07:47:16.332 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:16.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:16 np0005603622 nova_compute[221301]: 2026-01-31 07:47:16.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:17.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:17 np0005603622 nova_compute[221301]: 2026-01-31 07:47:17.852 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquiring lock "refresh_cache-37069dd7-a48f-42ca-8238-bf5baa1fa605" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:47:17 np0005603622 nova_compute[221301]: 2026-01-31 07:47:17.855 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquired lock "refresh_cache-37069dd7-a48f-42ca-8238-bf5baa1fa605" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:47:17 np0005603622 nova_compute[221301]: 2026-01-31 07:47:17.855 221324 DEBUG nova.network.neutron [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:47:18 np0005603622 nova_compute[221301]: 2026-01-31 07:47:18.321 221324 DEBUG nova.network.neutron [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:47:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:18.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:18 np0005603622 nova_compute[221301]: 2026-01-31 07:47:18.640 221324 DEBUG nova.network.neutron [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:47:18 np0005603622 nova_compute[221301]: 2026-01-31 07:47:18.655 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Releasing lock "refresh_cache-37069dd7-a48f-42ca-8238-bf5baa1fa605" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:47:18 np0005603622 nova_compute[221301]: 2026-01-31 07:47:18.753 221324 DEBUG nova.virt.libvirt.driver [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 02:47:18 np0005603622 nova_compute[221301]: 2026-01-31 07:47:18.754 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Creating file /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/4d533ced22134a899d94d8a7ee51f44a.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 02:47:18 np0005603622 nova_compute[221301]: 2026-01-31 07:47:18.755 221324 DEBUG oslo_concurrency.processutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/4d533ced22134a899d94d8a7ee51f44a.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:19 np0005603622 podman[226436]: 2026-01-31 07:47:19.184497154 +0000 UTC m=+0.103637935 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 02:47:19 np0005603622 nova_compute[221301]: 2026-01-31 07:47:19.252 221324 DEBUG oslo_concurrency.processutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/4d533ced22134a899d94d8a7ee51f44a.tmp" returned: 1 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:19 np0005603622 nova_compute[221301]: 2026-01-31 07:47:19.253 221324 DEBUG oslo_concurrency.processutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605/4d533ced22134a899d94d8a7ee51f44a.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 02:47:19 np0005603622 nova_compute[221301]: 2026-01-31 07:47:19.253 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Creating directory /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 02:47:19 np0005603622 nova_compute[221301]: 2026-01-31 07:47:19.253 221324 DEBUG oslo_concurrency.processutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:19.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:19 np0005603622 nova_compute[221301]: 2026-01-31 07:47:19.463 221324 DEBUG oslo_concurrency.processutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/37069dd7-a48f-42ca-8238-bf5baa1fa605" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:19 np0005603622 nova_compute[221301]: 2026-01-31 07:47:19.469 221324 DEBUG nova.virt.libvirt.driver [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 02:47:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:47:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:20.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:47:21 np0005603622 nova_compute[221301]: 2026-01-31 07:47:21.335 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:47:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.653204) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641653330, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 518, "num_deletes": 256, "total_data_size": 714181, "memory_usage": 725544, "flush_reason": "Manual Compaction"}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 31 02:47:21 np0005603622 nova_compute[221301]: 2026-01-31 07:47:21.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641668538, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 471342, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22147, "largest_seqno": 22660, "table_properties": {"data_size": 468543, "index_size": 771, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6417, "raw_average_key_size": 17, "raw_value_size": 462919, "raw_average_value_size": 1289, "num_data_blocks": 34, "num_entries": 359, "num_filter_entries": 359, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845616, "oldest_key_time": 1769845616, "file_creation_time": 1769845641, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 15388 microseconds, and 2558 cpu microseconds.
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.668602) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 471342 bytes OK
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.668622) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.677756) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.677795) EVENT_LOG_v1 {"time_micros": 1769845641677785, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.677818) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 711079, prev total WAL file size 711079, number of live WAL files 2.
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.678437) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353033' seq:0, type:0; will stop at (end)
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(460KB)], [42(8598KB)]
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641678528, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 9275898, "oldest_snapshot_seqno": -1}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 4689 keys, 9133758 bytes, temperature: kUnknown
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641737553, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9133758, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9101316, "index_size": 19596, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11781, "raw_key_size": 118204, "raw_average_key_size": 25, "raw_value_size": 9015427, "raw_average_value_size": 1922, "num_data_blocks": 805, "num_entries": 4689, "num_filter_entries": 4689, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769845641, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.737806) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9133758 bytes
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.742469) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.9 rd, 154.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 8.4 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(39.1) write-amplify(19.4) OK, records in: 5212, records dropped: 523 output_compression: NoCompression
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.742505) EVENT_LOG_v1 {"time_micros": 1769845641742492, "job": 24, "event": "compaction_finished", "compaction_time_micros": 59128, "compaction_time_cpu_micros": 20217, "output_level": 6, "num_output_files": 1, "total_output_size": 9133758, "num_input_records": 5212, "num_output_records": 4689, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641742858, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845641743594, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.678322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.743668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.743673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.743674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.743675) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:21.743677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:22.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:24.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:25.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:26 np0005603622 nova_compute[221301]: 2026-01-31 07:47:26.340 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:26.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:26 np0005603622 nova_compute[221301]: 2026-01-31 07:47:26.671 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:27.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:28.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:47:29.194 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:47:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:47:29.196 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:47:29 np0005603622 nova_compute[221301]: 2026-01-31 07:47:29.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:29 np0005603622 nova_compute[221301]: 2026-01-31 07:47:29.518 221324 DEBUG nova.virt.libvirt.driver [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 02:47:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:30.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:47:30.840 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:47:30.841 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:47:30.841 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:31 np0005603622 nova_compute[221301]: 2026-01-31 07:47:31.343 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:31.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:31 np0005603622 nova_compute[221301]: 2026-01-31 07:47:31.673 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:47:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:32.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:47:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:33.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:33 np0005603622 nova_compute[221301]: 2026-01-31 07:47:33.537 221324 INFO nova.virt.libvirt.driver [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance shutdown successfully after 14 seconds.#033[00m
Jan 31 02:47:34 np0005603622 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 31 02:47:34 np0005603622 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000e.scope: Consumed 13.649s CPU time.
Jan 31 02:47:34 np0005603622 systemd-machined[190406]: Machine qemu-5-instance-0000000e terminated.
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.355 221324 INFO nova.virt.libvirt.driver [-] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance destroyed successfully.#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.360 221324 DEBUG nova.virt.libvirt.driver [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.360 221324 DEBUG nova.virt.libvirt.driver [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.458 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.460 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.469 221324 INFO nova.compute.rpcapi [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.470 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:47:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:34.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.482 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Acquiring lock "37069dd7-a48f-42ca-8238-bf5baa1fa605-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.482 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Lock "37069dd7-a48f-42ca-8238-bf5baa1fa605-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:34 np0005603622 nova_compute[221301]: 2026-01-31 07:47:34.482 221324 DEBUG oslo_concurrency.lockutils [None req-e45f7415-caeb-40ce-a0d7-77b67d913463 fad82ed81c81456297863bf537d98c83 b3fcea0c21ef4dc8bbc27e3216fc550f - - default default] Lock "37069dd7-a48f-42ca-8238-bf5baa1fa605-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:47:35.198 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:47:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:35.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 31 02:47:36 np0005603622 nova_compute[221301]: 2026-01-31 07:47:36.348 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:36.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:36 np0005603622 nova_compute[221301]: 2026-01-31 07:47:36.674 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:47:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:37.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:47:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:38.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:39.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:40.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:41 np0005603622 nova_compute[221301]: 2026-01-31 07:47:41.354 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:41 np0005603622 nova_compute[221301]: 2026-01-31 07:47:41.380 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "37069dd7-a48f-42ca-8238-bf5baa1fa605" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:41 np0005603622 nova_compute[221301]: 2026-01-31 07:47:41.380 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "37069dd7-a48f-42ca-8238-bf5baa1fa605" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:41 np0005603622 nova_compute[221301]: 2026-01-31 07:47:41.381 221324 DEBUG nova.compute.manager [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 02:47:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:41.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:41 np0005603622 nova_compute[221301]: 2026-01-31 07:47:41.675 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:42.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:42 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 31 02:47:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:42.847888) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:47:42 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 31 02:47:42 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845662847950, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 449, "num_deletes": 251, "total_data_size": 592517, "memory_usage": 602376, "flush_reason": "Manual Compaction"}
Jan 31 02:47:42 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663022163, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 390742, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22665, "largest_seqno": 23109, "table_properties": {"data_size": 388238, "index_size": 604, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6114, "raw_average_key_size": 18, "raw_value_size": 383229, "raw_average_value_size": 1179, "num_data_blocks": 27, "num_entries": 325, "num_filter_entries": 325, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845641, "oldest_key_time": 1769845641, "file_creation_time": 1769845662, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 174328 microseconds, and 2379 cpu microseconds.
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.022226) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 390742 bytes OK
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.022251) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.141390) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.141437) EVENT_LOG_v1 {"time_micros": 1769845663141426, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.141463) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 589747, prev total WAL file size 589747, number of live WAL files 2.
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.142347) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(381KB)], [45(8919KB)]
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663142424, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 9524500, "oldest_snapshot_seqno": -1}
Jan 31 02:47:43 np0005603622 podman[226459]: 2026-01-31 07:47:43.171333378 +0000 UTC m=+0.100978783 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4500 keys, 7496106 bytes, temperature: kUnknown
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663412970, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 7496106, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7466306, "index_size": 17466, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11269, "raw_key_size": 114956, "raw_average_key_size": 25, "raw_value_size": 7385026, "raw_average_value_size": 1641, "num_data_blocks": 707, "num_entries": 4500, "num_filter_entries": 4500, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769845663, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:47:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:43.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.413343) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 7496106 bytes
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.493067) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 35.2 rd, 27.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 8.7 +0.0 blob) out(7.1 +0.0 blob), read-write-amplify(43.6) write-amplify(19.2) OK, records in: 5014, records dropped: 514 output_compression: NoCompression
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.493115) EVENT_LOG_v1 {"time_micros": 1769845663493097, "job": 26, "event": "compaction_finished", "compaction_time_micros": 270649, "compaction_time_cpu_micros": 22881, "output_level": 6, "num_output_files": 1, "total_output_size": 7496106, "num_input_records": 5014, "num_output_records": 4500, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663493435, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845663494863, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.141956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.494978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.494985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.494987) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.494989) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:47:43.494991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:47:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:44.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:45.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:46 np0005603622 nova_compute[221301]: 2026-01-31 07:47:46.246 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "refresh_cache-37069dd7-a48f-42ca-8238-bf5baa1fa605" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:47:46 np0005603622 nova_compute[221301]: 2026-01-31 07:47:46.246 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquired lock "refresh_cache-37069dd7-a48f-42ca-8238-bf5baa1fa605" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:47:46 np0005603622 nova_compute[221301]: 2026-01-31 07:47:46.247 221324 DEBUG nova.network.neutron [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:47:46 np0005603622 nova_compute[221301]: 2026-01-31 07:47:46.247 221324 DEBUG nova.objects.instance [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'info_cache' on Instance uuid 37069dd7-a48f-42ca-8238-bf5baa1fa605 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:46 np0005603622 nova_compute[221301]: 2026-01-31 07:47:46.356 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:46.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:46 np0005603622 nova_compute[221301]: 2026-01-31 07:47:46.677 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:47 np0005603622 nova_compute[221301]: 2026-01-31 07:47:47.292 221324 DEBUG nova.network.neutron [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:47:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:47.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:47 np0005603622 nova_compute[221301]: 2026-01-31 07:47:47.558 221324 DEBUG nova.network.neutron [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:47:47 np0005603622 nova_compute[221301]: 2026-01-31 07:47:47.697 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Releasing lock "refresh_cache-37069dd7-a48f-42ca-8238-bf5baa1fa605" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:47:47 np0005603622 nova_compute[221301]: 2026-01-31 07:47:47.698 221324 DEBUG nova.objects.instance [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'migration_context' on Instance uuid 37069dd7-a48f-42ca-8238-bf5baa1fa605 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:47:48 np0005603622 nova_compute[221301]: 2026-01-31 07:47:48.258 221324 DEBUG nova.storage.rbd_utils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] removing snapshot(nova-resize) on rbd image(37069dd7-a48f-42ca-8238-bf5baa1fa605_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 02:47:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:48.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:49 np0005603622 nova_compute[221301]: 2026-01-31 07:47:49.354 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845654.3536994, 37069dd7-a48f-42ca-8238-bf5baa1fa605 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:47:49 np0005603622 nova_compute[221301]: 2026-01-31 07:47:49.355 221324 INFO nova.compute.manager [-] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:47:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:49.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:49 np0005603622 nova_compute[221301]: 2026-01-31 07:47:49.481 221324 DEBUG nova.compute.manager [None req-c71a58fe-efa9-4b8e-b189-c36ed059a484 - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:47:49 np0005603622 nova_compute[221301]: 2026-01-31 07:47:49.486 221324 DEBUG nova.compute.manager [None req-c71a58fe-efa9-4b8e-b189-c36ed059a484 - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:47:49 np0005603622 nova_compute[221301]: 2026-01-31 07:47:49.845 221324 INFO nova.compute.manager [None req-c71a58fe-efa9-4b8e-b189-c36ed059a484 - - - - - -] [instance: 37069dd7-a48f-42ca-8238-bf5baa1fa605] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 02:47:50 np0005603622 podman[226522]: 2026-01-31 07:47:50.114673467 +0000 UTC m=+0.045542930 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:47:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:50.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:51 np0005603622 nova_compute[221301]: 2026-01-31 07:47:51.379 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:51.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:51 np0005603622 nova_compute[221301]: 2026-01-31 07:47:51.681 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:52.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:53.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:54.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 31 02:47:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:47:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:47:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:55.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:47:56 np0005603622 nova_compute[221301]: 2026-01-31 07:47:56.383 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:56.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:56 np0005603622 nova_compute[221301]: 2026-01-31 07:47:56.681 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:47:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:47:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:57.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:47:57 np0005603622 nova_compute[221301]: 2026-01-31 07:47:57.491 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:47:57 np0005603622 nova_compute[221301]: 2026-01-31 07:47:57.492 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:47:57 np0005603622 nova_compute[221301]: 2026-01-31 07:47:57.567 221324 DEBUG oslo_concurrency.processutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:47:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:47:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/725766275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:47:58 np0005603622 nova_compute[221301]: 2026-01-31 07:47:58.010 221324 DEBUG oslo_concurrency.processutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:47:58 np0005603622 nova_compute[221301]: 2026-01-31 07:47:58.020 221324 DEBUG nova.compute.provider_tree [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:47:58 np0005603622 nova_compute[221301]: 2026-01-31 07:47:58.055 221324 DEBUG nova.scheduler.client.report [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:47:58 np0005603622 nova_compute[221301]: 2026-01-31 07:47:58.115 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:58 np0005603622 nova_compute[221301]: 2026-01-31 07:47:58.233 221324 INFO nova.scheduler.client.report [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Deleted allocation for migration 23b20670-ad55-4ea4-8c1f-01ea26d14cc3#033[00m
Jan 31 02:47:58 np0005603622 nova_compute[221301]: 2026-01-31 07:47:58.280 221324 DEBUG oslo_concurrency.lockutils [None req-fa14c0f3-3102-4199-9eda-2c0ec64e51ab 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "37069dd7-a48f-42ca-8238-bf5baa1fa605" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 16.900s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:47:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:47:58.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:47:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:47:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:47:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:47:59.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:48:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:00.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:48:01 np0005603622 nova_compute[221301]: 2026-01-31 07:48:01.423 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:01.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:01 np0005603622 nova_compute[221301]: 2026-01-31 07:48:01.684 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:02.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:03.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:04.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:05.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:06 np0005603622 nova_compute[221301]: 2026-01-31 07:48:06.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:06.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:06 np0005603622 nova_compute[221301]: 2026-01-31 07:48:06.686 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:07 np0005603622 nova_compute[221301]: 2026-01-31 07:48:07.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:07.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:08 np0005603622 nova_compute[221301]: 2026-01-31 07:48:08.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:08.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.428 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.429 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.429 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.430 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.462 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.463 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.464 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.464 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.464 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:09.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/537984421' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:09 np0005603622 nova_compute[221301]: 2026-01-31 07:48:09.957 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.137 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.140 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5014MB free_disk=20.942729949951172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.140 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.140 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:10 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.355 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.355 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.376 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:10.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3467398552' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.818 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.825 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.892 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.924 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:48:10 np0005603622 nova_compute[221301]: 2026-01-31 07:48:10.924 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:11.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.686 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.892 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.893 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.894 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.894 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.913 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:48:11 np0005603622 nova_compute[221301]: 2026-01-31 07:48:11.914 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:12.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:13 np0005603622 nova_compute[221301]: 2026-01-31 07:48:13.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:48:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:13.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:14 np0005603622 podman[226742]: 2026-01-31 07:48:14.289038861 +0000 UTC m=+0.202328726 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 02:48:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:14.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:48:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:48:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:48:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:15.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:16 np0005603622 nova_compute[221301]: 2026-01-31 07:48:16.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:16.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:16 np0005603622 nova_compute[221301]: 2026-01-31 07:48:16.740 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 31 02:48:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:17.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:17 np0005603622 nova_compute[221301]: 2026-01-31 07:48:17.576 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:17.576 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:48:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:17.578 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:48:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:18.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:19.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:48:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:20.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:48:21 np0005603622 podman[226768]: 2026-01-31 07:48:21.148367462 +0000 UTC m=+0.068206716 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.435 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:21.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.698 221324 DEBUG nova.compute.manager [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.741 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.851 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.852 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.885 221324 DEBUG nova.objects.instance [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'pci_requests' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.920 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.921 221324 INFO nova.compute.claims [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.921 221324 DEBUG nova.objects.instance [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'resources' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:21 np0005603622 nova_compute[221301]: 2026-01-31 07:48:21.952 221324 DEBUG nova.objects.instance [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.038 221324 INFO nova.compute.resource_tracker [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updating resource usage from migration 73e0184c-1180-4ca4-8c3b-7550a25526aa#033[00m
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.039 221324 DEBUG nova.compute.resource_tracker [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Starting to track incoming migration 73e0184c-1180-4ca4-8c3b-7550a25526aa with flavor f75c4aee-d826-4343-a7e3-f06a4b21de52 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.141 221324 DEBUG oslo_concurrency.processutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:22.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2095395212' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.583 221324 DEBUG oslo_concurrency.processutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.590 221324 DEBUG nova.compute.provider_tree [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.607 221324 DEBUG nova.scheduler.client.report [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.634 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:22 np0005603622 nova_compute[221301]: 2026-01-31 07:48:22.635 221324 INFO nova.compute.manager [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Migrating#033[00m
Jan 31 02:48:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:23.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:23 np0005603622 systemd-logind[817]: New session 51 of user nova.
Jan 31 02:48:23 np0005603622 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 02:48:23 np0005603622 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 02:48:23 np0005603622 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 02:48:23 np0005603622 systemd[1]: Starting User Manager for UID 42436...
Jan 31 02:48:23 np0005603622 systemd[226814]: Queued start job for default target Main User Target.
Jan 31 02:48:24 np0005603622 systemd[226814]: Created slice User Application Slice.
Jan 31 02:48:24 np0005603622 systemd[226814]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 02:48:24 np0005603622 systemd[226814]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 02:48:24 np0005603622 systemd[226814]: Reached target Paths.
Jan 31 02:48:24 np0005603622 systemd[226814]: Reached target Timers.
Jan 31 02:48:24 np0005603622 systemd[226814]: Starting D-Bus User Message Bus Socket...
Jan 31 02:48:24 np0005603622 systemd[226814]: Starting Create User's Volatile Files and Directories...
Jan 31 02:48:24 np0005603622 systemd[226814]: Finished Create User's Volatile Files and Directories.
Jan 31 02:48:24 np0005603622 systemd[226814]: Listening on D-Bus User Message Bus Socket.
Jan 31 02:48:24 np0005603622 systemd[226814]: Reached target Sockets.
Jan 31 02:48:24 np0005603622 systemd[226814]: Reached target Basic System.
Jan 31 02:48:24 np0005603622 systemd[226814]: Reached target Main User Target.
Jan 31 02:48:24 np0005603622 systemd[226814]: Startup finished in 170ms.
Jan 31 02:48:24 np0005603622 systemd[1]: Started User Manager for UID 42436.
Jan 31 02:48:24 np0005603622 systemd[1]: Started Session 51 of User nova.
Jan 31 02:48:24 np0005603622 systemd-logind[817]: Session 51 logged out. Waiting for processes to exit.
Jan 31 02:48:24 np0005603622 systemd[1]: session-51.scope: Deactivated successfully.
Jan 31 02:48:24 np0005603622 systemd-logind[817]: Removed session 51.
Jan 31 02:48:24 np0005603622 systemd-logind[817]: New session 53 of user nova.
Jan 31 02:48:24 np0005603622 systemd[1]: Started Session 53 of User nova.
Jan 31 02:48:24 np0005603622 systemd[1]: session-53.scope: Deactivated successfully.
Jan 31 02:48:24 np0005603622 systemd-logind[817]: Session 53 logged out. Waiting for processes to exit.
Jan 31 02:48:24 np0005603622 systemd-logind[817]: Removed session 53.
Jan 31 02:48:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:24.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:48:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:25.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:48:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:25.580 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:26 np0005603622 nova_compute[221301]: 2026-01-31 07:48:26.438 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:26.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:26 np0005603622 nova_compute[221301]: 2026-01-31 07:48:26.742 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:48:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:27.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:48:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:28.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:29.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:48:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:48:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:30.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:30.842 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:30.842 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:30.843 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:31 np0005603622 nova_compute[221301]: 2026-01-31 07:48:31.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:31.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:31 np0005603622 nova_compute[221301]: 2026-01-31 07:48:31.745 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:32.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:33.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:34 np0005603622 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 02:48:34 np0005603622 systemd[226814]: Activating special unit Exit the Session...
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped target Main User Target.
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped target Basic System.
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped target Paths.
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped target Sockets.
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped target Timers.
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 02:48:34 np0005603622 systemd[226814]: Closed D-Bus User Message Bus Socket.
Jan 31 02:48:34 np0005603622 systemd[226814]: Stopped Create User's Volatile Files and Directories.
Jan 31 02:48:34 np0005603622 systemd[226814]: Removed slice User Application Slice.
Jan 31 02:48:34 np0005603622 systemd[226814]: Reached target Shutdown.
Jan 31 02:48:34 np0005603622 systemd[226814]: Finished Exit the Session.
Jan 31 02:48:34 np0005603622 systemd[226814]: Reached target Exit the Session.
Jan 31 02:48:34 np0005603622 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 02:48:34 np0005603622 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 02:48:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:34.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:34 np0005603622 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 02:48:34 np0005603622 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 02:48:34 np0005603622 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 02:48:34 np0005603622 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 02:48:34 np0005603622 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 02:48:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:35.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:36 np0005603622 nova_compute[221301]: 2026-01-31 07:48:36.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:48:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:36.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:48:36 np0005603622 nova_compute[221301]: 2026-01-31 07:48:36.750 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.016 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.017 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.062 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.165 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.166 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.174 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.174 221324 INFO nova.compute.claims [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.333 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:37.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:48:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3625939564' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.780 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.786 221324 DEBUG nova.compute.provider_tree [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.820 221324 DEBUG nova.scheduler.client.report [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.860 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.861 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.916 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.917 221324 DEBUG nova.network.neutron [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.945 221324 INFO nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:48:37 np0005603622 nova_compute[221301]: 2026-01-31 07:48:37.973 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.123 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.125 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.125 221324 INFO nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Creating image(s)#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.167 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.206 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.238 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.242 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.319 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.320 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.321 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.321 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.349 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.352 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:38.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:38 np0005603622 nova_compute[221301]: 2026-01-31 07:48:38.642 221324 DEBUG nova.policy [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '795c7f392cbc45f0885f081449883d42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:48:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:39.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.294 221324 DEBUG nova.network.neutron [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Successfully updated port: 3aff2339-ccc0-4845-8728-4ede26d0c11a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.311 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.311 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquired lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.312 221324 DEBUG nova.network.neutron [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.394 221324 DEBUG nova.compute.manager [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-changed-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.394 221324 DEBUG nova.compute.manager [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Refreshing instance network info cache due to event network-changed-3aff2339-ccc0-4845-8728-4ede26d0c11a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.395 221324 DEBUG oslo_concurrency.lockutils [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:40 np0005603622 nova_compute[221301]: 2026-01-31 07:48:40.542 221324 DEBUG nova.network.neutron [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:48:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:40.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:41 np0005603622 nova_compute[221301]: 2026-01-31 07:48:41.449 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:41.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:41 np0005603622 nova_compute[221301]: 2026-01-31 07:48:41.551 221324 DEBUG nova.network.neutron [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating instance_info_cache with network_info: [{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:48:41 np0005603622 nova_compute[221301]: 2026-01-31 07:48:41.575 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Releasing lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:48:41 np0005603622 nova_compute[221301]: 2026-01-31 07:48:41.575 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Instance network_info: |[{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:48:41 np0005603622 nova_compute[221301]: 2026-01-31 07:48:41.576 221324 DEBUG oslo_concurrency.lockutils [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:41 np0005603622 nova_compute[221301]: 2026-01-31 07:48:41.576 221324 DEBUG nova.network.neutron [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Refreshing network info cache for port 3aff2339-ccc0-4845-8728-4ede26d0c11a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:48:41 np0005603622 nova_compute[221301]: 2026-01-31 07:48:41.789 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:42.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:43 np0005603622 nova_compute[221301]: 2026-01-31 07:48:43.362 221324 DEBUG nova.network.neutron [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updated VIF entry in instance network info cache for port 3aff2339-ccc0-4845-8728-4ede26d0c11a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:48:43 np0005603622 nova_compute[221301]: 2026-01-31 07:48:43.363 221324 DEBUG nova.network.neutron [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating instance_info_cache with network_info: [{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:48:43 np0005603622 nova_compute[221301]: 2026-01-31 07:48:43.381 221324 DEBUG oslo_concurrency.lockutils [req-39b00719-ae5c-4617-962f-0ad5955e400b req-a7d74ed3-226a-49cb-8931-11d1a7f213b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:48:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:43.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:44.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:45 np0005603622 podman[227005]: 2026-01-31 07:48:45.191012482 +0000 UTC m=+0.113064696 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 02:48:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:45.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:46 np0005603622 nova_compute[221301]: 2026-01-31 07:48:46.095 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 7.742s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:46 np0005603622 nova_compute[221301]: 2026-01-31 07:48:46.201 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] resizing rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:48:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:46.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:46 np0005603622 nova_compute[221301]: 2026-01-31 07:48:46.852 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:46 np0005603622 nova_compute[221301]: 2026-01-31 07:48:46.854 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:48:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:47.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:48.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.096 221324 DEBUG nova.objects.instance [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lazy-loading 'migration_context' on Instance uuid 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.112 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.112 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Ensure instance console log exists: /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.113 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.113 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.113 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.116 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Start _get_guest_xml network_info=[{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.121 221324 WARNING nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.129 221324 DEBUG nova.virt.libvirt.host [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.130 221324 DEBUG nova.virt.libvirt.host [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.134 221324 DEBUG nova.virt.libvirt.host [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.135 221324 DEBUG nova.virt.libvirt.host [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.136 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.137 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.137 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.137 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.138 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.138 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.138 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.138 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.139 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.139 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.139 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.139 221324 DEBUG nova.virt.hardware [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.143 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:49.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:48:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/159214635' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.949 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.806s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.982 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:49 np0005603622 nova_compute[221301]: 2026-01-31 07:48:49.986 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:48:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1524367088' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:48:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:50.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.062 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.065 221324 DEBUG nova.virt.libvirt.vif [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1007393486',display_name='tempest-LiveMigrationTest-server-1007393486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1007393486',id=17,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-egj05et1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:48:38Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.066 221324 DEBUG nova.network.os_vif_util [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converting VIF {"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.067 221324 DEBUG nova.network.os_vif_util [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.071 221324 DEBUG nova.objects.instance [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lazy-loading 'pci_devices' on Instance uuid 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.109 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <uuid>14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7</uuid>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <name>instance-00000011</name>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <nova:name>tempest-LiveMigrationTest-server-1007393486</nova:name>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:48:49</nova:creationTime>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:user uuid="795c7f392cbc45f0885f081449883d42">tempest-LiveMigrationTest-126681982-project-member</nova:user>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:project uuid="cbdbb7a4b22a49b68feb3e028bb62fbb">tempest-LiveMigrationTest-126681982</nova:project>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <nova:port uuid="3aff2339-ccc0-4845-8728-4ede26d0c11a">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <entry name="serial">14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7</entry>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <entry name="uuid">14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7</entry>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk.config">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:e0:f2:07"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <target dev="tap3aff2339-cc"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/console.log" append="off"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:48:51 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:48:51 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:48:51 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:48:51 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.112 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Preparing to wait for external event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.113 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.113 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.114 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.115 221324 DEBUG nova.virt.libvirt.vif [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1007393486',display_name='tempest-LiveMigrationTest-server-1007393486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1007393486',id=17,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-egj05et1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:48:38Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.116 221324 DEBUG nova.network.os_vif_util [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converting VIF {"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.118 221324 DEBUG nova.network.os_vif_util [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.119 221324 DEBUG os_vif [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.120 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.121 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.122 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.130 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.130 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aff2339-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.131 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3aff2339-cc, col_values=(('external_ids', {'iface-id': '3aff2339-ccc0-4845-8728-4ede26d0c11a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:f2:07', 'vm-uuid': '14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.133 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:51 np0005603622 NetworkManager[49080]: <info>  [1769845731.1360] manager: (tap3aff2339-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.138 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.142 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.144 221324 INFO os_vif [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc')#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.402 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.402 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.403 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] No VIF found with MAC fa:16:3e:e0:f2:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.403 221324 INFO nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Using config drive#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.434 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:51.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.835 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.880 221324 INFO nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Creating config drive at /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/disk.config#033[00m
Jan 31 02:48:51 np0005603622 nova_compute[221301]: 2026-01-31 07:48:51.887 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx9xv7ljm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:52 np0005603622 nova_compute[221301]: 2026-01-31 07:48:52.008 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx9xv7ljm" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:52 np0005603622 nova_compute[221301]: 2026-01-31 07:48:52.054 221324 DEBUG nova.storage.rbd_utils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] rbd image 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:48:52 np0005603622 nova_compute[221301]: 2026-01-31 07:48:52.060 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/disk.config 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:48:52 np0005603622 podman[227204]: 2026-01-31 07:48:52.181474696 +0000 UTC m=+0.103331263 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:48:52 np0005603622 nova_compute[221301]: 2026-01-31 07:48:52.594 221324 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Creating tmpfile /var/lib/nova/instances/tmphvxots3o to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 31 02:48:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:52.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:52 np0005603622 nova_compute[221301]: 2026-01-31 07:48:52.727 221324 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvxots3o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 31 02:48:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:53.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:53 np0005603622 nova_compute[221301]: 2026-01-31 07:48:53.960 221324 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvxots3o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5e4f7ec6-bb38-4a62-88f4-5e5b869452f0',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.002 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.003 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquired lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.003 221324 DEBUG nova.network.neutron [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:48:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:54.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.844 221324 DEBUG oslo_concurrency.processutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/disk.config 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.784s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.844 221324 INFO nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Deleting local config drive /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7/disk.config because it was imported into RBD.#033[00m
Jan 31 02:48:54 np0005603622 kernel: tap3aff2339-cc: entered promiscuous mode
Jan 31 02:48:54 np0005603622 NetworkManager[49080]: <info>  [1769845734.9004] manager: (tap3aff2339-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.903 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:54Z|00037|binding|INFO|Claiming lport 3aff2339-ccc0-4845-8728-4ede26d0c11a for this chassis.
Jan 31 02:48:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:54Z|00038|binding|INFO|3aff2339-ccc0-4845-8728-4ede26d0c11a: Claiming fa:16:3e:e0:f2:07 10.100.0.11
Jan 31 02:48:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:54Z|00039|binding|INFO|Claiming lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 for this chassis.
Jan 31 02:48:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:54Z|00040|binding|INFO|fc5261b7-0e3f-49d1-8fbf-8dcf40626991: Claiming fa:16:3e:0b:cb:fc 19.80.0.218
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.914 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.927 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:f2:07 10.100.0.11'], port_security=['fa:16:3e:e0:f2:07 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1970562059', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1970562059', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3aff2339-ccc0-4845-8728-4ede26d0c11a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.930 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:cb:fc 19.80.0.218'], port_security=['fa:16:3e:0b:cb:fc 19.80.0.218'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3aff2339-ccc0-4845-8728-4ede26d0c11a'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-427751920', 'neutron:cidrs': '19.80.0.218/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3613479-5299-41cd-b6dd-df1fae2ae862', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-427751920', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0ad390ce-c29b-4af4-b946-e8404e058f9b, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc5261b7-0e3f-49d1-8fbf-8dcf40626991) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.932 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3aff2339-ccc0-4845-8728-4ede26d0c11a in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 bound to our chassis#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.935 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 850ad6ca-6166-4382-94bb-4b7c10d9a136#033[00m
Jan 31 02:48:54 np0005603622 systemd-machined[190406]: New machine qemu-6-instance-00000011.
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.953 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[97ec8a69-24f4-4e30-ad24-8afca1c3197c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.955 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap850ad6ca-61 in ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:48:54 np0005603622 systemd[1]: Started Virtual Machine qemu-6-instance-00000011.
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.957 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap850ad6ca-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.957 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[09478a65-1f39-456a-ada6-3cc468a47f8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.958 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[af851664-3ae9-4755-bf75-73dc7f180ac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:54Z|00041|binding|INFO|Setting lport 3aff2339-ccc0-4845-8728-4ede26d0c11a ovn-installed in OVS
Jan 31 02:48:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:54Z|00042|binding|INFO|Setting lport 3aff2339-ccc0-4845-8728-4ede26d0c11a up in Southbound
Jan 31 02:48:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:54Z|00043|binding|INFO|Setting lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 up in Southbound
Jan 31 02:48:54 np0005603622 nova_compute[221301]: 2026-01-31 07:48:54.970 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.970 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c7373b77-a1ec-4d43-9634-b19dc2f6f9a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:54 np0005603622 systemd-udevd[227258]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:48:54 np0005603622 NetworkManager[49080]: <info>  [1769845734.9879] device (tap3aff2339-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:48:54 np0005603622 NetworkManager[49080]: <info>  [1769845734.9887] device (tap3aff2339-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:54.985 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8beed3e-c599-40ec-a1a1-12725ae33f4c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.029 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb0db62-96d8-4f93-8e8a-dbe92cbfca38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 systemd-udevd[227261]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:48:55 np0005603622 NetworkManager[49080]: <info>  [1769845735.0375] manager: (tap850ad6ca-60): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.036 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6e147473-2c57-4d2f-9276-9f0aaca54219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.068 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4d320d-b60d-4f2b-a731-052334ab0627]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.070 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8e15db52-334e-4067-b5cf-ed8aab788efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 NetworkManager[49080]: <info>  [1769845735.0934] device (tap850ad6ca-60): carrier: link connected
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.100 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[696982ed-b8be-4fc6-b8c1-6ea38777458d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.116 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[db2cc7fa-95e3-4f99-99c2-3149f4f3a58e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498643, 'reachable_time': 40926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227289, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.130 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a173525-5033-4cf1-9c6f-141e79d030d8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:996f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498643, 'tstamp': 498643}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227290, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.144 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1e4599-9e5f-41a8-8426-d1013cbca54b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498643, 'reachable_time': 40926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227291, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.176 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f71d4269-9a7f-4b44-99cb-5f85791625c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.239 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4f2ae9-1911-49f5-94a3-46ecd6e74008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.241 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.242 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.243 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap850ad6ca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:55 np0005603622 NetworkManager[49080]: <info>  [1769845735.2462] manager: (tap850ad6ca-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.245 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:55 np0005603622 kernel: tap850ad6ca-60: entered promiscuous mode
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.248 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.251 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap850ad6ca-60, col_values=(('external_ids', {'iface-id': '61b6889f-b848-4873-9650-8b2715794d29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:55 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:55Z|00044|binding|INFO|Releasing lport 61b6889f-b848-4873-9650-8b2715794d29 from this chassis (sb_readonly=0)
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.252 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.255 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.256 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[81d5b156-c26f-4474-9ed4-3d36366fffb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.257 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.257 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-850ad6ca-6166-4382-94bb-4b7c10d9a136
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 850ad6ca-6166-4382-94bb-4b7c10d9a136
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:48:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:55.258 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'env', 'PROCESS_TAG=haproxy-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/850ad6ca-6166-4382-94bb-4b7c10d9a136.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:48:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.536 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845735.5350327, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.537 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Started (Lifecycle Event)#033[00m
Jan 31 02:48:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:55.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.570 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.576 221324 DEBUG nova.compute.manager [req-9b9e23e7-86ec-41a0-ba09-d2089fe6ebbf req-d51e445f-9fe1-489e-b3a8-22c6edc95622 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.576 221324 DEBUG oslo_concurrency.lockutils [req-9b9e23e7-86ec-41a0-ba09-d2089fe6ebbf req-d51e445f-9fe1-489e-b3a8-22c6edc95622 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.577 221324 DEBUG oslo_concurrency.lockutils [req-9b9e23e7-86ec-41a0-ba09-d2089fe6ebbf req-d51e445f-9fe1-489e-b3a8-22c6edc95622 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.577 221324 DEBUG oslo_concurrency.lockutils [req-9b9e23e7-86ec-41a0-ba09-d2089fe6ebbf req-d51e445f-9fe1-489e-b3a8-22c6edc95622 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.577 221324 DEBUG nova.compute.manager [req-9b9e23e7-86ec-41a0-ba09-d2089fe6ebbf req-d51e445f-9fe1-489e-b3a8-22c6edc95622 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Processing event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.578 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.584 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.586 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845735.5354295, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.586 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.593 221324 INFO nova.virt.libvirt.driver [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Instance spawned successfully.#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.594 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.650 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.660 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845735.5832129, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.660 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.666 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.666 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.667 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.667 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.668 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.668 221324 DEBUG nova.virt.libvirt.driver [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.733 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.736 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:48:55 np0005603622 podman[227363]: 2026-01-31 07:48:55.646639892 +0000 UTC m=+0.028754042 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.787 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.802 221324 INFO nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Took 17.68 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.802 221324 DEBUG nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.882 221324 INFO nova.compute.manager [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Took 18.76 seconds to build instance.#033[00m
Jan 31 02:48:55 np0005603622 nova_compute[221301]: 2026-01-31 07:48:55.902 221324 DEBUG oslo_concurrency.lockutils [None req-2602bf48-dde7-47aa-86c9-ae60fbc326c2 795c7f392cbc45f0885f081449883d42 cbdbb7a4b22a49b68feb3e028bb62fbb - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.135 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.178 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.179 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquired lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.179 221324 DEBUG nova.network.neutron [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.212 221324 DEBUG nova.network.neutron [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating instance_info_cache with network_info: [{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.242 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Releasing lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.244 221324 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvxots3o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5e4f7ec6-bb38-4a62-88f4-5e5b869452f0',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.245 221324 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Creating instance directory: /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.245 221324 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Ensure instance console log exists: /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.246 221324 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.247 221324 DEBUG nova.virt.libvirt.vif [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1707336579',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1707336579',id=16,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-q29px744',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:48:48Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=5e4f7ec6-bb38-4a62-88f4-5e5b869452f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.247 221324 DEBUG nova.network.os_vif_util [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.248 221324 DEBUG nova.network.os_vif_util [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.248 221324 DEBUG os_vif [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.249 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.250 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.251 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:48:56 np0005603622 podman[227363]: 2026-01-31 07:48:56.253289923 +0000 UTC m=+0.635404043 container create ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.262 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.263 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06448a4a-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.263 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06448a4a-18, col_values=(('external_ids', {'iface-id': '06448a4a-1828-42e2-810c-09e0ca21c35f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:f3:77', 'vm-uuid': '5e4f7ec6-bb38-4a62-88f4-5e5b869452f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.265 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:56 np0005603622 NetworkManager[49080]: <info>  [1769845736.2674] manager: (tap06448a4a-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.274 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.275 221324 INFO os_vif [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18')#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.276 221324 DEBUG nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.276 221324 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvxots3o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5e4f7ec6-bb38-4a62-88f4-5e5b869452f0',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.360 221324 DEBUG nova.network.neutron [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:48:56 np0005603622 systemd[1]: Started libpod-conmon-ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04.scope.
Jan 31 02:48:56 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:48:56 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90e6e03bec82b6cdebc825f0973edf4bd1539c33bb3c0a510130eb6736d47f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:48:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:56.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.609 221324 DEBUG nova.network.neutron [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.634 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Releasing lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.728 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.731 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.731 221324 INFO nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Creating image(s)#033[00m
Jan 31 02:48:56 np0005603622 nova_compute[221301]: 2026-01-31 07:48:56.782 221324 DEBUG nova.storage.rbd_utils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] creating snapshot(nova-resize) on rbd image(39aa59fc-0e1c-4a01-860c-a7ff643e442f_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 02:48:56 np0005603622 podman[227363]: 2026-01-31 07:48:56.800679644 +0000 UTC m=+1.182793854 container init ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 02:48:56 np0005603622 podman[227363]: 2026-01-31 07:48:56.80800057 +0000 UTC m=+1.190114740 container start ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:48:56 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[227380]: [NOTICE]   (227409) : New worker (227420) forked
Jan 31 02:48:56 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[227380]: [NOTICE]   (227409) : Loading success.
Jan 31 02:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:56.976 139793 INFO neutron.agent.ovn.metadata.agent [-] Port fc5261b7-0e3f-49d1-8fbf-8dcf40626991 in datapath c3613479-5299-41cd-b6dd-df1fae2ae862 unbound from our chassis#033[00m
Jan 31 02:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:56.979 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3613479-5299-41cd-b6dd-df1fae2ae862#033[00m
Jan 31 02:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:56.989 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cc5047-b4c8-4340-bb8a-ce2a144864ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:56.989 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc3613479-51 in ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:56.991 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc3613479-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:56.991 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe064eeb-9d0a-4b65-8426-1b3e7eaf8904]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:56.992 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4bcf8a-f489-4bb5-bde4-bd21f428f561]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.006 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[51d57570-6fb5-4308-8c40-9ac73dadf0f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.018 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8740e5d1-5947-4892-b730-6b0b8c8521e8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.046 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bc2326-4723-451a-9347-12b739bb52a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 systemd-udevd[227281]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.055 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e5904855-5cf6-4421-97e5-d145fc8de3c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 NetworkManager[49080]: <info>  [1769845737.0609] manager: (tapc3613479-50): new Veth device (/org/freedesktop/NetworkManager/Devices/32)
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.082 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[49531c50-65bb-451a-b152-c8ac7aeceaba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.085 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[93fd0b2c-4e8f-412b-8162-91c39f7f9909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 NetworkManager[49080]: <info>  [1769845737.1004] device (tapc3613479-50): carrier: link connected
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.106 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b84ebb-b15f-491f-8dbc-13f422dc0bfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.119 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae11f87-ddc9-4281-9cd6-c1be2bf8d2e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3613479-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:bf:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498844, 'reachable_time': 19431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227444, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.130 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[de2c6660-f484-4eea-b66d-7eb7af2181d9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee6:bf67'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498844, 'tstamp': 498844}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227445, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.139 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[956eda5a-6a96-4eb5-b8ab-0ef1cd2a4052]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3613479-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:bf:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498844, 'reachable_time': 19431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227446, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.157 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d24f30-dc73-4b77-a9a0-3dd193703c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.202 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bec0f697-aff7-46c1-8fbc-18ffae9c433e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.204 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3613479-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.205 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.206 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3613479-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:57 np0005603622 kernel: tapc3613479-50: entered promiscuous mode
Jan 31 02:48:57 np0005603622 NetworkManager[49080]: <info>  [1769845737.2094] manager: (tapc3613479-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.215 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3613479-50, col_values=(('external_ids', {'iface-id': '9dcf2f9f-4a2b-44f0-988c-28c5222b394c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:48:57 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:57Z|00045|binding|INFO|Releasing lport 9dcf2f9f-4a2b-44f0-988c-28c5222b394c from this chassis (sb_readonly=0)
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.231 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c3613479-5299-41cd-b6dd-df1fae2ae862.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c3613479-5299-41cd-b6dd-df1fae2ae862.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.232 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[162ef6cf-1b02-4536-913b-cc2f2a6ea200]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.233 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c3613479-5299-41cd-b6dd-df1fae2ae862
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c3613479-5299-41cd-b6dd-df1fae2ae862.pid.haproxy
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c3613479-5299-41cd-b6dd-df1fae2ae862
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:48:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:48:57.234 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'env', 'PROCESS_TAG=haproxy-c3613479-5299-41cd-b6dd-df1fae2ae862', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c3613479-5299-41cd-b6dd-df1fae2ae862.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:48:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:57.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:57 np0005603622 podman[227482]: 2026-01-31 07:48:57.626325381 +0000 UTC m=+0.095518915 container create 3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 02:48:57 np0005603622 podman[227482]: 2026-01-31 07:48:57.563462774 +0000 UTC m=+0.032656318 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:48:57 np0005603622 nova_compute[221301]: 2026-01-31 07:48:57.698 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:57 np0005603622 nova_compute[221301]: 2026-01-31 07:48:57.703 221324 DEBUG nova.compute.manager [req-f6105591-b6dc-43de-88c6-a6243d273806 req-3d4fac42-b683-468e-b6f0-12888c8bbba8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:48:57 np0005603622 nova_compute[221301]: 2026-01-31 07:48:57.704 221324 DEBUG oslo_concurrency.lockutils [req-f6105591-b6dc-43de-88c6-a6243d273806 req-3d4fac42-b683-468e-b6f0-12888c8bbba8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:48:57 np0005603622 nova_compute[221301]: 2026-01-31 07:48:57.704 221324 DEBUG oslo_concurrency.lockutils [req-f6105591-b6dc-43de-88c6-a6243d273806 req-3d4fac42-b683-468e-b6f0-12888c8bbba8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:48:57 np0005603622 nova_compute[221301]: 2026-01-31 07:48:57.705 221324 DEBUG oslo_concurrency.lockutils [req-f6105591-b6dc-43de-88c6-a6243d273806 req-3d4fac42-b683-468e-b6f0-12888c8bbba8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:48:57 np0005603622 nova_compute[221301]: 2026-01-31 07:48:57.705 221324 DEBUG nova.compute.manager [req-f6105591-b6dc-43de-88c6-a6243d273806 req-3d4fac42-b683-468e-b6f0-12888c8bbba8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:48:57 np0005603622 nova_compute[221301]: 2026-01-31 07:48:57.705 221324 WARNING nova.compute.manager [req-f6105591-b6dc-43de-88c6-a6243d273806 req-3d4fac42-b683-468e-b6f0-12888c8bbba8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received unexpected event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with vm_state active and task_state None.#033[00m
Jan 31 02:48:57 np0005603622 systemd[1]: Started libpod-conmon-3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e.scope.
Jan 31 02:48:57 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:48:57 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b67c36e8309f05c7693d5cb637da44a0412e7eb6e95ff6d48a06a48a75186672/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:48:57 np0005603622 podman[227482]: 2026-01-31 07:48:57.889243757 +0000 UTC m=+0.358437291 container init 3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 02:48:57 np0005603622 podman[227482]: 2026-01-31 07:48:57.898476395 +0000 UTC m=+0.367669929 container start 3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:48:57 np0005603622 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[227497]: [NOTICE]   (227501) : New worker (227503) forked
Jan 31 02:48:57 np0005603622 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[227497]: [NOTICE]   (227501) : Loading success.
Jan 31 02:48:58 np0005603622 nova_compute[221301]: 2026-01-31 07:48:58.491 221324 DEBUG nova.network.neutron [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Port 06448a4a-1828-42e2-810c-09e0ca21c35f updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 31 02:48:58 np0005603622 nova_compute[221301]: 2026-01-31 07:48:58.494 221324 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvxots3o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5e4f7ec6-bb38-4a62-88f4-5e5b869452f0',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 31 02:48:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:48:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:48:58.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:48:58 np0005603622 systemd[1]: Starting libvirt proxy daemon...
Jan 31 02:48:58 np0005603622 systemd[1]: Started libvirt proxy daemon.
Jan 31 02:48:59 np0005603622 kernel: tap06448a4a-18: entered promiscuous mode
Jan 31 02:48:59 np0005603622 NetworkManager[49080]: <info>  [1769845739.1282] manager: (tap06448a4a-18): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 31 02:48:59 np0005603622 nova_compute[221301]: 2026-01-31 07:48:59.130 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:59 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:59Z|00046|binding|INFO|Claiming lport 06448a4a-1828-42e2-810c-09e0ca21c35f for this additional chassis.
Jan 31 02:48:59 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:59Z|00047|binding|INFO|06448a4a-1828-42e2-810c-09e0ca21c35f: Claiming fa:16:3e:da:f3:77 10.100.0.14
Jan 31 02:48:59 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:59Z|00048|binding|INFO|Claiming lport 3162329c-f03f-465e-9f99-799a29d883a0 for this additional chassis.
Jan 31 02:48:59 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:59Z|00049|binding|INFO|3162329c-f03f-465e-9f99-799a29d883a0: Claiming fa:16:3e:a4:c3:df 19.80.0.151
Jan 31 02:48:59 np0005603622 systemd-machined[190406]: New machine qemu-7-instance-00000010.
Jan 31 02:48:59 np0005603622 systemd-udevd[227545]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:48:59 np0005603622 systemd[1]: Started Virtual Machine qemu-7-instance-00000010.
Jan 31 02:48:59 np0005603622 NetworkManager[49080]: <info>  [1769845739.1853] device (tap06448a4a-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:48:59 np0005603622 NetworkManager[49080]: <info>  [1769845739.1864] device (tap06448a4a-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:48:59 np0005603622 nova_compute[221301]: 2026-01-31 07:48:59.183 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:59 np0005603622 ovn_controller[130109]: 2026-01-31T07:48:59Z|00050|binding|INFO|Setting lport 06448a4a-1828-42e2-810c-09e0ca21c35f ovn-installed in OVS
Jan 31 02:48:59 np0005603622 nova_compute[221301]: 2026-01-31 07:48:59.194 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:48:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:48:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:48:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:48:59.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:00 np0005603622 nova_compute[221301]: 2026-01-31 07:49:00.199 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Check if temp file /var/lib/nova/instances/tmp8ndf3l1m exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 31 02:49:00 np0005603622 nova_compute[221301]: 2026-01-31 07:49:00.201 221324 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ndf3l1m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 31 02:49:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e147 e147: 3 total, 3 up, 3 in
Jan 31 02:49:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:00.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:01 np0005603622 nova_compute[221301]: 2026-01-31 07:49:01.074 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845741.0737638, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:01 np0005603622 nova_compute[221301]: 2026-01-31 07:49:01.074 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Started (Lifecycle Event)#033[00m
Jan 31 02:49:01 np0005603622 nova_compute[221301]: 2026-01-31 07:49:01.130 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:01 np0005603622 nova_compute[221301]: 2026-01-31 07:49:01.265 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:01.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:01 np0005603622 nova_compute[221301]: 2026-01-31 07:49:01.870 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:02 np0005603622 nova_compute[221301]: 2026-01-31 07:49:02.439 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845742.4388835, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:02 np0005603622 nova_compute[221301]: 2026-01-31 07:49:02.440 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:49:02 np0005603622 nova_compute[221301]: 2026-01-31 07:49:02.471 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:02 np0005603622 nova_compute[221301]: 2026-01-31 07:49:02.476 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:49:02 np0005603622 nova_compute[221301]: 2026-01-31 07:49:02.510 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 02:49:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:02.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:02 np0005603622 nova_compute[221301]: 2026-01-31 07:49:02.955 221324 DEBUG nova.objects.instance [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.195 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.196 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Ensure instance console log exists: /var/lib/nova/instances/39aa59fc-0e1c-4a01-860c-a7ff643e442f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.198 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.198 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.198 221324 DEBUG oslo_concurrency.lockutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.200 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.205 221324 WARNING nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.212 221324 DEBUG nova.virt.libvirt.host [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.213 221324 DEBUG nova.virt.libvirt.host [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.216 221324 DEBUG nova.virt.libvirt.host [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.217 221324 DEBUG nova.virt.libvirt.host [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.218 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.218 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.219 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.219 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.219 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.219 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.219 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.220 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.220 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.220 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.220 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.220 221324 DEBUG nova.virt.hardware [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.221 221324 DEBUG nova.objects.instance [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.244 221324 DEBUG oslo_concurrency.processutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:03.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.691 221324 DEBUG nova.compute.manager [req-5df459c0-7325-4244-9ab0-9c15d9fda280 req-afd1c812-1717-4676-b2c3-fe36be55c589 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.691 221324 DEBUG oslo_concurrency.lockutils [req-5df459c0-7325-4244-9ab0-9c15d9fda280 req-afd1c812-1717-4676-b2c3-fe36be55c589 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.691 221324 DEBUG oslo_concurrency.lockutils [req-5df459c0-7325-4244-9ab0-9c15d9fda280 req-afd1c812-1717-4676-b2c3-fe36be55c589 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.692 221324 DEBUG oslo_concurrency.lockutils [req-5df459c0-7325-4244-9ab0-9c15d9fda280 req-afd1c812-1717-4676-b2c3-fe36be55c589 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.692 221324 DEBUG nova.compute.manager [req-5df459c0-7325-4244-9ab0-9c15d9fda280 req-afd1c812-1717-4676-b2c3-fe36be55c589 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.692 221324 DEBUG nova.compute.manager [req-5df459c0-7325-4244-9ab0-9c15d9fda280 req-afd1c812-1717-4676-b2c3-fe36be55c589 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:49:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:49:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2957560768' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.718 221324 DEBUG oslo_concurrency.processutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:03 np0005603622 nova_compute[221301]: 2026-01-31 07:49:03.784 221324 DEBUG oslo_concurrency.processutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:49:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2262768545' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:49:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:04.592 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:04 np0005603622 nova_compute[221301]: 2026-01-31 07:49:04.592 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:04.594 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:49:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:04.595 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:04.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:04 np0005603622 nova_compute[221301]: 2026-01-31 07:49:04.762 221324 DEBUG oslo_concurrency.processutils [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.978s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:04 np0005603622 nova_compute[221301]: 2026-01-31 07:49:04.766 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <uuid>39aa59fc-0e1c-4a01-860c-a7ff643e442f</uuid>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <name>instance-0000000f</name>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <memory>196608</memory>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <nova:name>tempest-MigrationsAdminTest-server-426382963</nova:name>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:49:03</nova:creationTime>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.micro">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <nova:memory>192</nova:memory>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <nova:user uuid="8a59efd78e244f44a1c70650f82a2c50">tempest-MigrationsAdminTest-1820348317-project-member</nova:user>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <nova:project uuid="1627a71b855b4032b51e234e44a9d570">tempest-MigrationsAdminTest-1820348317</nova:project>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <entry name="serial">39aa59fc-0e1c-4a01-860c-a7ff643e442f</entry>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <entry name="uuid">39aa59fc-0e1c-4a01-860c-a7ff643e442f</entry>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/39aa59fc-0e1c-4a01-860c-a7ff643e442f_disk">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/39aa59fc-0e1c-4a01-860c-a7ff643e442f_disk.config">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/39aa59fc-0e1c-4a01-860c-a7ff643e442f/console.log" append="off"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:49:04 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:49:04 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:49:04 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:49:04 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:49:04 np0005603622 nova_compute[221301]: 2026-01-31 07:49:04.917 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:49:04 np0005603622 nova_compute[221301]: 2026-01-31 07:49:04.917 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:49:04 np0005603622 nova_compute[221301]: 2026-01-31 07:49:04.918 221324 INFO nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Using config drive#033[00m
Jan 31 02:49:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:05Z|00051|binding|INFO|Claiming lport 06448a4a-1828-42e2-810c-09e0ca21c35f for this chassis.
Jan 31 02:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:05Z|00052|binding|INFO|06448a4a-1828-42e2-810c-09e0ca21c35f: Claiming fa:16:3e:da:f3:77 10.100.0.14
Jan 31 02:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:05Z|00053|binding|INFO|Claiming lport 3162329c-f03f-465e-9f99-799a29d883a0 for this chassis.
Jan 31 02:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:05Z|00054|binding|INFO|3162329c-f03f-465e-9f99-799a29d883a0: Claiming fa:16:3e:a4:c3:df 19.80.0.151
Jan 31 02:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:05Z|00055|binding|INFO|Setting lport 06448a4a-1828-42e2-810c-09e0ca21c35f up in Southbound
Jan 31 02:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:05Z|00056|binding|INFO|Setting lport 3162329c-f03f-465e-9f99-799a29d883a0 up in Southbound
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.546 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:f3:77 10.100.0.14'], port_security=['fa:16:3e:da:f3:77 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-116143868', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5e4f7ec6-bb38-4a62-88f4-5e5b869452f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-116143868', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=06448a4a-1828-42e2-810c-09e0ca21c35f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.548 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:c3:df 19.80.0.151'], port_security=['fa:16:3e:a4:c3:df 19.80.0.151'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['06448a4a-1828-42e2-810c-09e0ca21c35f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1877730947', 'neutron:cidrs': '19.80.0.151/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1877730947', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0d46eecb-5024-425b-affd-165dd8eae0e4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3162329c-f03f-465e-9f99-799a29d883a0) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.548 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 06448a4a-1828-42e2-810c-09e0ca21c35f in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d bound to our chassis#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.550 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.557 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6a45b63a-d7b9-49bc-bf9e-75d0478e6eba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.558 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap60bb4bea-d1 in ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.559 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap60bb4bea-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.559 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b56c6dc5-610a-4d94-8d0e-83e8cdbd4d80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.560 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8678d08b-fac8-4ce4-ba0e-f5fd44e2de8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:05.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.568 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a1a865-6764-4537-8623-c666b99f6016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.581 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8539539d-d353-4815-ad2a-9799d02cfb76]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.609 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a924ff21-ab3f-4a96-8f2c-9bbc5c93579f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.625 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d53c2d44-ad0f-4ffd-bcf8-204411016fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 NetworkManager[49080]: <info>  [1769845745.6269] manager: (tap60bb4bea-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Jan 31 02:49:05 np0005603622 systemd-udevd[227719]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.653 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[523b0b07-d095-4081-87fa-21b1eb8d1fb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.659 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[da1a9f9d-751c-4ffb-b7c5-fa7bd076c6c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 NetworkManager[49080]: <info>  [1769845745.6829] device (tap60bb4bea-d0): carrier: link connected
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.684 221324 INFO nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Took 4.66 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.685 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[35b7e03a-7416-4d1f-9ad1-ebfd1e10ce41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.687 221324 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.698 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f80275fb-e004-48bb-be93-d9deb9bd45a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499702, 'reachable_time': 37410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227740, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.709 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c221236-e28b-4013-8eaa-2209b5421c5e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:b1c0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499702, 'tstamp': 499702}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227741, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.710 221324 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ndf3l1m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9755ad61-6df1-4456-8860-6bd237ad62cb),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.722 221324 DEBUG nova.objects.instance [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lazy-loading 'migration_context' on Instance uuid 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.723 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.723 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[111f9d94-feb3-4c97-9e27-e5489b60f484]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499702, 'reachable_time': 37410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227745, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.726 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.726 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.745 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[878ab1b7-316c-417f-9276-4a3c20f89969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.747 221324 INFO nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Post operation of migration started#033[00m
Jan 31 02:49:05 np0005603622 systemd-machined[190406]: New machine qemu-8-instance-0000000f.
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.757 221324 DEBUG nova.virt.libvirt.vif [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1007393486',display_name='tempest-LiveMigrationTest-server-1007393486',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1007393486',id=17,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-egj05et1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:48:55Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.758 221324 DEBUG nova.network.os_vif_util [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.758 221324 DEBUG nova.network.os_vif_util [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.759 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating guest XML with vif config: <interface type="ethernet">
Jan 31 02:49:05 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:e0:f2:07"/>
Jan 31 02:49:05 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 02:49:05 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:49:05 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 02:49:05 np0005603622 nova_compute[221301]:  <target dev="tap3aff2339-cc"/>
Jan 31 02:49:05 np0005603622 nova_compute[221301]: </interface>
Jan 31 02:49:05 np0005603622 nova_compute[221301]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.759 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 31 02:49:05 np0005603622 systemd[1]: Started Virtual Machine qemu-8-instance-0000000f.
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.789 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8a8bcaa3-aff0-4e60-ad57-8b76be70e673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.790 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.791 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.791 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60bb4bea-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.793 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:05 np0005603622 NetworkManager[49080]: <info>  [1769845745.7947] manager: (tap60bb4bea-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 31 02:49:05 np0005603622 kernel: tap60bb4bea-d0: entered promiscuous mode
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.796 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.797 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap60bb4bea-d0, col_values=(('external_ids', {'iface-id': 'eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:05Z|00057|binding|INFO|Releasing lport eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8 from this chassis (sb_readonly=0)
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.805 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.807 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.808 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[71fc2c50-bfe1-4864-b5ee-ea9a14a3d5e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.809 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.pid.haproxy
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:05.811 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'env', 'PROCESS_TAG=haproxy-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/60bb4bea-d9f0-41fc-9c0f-6fcd644c255d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.814 221324 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.815 221324 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.815 221324 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.815 221324 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.815 221324 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.815 221324 WARNING nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received unexpected event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.816 221324 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-changed-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.816 221324 DEBUG nova.compute.manager [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Refreshing instance network info cache due to event network-changed-3aff2339-ccc0-4845-8728-4ede26d0c11a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.816 221324 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.816 221324 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:05 np0005603622 nova_compute[221301]: 2026-01-31 07:49:05.816 221324 DEBUG nova.network.neutron [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Refreshing network info cache for port 3aff2339-ccc0-4845-8728-4ede26d0c11a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.157 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.158 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquired lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.158 221324 DEBUG nova.network.neutron [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.229 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.229 221324 INFO nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 31 02:49:06 np0005603622 podman[227789]: 2026-01-31 07:49:06.257417464 +0000 UTC m=+0.102892532 container create a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.267 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:06 np0005603622 podman[227789]: 2026-01-31 07:49:06.187831247 +0000 UTC m=+0.033306415 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.304 221324 INFO nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 31 02:49:06 np0005603622 systemd[1]: Started libpod-conmon-a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8.scope.
Jan 31 02:49:06 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:49:06 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a6e43addc5cd709b5860f561fd40481c8ea93defd169cb5e93efc5ff73ae17f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:49:06 np0005603622 podman[227789]: 2026-01-31 07:49:06.387799764 +0000 UTC m=+0.233274912 container init a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:49:06 np0005603622 podman[227789]: 2026-01-31 07:49:06.392996153 +0000 UTC m=+0.238471261 container start a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 02:49:06 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [NOTICE]   (227806) : New worker (227808) forked
Jan 31 02:49:06 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [NOTICE]   (227806) : Loading success.
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.449 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3162329c-f03f-465e-9f99-799a29d883a0 in datapath 057fed11-d4e4-4c56-8ba3-81a6235ed1bf unbound from our chassis#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.451 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 057fed11-d4e4-4c56-8ba3-81a6235ed1bf#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.456 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aa081382-5332-41da-a540-8fcde9212bfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.457 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap057fed11-d1 in ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.458 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap057fed11-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.458 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[84c95ec5-b4bd-46c6-aebd-911df543a40b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.459 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f65190df-1b9e-4f7e-8aab-8952726e4d62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.468 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[05cfb2fb-3590-4a60-9364-20935a71d726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.486 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[466e7dc5-a6c0-4c32-aec4-978e691f044f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.508 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[01ee8022-5471-43bd-be1e-0a9123079604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.512 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[602763db-79da-473b-8090-8ca6f67eb10d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 systemd-udevd[227732]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:49:06 np0005603622 NetworkManager[49080]: <info>  [1769845746.5145] manager: (tap057fed11-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.541 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[41039625-6e80-4c6d-a69e-2851ac9d463e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.544 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f3187948-b74d-44a8-b184-2cd403943d57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 NetworkManager[49080]: <info>  [1769845746.5562] device (tap057fed11-d0): carrier: link connected
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.561 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca32772-c577-48a7-9bf0-d27d654a656c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.573 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7afffce9-65b8-4556-b78b-ac5f8c3c4fd1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap057fed11-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:6a:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499789, 'reachable_time': 26945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227845, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.585 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[103cb8bb-fb74-4bdd-95fd-bbd09ccdb352]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:6a78'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499789, 'tstamp': 499789}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227846, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.598 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7d3d06-1bdf-46e4-b136-cbecaaebabe2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap057fed11-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:6a:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499789, 'reachable_time': 26945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227847, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:06.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.633 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[16b2ce78-ba1a-4f2a-8b9f-7a7f4edf5d72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.676 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9c02df-63d9-4b15-bfc8-e2f01bb9c59f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.678 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap057fed11-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.678 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.678 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap057fed11-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:06 np0005603622 kernel: tap057fed11-d0: entered promiscuous mode
Jan 31 02:49:06 np0005603622 NetworkManager[49080]: <info>  [1769845746.6813] manager: (tap057fed11-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.681 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.682 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap057fed11-d0, col_values=(('external_ids', {'iface-id': '1e50dcb0-cab9-4443-a53d-151822a6eb9a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:06 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:06Z|00058|binding|INFO|Releasing lport 1e50dcb0-cab9-4443-a53d-151822a6eb9a from this chassis (sb_readonly=0)
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.689 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.690 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.691 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[68fa0827-0f0c-4d60-bd8c-06e0c5acaa49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.691 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-057fed11-d4e4-4c56-8ba3-81a6235ed1bf
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.pid.haproxy
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 057fed11-d4e4-4c56-8ba3-81a6235ed1bf
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:49:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:06.692 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'env', 'PROCESS_TAG=haproxy-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/057fed11-d4e4-4c56-8ba3-81a6235ed1bf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.813 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.814 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:06 np0005603622 nova_compute[221301]: 2026-01-31 07:49:06.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:07 np0005603622 podman[227879]: 2026-01-31 07:49:07.083634428 +0000 UTC m=+0.080109921 container create 832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:49:07 np0005603622 systemd[1]: Started libpod-conmon-832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09.scope.
Jan 31 02:49:07 np0005603622 podman[227879]: 2026-01-31 07:49:07.039148114 +0000 UTC m=+0.035623607 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:49:07 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:49:07 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5fd6e559bdff01cbd8e6f5c9eeb2b51f2123f2f13e65bda8f8b234fca728bf9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:49:07 np0005603622 podman[227879]: 2026-01-31 07:49:07.189614052 +0000 UTC m=+0.186089615 container init 832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:49:07 np0005603622 podman[227879]: 2026-01-31 07:49:07.193827155 +0000 UTC m=+0.190302668 container start 832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:49:07 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [NOTICE]   (227898) : New worker (227900) forked
Jan 31 02:49:07 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [NOTICE]   (227898) : Loading success.
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.218 221324 DEBUG nova.network.neutron [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updated VIF entry in instance network info cache for port 3aff2339-ccc0-4845-8728-4ede26d0c11a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.220 221324 DEBUG nova.network.neutron [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating instance_info_cache with network_info: [{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.241 221324 DEBUG oslo_concurrency.lockutils [req-1310368d-004a-4098-870a-0ec4120c3476 req-87c1d382-7a2b-4f39-bc23-e1e0d1d36269 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.318 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.319 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.424 221324 DEBUG nova.network.neutron [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating instance_info_cache with network_info: [{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.465 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Releasing lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.482 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.482 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.483 221324 DEBUG oslo_concurrency.lockutils [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.488 221324 INFO nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 31 02:49:07 np0005603622 virtqemud[220925]: Domain id=7 name='instance-00000010' uuid=5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 is tainted: custom-monitor
Jan 31 02:49:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:07.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.821 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:07 np0005603622 nova_compute[221301]: 2026-01-31 07:49:07.822 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.496 221324 INFO nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.618 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845748.6176448, 39aa59fc-0e1c-4a01-860c-a7ff643e442f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.620 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:49:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:08.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.622 221324 DEBUG nova.compute.manager [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.626 221324 INFO nova.virt.libvirt.driver [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance running successfully.#033[00m
Jan 31 02:49:08 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.629 221324 DEBUG nova.virt.libvirt.guest [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.629 221324 DEBUG nova.virt.libvirt.driver [None req-1b02303c-227a-4379-a54f-519c12231a78 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.645 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.649 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.703 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.704 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845748.6215937, 39aa59fc-0e1c-4a01-860c-a7ff643e442f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.704 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] VM Started (Lifecycle Event)#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.953 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:08 np0005603622 nova_compute[221301]: 2026-01-31 07:49:08.954 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.078 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.084 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.122 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845748.9539382, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.122 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.137 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.142 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.159 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.457 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.458 221324 DEBUG nova.virt.libvirt.migration [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.504 221324 INFO nova.virt.libvirt.driver [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.509 221324 DEBUG nova.compute.manager [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:09 np0005603622 nova_compute[221301]: 2026-01-31 07:49:09.530 221324 DEBUG nova.objects.instance [None req-754226c5-5b2e-412e-898a-6428b72a43fb 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:49:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:09.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:10 np0005603622 kernel: tap3aff2339-cc (unregistering): left promiscuous mode
Jan 31 02:49:10 np0005603622 NetworkManager[49080]: <info>  [1769845750.2997] device (tap3aff2339-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.312 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00059|binding|INFO|Releasing lport 3aff2339-ccc0-4845-8728-4ede26d0c11a from this chassis (sb_readonly=0)
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00060|binding|INFO|Setting lport 3aff2339-ccc0-4845-8728-4ede26d0c11a down in Southbound
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00061|binding|INFO|Releasing lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 from this chassis (sb_readonly=0)
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00062|binding|INFO|Setting lport fc5261b7-0e3f-49d1-8fbf-8dcf40626991 down in Southbound
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00063|binding|INFO|Removing iface tap3aff2339-cc ovn-installed in OVS
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.316 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00064|binding|INFO|Releasing lport eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8 from this chassis (sb_readonly=0)
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00065|binding|INFO|Releasing lport 1e50dcb0-cab9-4443-a53d-151822a6eb9a from this chassis (sb_readonly=0)
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00066|binding|INFO|Releasing lport 61b6889f-b848-4873-9650-8b2715794d29 from this chassis (sb_readonly=0)
Jan 31 02:49:10 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:10Z|00067|binding|INFO|Releasing lport 9dcf2f9f-4a2b-44f0-988c-28c5222b394c from this chassis (sb_readonly=0)
Jan 31 02:49:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:10.322 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:f2:07 10.100.0.11'], port_security=['fa:16:3e:e0:f2:07 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7ec8bf38-9571-4400-a85c-6bd5ac54bdf3'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1970562059', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1970562059', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3aff2339-ccc0-4845-8728-4ede26d0c11a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:10.325 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:cb:fc 19.80.0.218'], port_security=['fa:16:3e:0b:cb:fc 19.80.0.218'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3aff2339-ccc0-4845-8728-4ede26d0c11a'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-427751920', 'neutron:cidrs': '19.80.0.218/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3613479-5299-41cd-b6dd-df1fae2ae862', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-427751920', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0ad390ce-c29b-4af4-b946-e8404e058f9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc5261b7-0e3f-49d1-8fbf-8dcf40626991) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:10.326 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3aff2339-ccc0-4845-8728-4ede26d0c11a in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 unbound from our chassis#033[00m
Jan 31 02:49:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:10.328 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 850ad6ca-6166-4382-94bb-4b7c10d9a136, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:49:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:10.329 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1a3476-a804-44da-88eb-5a8ffc3a777c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:10.330 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 namespace which is not needed anymore#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.345 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.361 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:10 np0005603622 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 31 02:49:10 np0005603622 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000011.scope: Consumed 12.118s CPU time.
Jan 31 02:49:10 np0005603622 systemd-machined[190406]: Machine qemu-6-instance-00000011 terminated.
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:49:10 np0005603622 virtqemud[220925]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk: No such file or directory
Jan 31 02:49:10 np0005603622 virtqemud[220925]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_disk: No such file or directory
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.517 221324 DEBUG nova.virt.libvirt.guest [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.518 221324 INFO nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Migration operation has completed#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.518 221324 INFO nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] _post_live_migration() is started..#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.526 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.527 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 31 02:49:10 np0005603622 nova_compute[221301]: 2026-01-31 07:49:10.527 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 31 02:49:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:10 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[227380]: [NOTICE]   (227409) : haproxy version is 2.8.14-c23fe91
Jan 31 02:49:10 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[227380]: [NOTICE]   (227409) : path to executable is /usr/sbin/haproxy
Jan 31 02:49:10 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[227380]: [WARNING]  (227409) : Exiting Master process...
Jan 31 02:49:10 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[227380]: [ALERT]    (227409) : Current worker (227420) exited with code 143 (Terminated)
Jan 31 02:49:10 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[227380]: [WARNING]  (227409) : All workers exited. Exiting... (0)
Jan 31 02:49:10 np0005603622 systemd[1]: libpod-ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04.scope: Deactivated successfully.
Jan 31 02:49:10 np0005603622 conmon[227380]: conmon ab5b8146732a6c91f52e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04.scope/container/memory.events
Jan 31 02:49:10 np0005603622 podman[227958]: 2026-01-31 07:49:10.5710725 +0000 UTC m=+0.149789811 container died ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:49:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:10.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04-userdata-shm.mount: Deactivated successfully.
Jan 31 02:49:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay-a90e6e03bec82b6cdebc825f0973edf4bd1539c33bb3c0a510130eb6736d47f4-merged.mount: Deactivated successfully.
Jan 31 02:49:11 np0005603622 podman[227958]: 2026-01-31 07:49:11.046338165 +0000 UTC m=+0.625055476 container cleanup ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:49:11 np0005603622 systemd[1]: libpod-conmon-ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04.scope: Deactivated successfully.
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.308 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:11 np0005603622 podman[228001]: 2026-01-31 07:49:11.321958952 +0000 UTC m=+0.254626034 container remove ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.327 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0567423-0abf-4af6-af87-0907eb4c529f]: (4, ('Sat Jan 31 07:49:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 (ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04)\nab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04\nSat Jan 31 07:49:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 (ab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04)\nab5b8146732a6c91f52e7e5e882c1b4d583a9e202a3f1eec0bc316ccc9d19b04\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.329 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[63febda5-5eb0-4ca2-80df-e957d3b77161]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.330 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:11 np0005603622 kernel: tap850ad6ca-60: left promiscuous mode
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.346 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.350 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e44a2d78-6733-4977-b160-36d310e9add9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.367 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3d00439c-8e5b-4233-b8b7-00a91824f029]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.368 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0a21d8eb-e75d-469a-8ba8-683b9b6bc44b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.385 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cb96bfe5-4955-45d8-ac35-29d9bbd79aa6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498636, 'reachable_time': 39995, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228020, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 systemd[1]: run-netns-ovnmeta\x2d850ad6ca\x2d6166\x2d4382\x2d94bb\x2d4b7c10d9a136.mount: Deactivated successfully.
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.390 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.391 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[a36638cf-ec98-4768-bc6e-3a30fe601032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.392 139793 INFO neutron.agent.ovn.metadata.agent [-] Port fc5261b7-0e3f-49d1-8fbf-8dcf40626991 in datapath c3613479-5299-41cd-b6dd-df1fae2ae862 unbound from our chassis#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.393 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3613479-5299-41cd-b6dd-df1fae2ae862, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.394 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c134305f-7b04-4175-b120-2824ac80f6b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:11.394 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 namespace which is not needed anymore#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.422 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.422 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.422 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.423 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.527 221324 DEBUG nova.compute.manager [req-89eda456-4053-43ca-90b4-17b97056a6fc req-ab555c60-bdff-4152-9485-3fde8880c04b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.528 221324 DEBUG oslo_concurrency.lockutils [req-89eda456-4053-43ca-90b4-17b97056a6fc req-ab555c60-bdff-4152-9485-3fde8880c04b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.528 221324 DEBUG oslo_concurrency.lockutils [req-89eda456-4053-43ca-90b4-17b97056a6fc req-ab555c60-bdff-4152-9485-3fde8880c04b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.528 221324 DEBUG oslo_concurrency.lockutils [req-89eda456-4053-43ca-90b4-17b97056a6fc req-ab555c60-bdff-4152-9485-3fde8880c04b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.529 221324 DEBUG nova.compute.manager [req-89eda456-4053-43ca-90b4-17b97056a6fc req-ab555c60-bdff-4152-9485-3fde8880c04b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.529 221324 DEBUG nova.compute.manager [req-89eda456-4053-43ca-90b4-17b97056a6fc req-ab555c60-bdff-4152-9485-3fde8880c04b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-unplugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:49:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:11.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:11 np0005603622 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[227497]: [NOTICE]   (227501) : haproxy version is 2.8.14-c23fe91
Jan 31 02:49:11 np0005603622 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[227497]: [NOTICE]   (227501) : path to executable is /usr/sbin/haproxy
Jan 31 02:49:11 np0005603622 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[227497]: [WARNING]  (227501) : Exiting Master process...
Jan 31 02:49:11 np0005603622 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[227497]: [ALERT]    (227501) : Current worker (227503) exited with code 143 (Terminated)
Jan 31 02:49:11 np0005603622 neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862[227497]: [WARNING]  (227501) : All workers exited. Exiting... (0)
Jan 31 02:49:11 np0005603622 systemd[1]: libpod-3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e.scope: Deactivated successfully.
Jan 31 02:49:11 np0005603622 podman[228038]: 2026-01-31 07:49:11.659624823 +0000 UTC m=+0.178568753 container died 3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 02:49:11 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e-userdata-shm.mount: Deactivated successfully.
Jan 31 02:49:11 np0005603622 systemd[1]: var-lib-containers-storage-overlay-b67c36e8309f05c7693d5cb637da44a0412e7eb6e95ff6d48a06a48a75186672-merged.mount: Deactivated successfully.
Jan 31 02:49:11 np0005603622 nova_compute[221301]: 2026-01-31 07:49:11.875 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:12 np0005603622 podman[228038]: 2026-01-31 07:49:12.223448455 +0000 UTC m=+0.742392415 container cleanup 3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 02:49:12 np0005603622 systemd[1]: libpod-conmon-3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e.scope: Deactivated successfully.
Jan 31 02:49:12 np0005603622 nova_compute[221301]: 2026-01-31 07:49:12.413 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:49:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:12.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:12 np0005603622 podman[228070]: 2026-01-31 07:49:12.798829046 +0000 UTC m=+0.548448919 container remove 3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.805 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ce5a56-9e20-4e92-b907-adf63308d849]: (4, ('Sat Jan 31 07:49:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 (3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e)\n3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e\nSat Jan 31 07:49:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 (3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e)\n3916d1b9a815a6c33afc223ba8337841f6922c19f04be3cf4b1fe0101383846e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.808 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[556018c3-66af-4362-8bec-703c2fd6aa0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.810 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3613479-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:12 np0005603622 kernel: tapc3613479-50: left promiscuous mode
Jan 31 02:49:12 np0005603622 nova_compute[221301]: 2026-01-31 07:49:12.817 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:12 np0005603622 nova_compute[221301]: 2026-01-31 07:49:12.824 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.825 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[04bb3e0a-5c6b-4ec5-b39e-b9475412660c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.838 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[974e3cbd-36ad-4cc3-8cd1-b36a1aee160c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.839 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8df58c0c-9980-4428-8fa7-86ad47ca85c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.854 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[06d294dc-4300-40b8-8cc5-f24c9a787b61]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498838, 'reachable_time': 43386, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228089, 'error': None, 'target': 'ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:12 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc3613479\x2d5299\x2d41cd\x2db6dd\x2ddf1fae2ae862.mount: Deactivated successfully.
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.860 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c3613479-5299-41cd-b6dd-df1fae2ae862 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:49:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:12.860 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[29c36b00-b081-4b69-ad3b-3f1efb4833f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:49:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:13.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:13Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:da:f3:77 10.100.0.14
Jan 31 02:49:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:13Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:da:f3:77 10.100.0.14
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.418 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.442 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.443 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.443 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.444 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.444 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.444 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.479 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.479 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.480 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.480 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.481 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.542 221324 DEBUG nova.compute.manager [req-dbc8f2cf-dbdd-4fbc-bc05-738768914399 req-4f840228-f2e9-4660-b136-fda227ea0bcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.543 221324 DEBUG oslo_concurrency.lockutils [req-dbc8f2cf-dbdd-4fbc-bc05-738768914399 req-4f840228-f2e9-4660-b136-fda227ea0bcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.543 221324 DEBUG oslo_concurrency.lockutils [req-dbc8f2cf-dbdd-4fbc-bc05-738768914399 req-4f840228-f2e9-4660-b136-fda227ea0bcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.543 221324 DEBUG oslo_concurrency.lockutils [req-dbc8f2cf-dbdd-4fbc-bc05-738768914399 req-4f840228-f2e9-4660-b136-fda227ea0bcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.544 221324 DEBUG nova.compute.manager [req-dbc8f2cf-dbdd-4fbc-bc05-738768914399 req-4f840228-f2e9-4660-b136-fda227ea0bcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:49:14 np0005603622 nova_compute[221301]: 2026-01-31 07:49:14.544 221324 WARNING nova.compute.manager [req-dbc8f2cf-dbdd-4fbc-bc05-738768914399 req-4f840228-f2e9-4660-b136-fda227ea0bcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received unexpected event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:49:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:14.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.034 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.554s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.130 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.131 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.139 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.139 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.294 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.296 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4579MB free_disk=20.814437866210938GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.296 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.296 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.380 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Applying migration context for instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f as it has an incoming, in-progress migration 73e0184c-1180-4ca4-8c3b-7550a25526aa. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.412 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updating resource usage from migration 73e0184c-1180-4ca4-8c3b-7550a25526aa#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.413 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Updating resource usage from migration 9755ad61-6df1-4456-8860-6bd237ad62cb#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.482 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.483 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.483 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration 9755ad61-6df1-4456-8860-6bd237ad62cb is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.484 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.484 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:49:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:15 np0005603622 nova_compute[221301]: 2026-01-31 07:49:15.573 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:15.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/514551655' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:16 np0005603622 podman[228133]: 2026-01-31 07:49:16.180454119 +0000 UTC m=+0.097683842 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.190 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.195 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.218 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.247 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.248 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.340 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:16.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.651 221324 DEBUG nova.compute.manager [req-02ad4a1e-8a90-4574-8cdf-25e97ea5e9aa req-ca09cb4f-31a1-4462-a63c-fc413970b6a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.653 221324 DEBUG oslo_concurrency.lockutils [req-02ad4a1e-8a90-4574-8cdf-25e97ea5e9aa req-ca09cb4f-31a1-4462-a63c-fc413970b6a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.653 221324 DEBUG oslo_concurrency.lockutils [req-02ad4a1e-8a90-4574-8cdf-25e97ea5e9aa req-ca09cb4f-31a1-4462-a63c-fc413970b6a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.654 221324 DEBUG oslo_concurrency.lockutils [req-02ad4a1e-8a90-4574-8cdf-25e97ea5e9aa req-ca09cb4f-31a1-4462-a63c-fc413970b6a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.654 221324 DEBUG nova.compute.manager [req-02ad4a1e-8a90-4574-8cdf-25e97ea5e9aa req-ca09cb4f-31a1-4462-a63c-fc413970b6a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.654 221324 WARNING nova.compute.manager [req-02ad4a1e-8a90-4574-8cdf-25e97ea5e9aa req-ca09cb4f-31a1-4462-a63c-fc413970b6a5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received unexpected event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:49:16 np0005603622 nova_compute[221301]: 2026-01-31 07:49:16.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e148 e148: 3 total, 3 up, 3 in
Jan 31 02:49:17 np0005603622 nova_compute[221301]: 2026-01-31 07:49:17.244 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:49:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:17.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.544 221324 DEBUG nova.network.neutron [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Activated binding for port 3aff2339-ccc0-4845-8728-4ede26d0c11a and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.547 221324 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.550 221324 DEBUG nova.virt.libvirt.vif [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1007393486',display_name='tempest-LiveMigrationTest-server-1007393486',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1007393486',id=17,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-egj05et1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:48:59Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.550 221324 DEBUG nova.network.os_vif_util [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "address": "fa:16:3e:e0:f2:07", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aff2339-cc", "ovs_interfaceid": "3aff2339-ccc0-4845-8728-4ede26d0c11a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.552 221324 DEBUG nova.network.os_vif_util [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.553 221324 DEBUG os_vif [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.557 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.558 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aff2339-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.560 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.572 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.575 221324 INFO os_vif [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:f2:07,bridge_name='br-int',has_traffic_filtering=True,id=3aff2339-ccc0-4845-8728-4ede26d0c11a,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3aff2339-cc')#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.575 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.575 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.575 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.576 221324 DEBUG nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.576 221324 INFO nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Deleting instance files /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_del#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.577 221324 INFO nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Deletion of /var/lib/nova/instances/14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7_del complete#033[00m
Jan 31 02:49:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:49:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:18.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.883 221324 DEBUG nova.compute.manager [req-7f36ada7-4be5-497f-a8b6-abb524197415 req-c61ee830-c42c-4bd5-a37d-7fd84938c58d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.883 221324 DEBUG oslo_concurrency.lockutils [req-7f36ada7-4be5-497f-a8b6-abb524197415 req-c61ee830-c42c-4bd5-a37d-7fd84938c58d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.884 221324 DEBUG oslo_concurrency.lockutils [req-7f36ada7-4be5-497f-a8b6-abb524197415 req-c61ee830-c42c-4bd5-a37d-7fd84938c58d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.884 221324 DEBUG oslo_concurrency.lockutils [req-7f36ada7-4be5-497f-a8b6-abb524197415 req-c61ee830-c42c-4bd5-a37d-7fd84938c58d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.884 221324 DEBUG nova.compute.manager [req-7f36ada7-4be5-497f-a8b6-abb524197415 req-c61ee830-c42c-4bd5-a37d-7fd84938c58d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] No waiting events found dispatching network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:49:18 np0005603622 nova_compute[221301]: 2026-01-31 07:49:18.884 221324 WARNING nova.compute.manager [req-7f36ada7-4be5-497f-a8b6-abb524197415 req-c61ee830-c42c-4bd5-a37d-7fd84938c58d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Received unexpected event network-vif-plugged-3aff2339-ccc0-4845-8728-4ede26d0c11a for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:49:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:19.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:20.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:21.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:21 np0005603622 nova_compute[221301]: 2026-01-31 07:49:21.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:22.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:23 np0005603622 podman[228161]: 2026-01-31 07:49:23.116805432 +0000 UTC m=+0.043662033 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 02:49:23 np0005603622 nova_compute[221301]: 2026-01-31 07:49:23.561 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:23.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:24.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:25 np0005603622 nova_compute[221301]: 2026-01-31 07:49:25.517 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845750.5161638, 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:49:25 np0005603622 nova_compute[221301]: 2026-01-31 07:49:25.517 221324 INFO nova.compute.manager [-] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:49:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:25.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:25 np0005603622 nova_compute[221301]: 2026-01-31 07:49:25.649 221324 DEBUG nova.compute.manager [None req-560ab038-147c-4441-b27a-a0a0767d9e14 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:49:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.540 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.540 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.541 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.589 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.589 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.590 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.590 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.590 221324 DEBUG oslo_concurrency.processutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:26.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:26 np0005603622 nova_compute[221301]: 2026-01-31 07:49:26.902 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2694668329' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.065 221324 DEBUG oslo_concurrency.processutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.187 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.187 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.190 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.190 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.356 221324 WARNING nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.358 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4496MB free_disk=20.78579330444336GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.358 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.359 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.495 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration for instance 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.568 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 31 02:49:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:27.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.622 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.622 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.623 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration 9755ad61-6df1-4456-8860-6bd237ad62cb is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.623 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.623 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:49:27 np0005603622 nova_compute[221301]: 2026-01-31 07:49:27.728 221324 DEBUG oslo_concurrency.processutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:49:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:49:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2849556412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.160 221324 DEBUG oslo_concurrency.processutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.165 221324 DEBUG nova.compute.provider_tree [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.200 221324 DEBUG nova.scheduler.client.report [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.239 221324 DEBUG nova.compute.resource_tracker [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.240 221324 DEBUG oslo_concurrency.lockutils [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.248 221324 INFO nova.compute.manager [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Migrating instance to compute-2.ctlplane.example.com finished successfully.#033[00m
Jan 31 02:49:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 e149: 3 total, 3 up, 3 in
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.451 221324 INFO nova.scheduler.client.report [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Deleted allocation for migration 9755ad61-6df1-4456-8860-6bd237ad62cb#033[00m
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.452 221324 DEBUG nova.virt.libvirt.driver [None req-f2dcbf6f-0cfe-4c1b-97e1-3668419a0136 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 31 02:49:28 np0005603622 nova_compute[221301]: 2026-01-31 07:49:28.563 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:28.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:29.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:30.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:30.842 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:49:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:30.843 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:49:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:30.844 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:49:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:31.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:31 np0005603622 nova_compute[221301]: 2026-01-31 07:49:31.942 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:32.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:49:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:49:33 np0005603622 nova_compute[221301]: 2026-01-31 07:49:33.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:33.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:34.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:35.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:36.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:36 np0005603622 nova_compute[221301]: 2026-01-31 07:49:36.945 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:37.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:38 np0005603622 nova_compute[221301]: 2026-01-31 07:49:38.569 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:38.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:39.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:40 np0005603622 ovn_controller[130109]: 2026-01-31T07:49:40Z|00068|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 31 02:49:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:40.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:49:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:41.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:41 np0005603622 nova_compute[221301]: 2026-01-31 07:49:41.947 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:49:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:42.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:49:43 np0005603622 nova_compute[221301]: 2026-01-31 07:49:43.572 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:49:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:43.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:49:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:44.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:45.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:46.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:46 np0005603622 nova_compute[221301]: 2026-01-31 07:49:46.950 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:47 np0005603622 podman[228404]: 2026-01-31 07:49:47.159878029 +0000 UTC m=+0.080059129 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 02:49:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:47.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:48 np0005603622 nova_compute[221301]: 2026-01-31 07:49:48.575 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:48.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:49.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:49:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:50.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:49:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:51.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:51 np0005603622 nova_compute[221301]: 2026-01-31 07:49:51.954 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:52.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:52 np0005603622 nova_compute[221301]: 2026-01-31 07:49:52.860 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:52.860 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:49:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:49:52.860 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:49:53 np0005603622 nova_compute[221301]: 2026-01-31 07:49:53.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:53.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:54 np0005603622 podman[228430]: 2026-01-31 07:49:54.1349711 +0000 UTC m=+0.054747541 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 02:49:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:49:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:54.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:49:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:55.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:49:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:56.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:56 np0005603622 nova_compute[221301]: 2026-01-31 07:49:56.979 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:49:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:57.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:49:58 np0005603622 nova_compute[221301]: 2026-01-31 07:49:58.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:49:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:49:58.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:49:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:49:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:49:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:49:59.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 02:50:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:00.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:01.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:01.863 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:02 np0005603622 nova_compute[221301]: 2026-01-31 07:50:02.027 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:02 np0005603622 nova_compute[221301]: 2026-01-31 07:50:02.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:02.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:03 np0005603622 nova_compute[221301]: 2026-01-31 07:50:03.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:03.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:04.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:05.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.030 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "745b2d82-d010-4faa-a90f-93dbf0cab318" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.031 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.060 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.192 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.193 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.200 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.201 221324 INFO nova.compute.claims [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.562 221324 DEBUG nova.scheduler.client.report [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.588 221324 DEBUG nova.scheduler.client.report [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.589 221324 DEBUG nova.compute.provider_tree [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.608 221324 DEBUG nova.scheduler.client.report [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.635 221324 DEBUG nova.scheduler.client.report [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 02:50:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:06.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:06 np0005603622 nova_compute[221301]: 2026-01-31 07:50:06.729 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:07 np0005603622 nova_compute[221301]: 2026-01-31 07:50:07.030 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:07 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3074367402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:07 np0005603622 nova_compute[221301]: 2026-01-31 07:50:07.198 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:07 np0005603622 nova_compute[221301]: 2026-01-31 07:50:07.206 221324 DEBUG nova.compute.provider_tree [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:50:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:07.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:07 np0005603622 nova_compute[221301]: 2026-01-31 07:50:07.873 221324 DEBUG nova.scheduler.client.report [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:50:08 np0005603622 nova_compute[221301]: 2026-01-31 07:50:08.450 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:08 np0005603622 nova_compute[221301]: 2026-01-31 07:50:08.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:50:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:08.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:50:09 np0005603622 nova_compute[221301]: 2026-01-31 07:50:09.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:09 np0005603622 nova_compute[221301]: 2026-01-31 07:50:09.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:09 np0005603622 nova_compute[221301]: 2026-01-31 07:50:09.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:50:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:09.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:10 np0005603622 nova_compute[221301]: 2026-01-31 07:50:10.276 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 4.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:10 np0005603622 nova_compute[221301]: 2026-01-31 07:50:10.278 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:50:10 np0005603622 nova_compute[221301]: 2026-01-31 07:50:10.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:10 np0005603622 nova_compute[221301]: 2026-01-31 07:50:10.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:50:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:10.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:11.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:11 np0005603622 nova_compute[221301]: 2026-01-31 07:50:11.911 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 14112a7e-9ba3-4b0c-9bbb-7b646a5e05d7] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Jan 31 02:50:12 np0005603622 nova_compute[221301]: 2026-01-31 07:50:12.069 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:12 np0005603622 nova_compute[221301]: 2026-01-31 07:50:12.078 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:50:12 np0005603622 nova_compute[221301]: 2026-01-31 07:50:12.078 221324 DEBUG nova.network.neutron [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:50:12 np0005603622 nova_compute[221301]: 2026-01-31 07:50:12.573 221324 INFO nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:50:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:12.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:13 np0005603622 nova_compute[221301]: 2026-01-31 07:50:13.250 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:50:13 np0005603622 nova_compute[221301]: 2026-01-31 07:50:13.631 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:13.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:14.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.276 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.278 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.278 221324 INFO nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Creating image(s)#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.312 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.350 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.380 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.385 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.436 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.437 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.438 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.438 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.470 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.474 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 745b2d82-d010-4faa-a90f-93dbf0cab318_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.588 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.589 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.590 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:50:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:15.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.734 221324 DEBUG nova.network.neutron [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:50:15 np0005603622 nova_compute[221301]: 2026-01-31 07:50:15.735 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:50:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:16.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:17 np0005603622 nova_compute[221301]: 2026-01-31 07:50:17.072 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:17.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:18 np0005603622 podman[228569]: 2026-01-31 07:50:18.232643381 +0000 UTC m=+0.156827750 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:50:18 np0005603622 nova_compute[221301]: 2026-01-31 07:50:18.327 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 745b2d82-d010-4faa-a90f-93dbf0cab318_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.853s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:18 np0005603622 nova_compute[221301]: 2026-01-31 07:50:18.415 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] resizing rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:50:18 np0005603622 nova_compute[221301]: 2026-01-31 07:50:18.634 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:18.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:18 np0005603622 nova_compute[221301]: 2026-01-31 07:50:18.973 221324 DEBUG nova.objects.instance [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lazy-loading 'migration_context' on Instance uuid 745b2d82-d010-4faa-a90f-93dbf0cab318 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.096 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.097 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Ensure instance console log exists: /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.097 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.097 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.098 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.099 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.104 221324 WARNING nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.109 221324 DEBUG nova.virt.libvirt.host [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.110 221324 DEBUG nova.virt.libvirt.host [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.113 221324 DEBUG nova.virt.libvirt.host [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.114 221324 DEBUG nova.virt.libvirt.host [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.115 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.116 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.116 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.117 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.117 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.117 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.118 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.118 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.118 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.119 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.119 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.119 221324 DEBUG nova.virt.hardware [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.123 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.583 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating instance_info_cache with network_info: [{"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:19.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.808 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.808 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.808 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.809 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.809 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:19 np0005603622 nova_compute[221301]: 2026-01-31 07:50:19.809 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:19 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3495425129' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.097 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.973s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.150 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.157 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3121944426' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.652 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.654 221324 DEBUG nova.objects.instance [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lazy-loading 'pci_devices' on Instance uuid 745b2d82-d010-4faa-a90f-93dbf0cab318 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:20.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.895 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.896 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.896 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.897 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:50:20 np0005603622 nova_compute[221301]: 2026-01-31 07:50:20.897 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:21 np0005603622 nova_compute[221301]: 2026-01-31 07:50:21.011 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <uuid>745b2d82-d010-4faa-a90f-93dbf0cab318</uuid>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <name>instance-00000015</name>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <nova:name>tempest-TenantUsagesTestJSON-server-835032280</nova:name>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:50:19</nova:creationTime>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <nova:user uuid="f966a9f649044fefb548779aa0a57ade">tempest-TenantUsagesTestJSON-1295575535-project-member</nova:user>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <nova:project uuid="3c305ec558a4499ca2f3874df3d0a71c">tempest-TenantUsagesTestJSON-1295575535</nova:project>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <entry name="serial">745b2d82-d010-4faa-a90f-93dbf0cab318</entry>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <entry name="uuid">745b2d82-d010-4faa-a90f-93dbf0cab318</entry>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/745b2d82-d010-4faa-a90f-93dbf0cab318_disk">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/745b2d82-d010-4faa-a90f-93dbf0cab318_disk.config">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/console.log" append="off"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:50:21 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:50:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:50:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:50:21 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:50:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:21 np0005603622 nova_compute[221301]: 2026-01-31 07:50:21.219 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:50:21 np0005603622 nova_compute[221301]: 2026-01-31 07:50:21.220 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:50:21 np0005603622 nova_compute[221301]: 2026-01-31 07:50:21.221 221324 INFO nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Using config drive
Jan 31 02:50:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:21.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2953999482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:21 np0005603622 nova_compute[221301]: 2026-01-31 07:50:21.893 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:50:21 np0005603622 nova_compute[221301]: 2026-01-31 07:50:21.906 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.074 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.074 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.075 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.080 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.081 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.084 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.085 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.107 221324 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Creating tmpfile /var/lib/nova/instances/tmpkeqo2xgm to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.109 221324 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkeqo2xgm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.235 221324 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Creating tmpfile /var/lib/nova/instances/tmpmr0jxxoo to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.236 221324 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmr0jxxoo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.339 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.341 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4415MB free_disk=20.750160217285156GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.342 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.342 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.489 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.489 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.489 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 745b2d82-d010-4faa-a90f-93dbf0cab318 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.520 221324 WARNING nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4b48cc05-9edd-4e4d-a58e-84564afb0612 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.542 221324 INFO nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Creating config drive at /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/disk.config
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.546 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptf2gkk87 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.568 221324 WARNING nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.568 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.569 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.673 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptf2gkk87" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:50:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:50:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:22.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.893 221324 DEBUG nova.storage.rbd_utils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] rbd image 745b2d82-d010-4faa-a90f-93dbf0cab318_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.898 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/disk.config 745b2d82-d010-4faa-a90f-93dbf0cab318_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:50:22 np0005603622 nova_compute[221301]: 2026-01-31 07:50:22.968 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:50:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3398141734' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.395 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.402 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.426 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.461 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.462 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.462 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.462 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.478 221324 DEBUG oslo_concurrency.processutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/disk.config 745b2d82-d010-4faa-a90f-93dbf0cab318_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.580s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.479 221324 INFO nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Deleting local config drive /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318/disk.config because it was imported into RBD.
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.485 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.486 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.486 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 31 02:50:23 np0005603622 systemd-machined[190406]: New machine qemu-9-instance-00000015.
Jan 31 02:50:23 np0005603622 systemd[1]: Started Virtual Machine qemu-9-instance-00000015.
Jan 31 02:50:23 np0005603622 nova_compute[221301]: 2026-01-31 07:50:23.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:50:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:23.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.008 221324 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmr0jxxoo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.049 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.050 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.050 221324 DEBUG nova.network.neutron [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.125 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845824.1245394, 745b2d82-d010-4faa-a90f-93dbf0cab318 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.126 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] VM Resumed (Lifecycle Event)
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.128 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.128 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.133 221324 INFO nova.virt.libvirt.driver [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Instance spawned successfully.
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.134 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.176 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.180 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.188 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.188 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.189 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.189 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.190 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.190 221324 DEBUG nova.virt.libvirt.driver [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.204 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.204 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845824.1258614, 745b2d82-d010-4faa-a90f-93dbf0cab318 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.204 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] VM Started (Lifecycle Event)#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.227 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.231 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.260 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.287 221324 INFO nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Took 9.01 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.287 221324 DEBUG nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.363 221324 INFO nova.compute.manager [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Took 18.23 seconds to build instance.#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.386 221324 DEBUG oslo_concurrency.lockutils [None req-90fc46c1-b356-41e3-9f16-ced26a1d4591 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.501 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.502 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.591 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.625 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.625 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 745b2d82-d010-4faa-a90f-93dbf0cab318 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.626 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.626 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.626 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.627 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "745b2d82-d010-4faa-a90f-93dbf0cab318" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.627 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.627 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.627 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.703 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.703 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:24.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:24 np0005603622 nova_compute[221301]: 2026-01-31 07:50:24.744 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:25 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 31 02:50:25 np0005603622 podman[228890]: 2026-01-31 07:50:25.150487136 +0000 UTC m=+0.064412900 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:50:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:25.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:26 np0005603622 nova_compute[221301]: 2026-01-31 07:50:26.135 221324 DEBUG nova.network.neutron [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:26.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:26Z|00069|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.078 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:27.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.877 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "745b2d82-d010-4faa-a90f-93dbf0cab318" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.877 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.878 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "745b2d82-d010-4faa-a90f-93dbf0cab318-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.878 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.878 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.879 221324 INFO nova.compute.manager [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Terminating instance#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.880 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "refresh_cache-745b2d82-d010-4faa-a90f-93dbf0cab318" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.880 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquired lock "refresh_cache-745b2d82-d010-4faa-a90f-93dbf0cab318" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.880 221324 DEBUG nova.network.neutron [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.881 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.883 221324 DEBUG os_brick.utils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 02:50:27 np0005603622 nova_compute[221301]: 2026-01-31 07:50:27.884 221324 INFO oslo.privsep.daemon [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp95tipxzo/privsep.sock']#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.534 221324 DEBUG nova.network.neutron [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.563 221324 INFO oslo.privsep.daemon [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.410 228913 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.414 228913 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.417 228913 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.417 228913 INFO oslo.privsep.daemon [-] privsep daemon running as pid 228913#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.566 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[10339daf-6c1d-4da1-9c0e-29a48f7eb755]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.674 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.709 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:50:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:28.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.759 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.760 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[db6f23d3-c2ed-4c7f-a190-b863a6745b23]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.763 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.772 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.773 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[474b7e90-d833-40fd-85a6-53470c33136b]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.778 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.789 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.790 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[c2caaa8a-f5b6-44d6-87dc-353bb53339ac]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.802 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[513b55e2-3f28-4c07-becb-e91d8fcf5df5]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.803 221324 DEBUG oslo_concurrency.processutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.835 221324 DEBUG oslo_concurrency.processutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.840 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.842 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.842 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 02:50:28 np0005603622 nova_compute[221301]: 2026-01-31 07:50:28.843 221324 DEBUG os_brick.utils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] <== get_connector_properties: return (959ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 02:50:29 np0005603622 nova_compute[221301]: 2026-01-31 07:50:29.353 221324 DEBUG nova.network.neutron [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:29 np0005603622 nova_compute[221301]: 2026-01-31 07:50:29.385 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Releasing lock "refresh_cache-745b2d82-d010-4faa-a90f-93dbf0cab318" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:29 np0005603622 nova_compute[221301]: 2026-01-31 07:50:29.386 221324 DEBUG nova.compute.manager [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:50:29 np0005603622 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 31 02:50:29 np0005603622 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000015.scope: Consumed 5.829s CPU time.
Jan 31 02:50:29 np0005603622 systemd-machined[190406]: Machine qemu-9-instance-00000015 terminated.
Jan 31 02:50:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:29.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:29 np0005603622 nova_compute[221301]: 2026-01-31 07:50:29.812 221324 INFO nova.virt.libvirt.driver [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Instance destroyed successfully.#033[00m
Jan 31 02:50:29 np0005603622 nova_compute[221301]: 2026-01-31 07:50:29.813 221324 DEBUG nova.objects.instance [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lazy-loading 'resources' on Instance uuid 745b2d82-d010-4faa-a90f-93dbf0cab318 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:50:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2518808239' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.367 221324 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmr0jxxoo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='4b9eb10a-fc66-42c4-9e17-3b8136950b54'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.369 221324 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Creating instance directory: /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.369 221324 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Ensure instance console log exists: /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.370 221324 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.371 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.371 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.373 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.380 221324 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.382 221324 DEBUG nova.virt.libvirt.vif [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:05Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.382 221324 DEBUG nova.network.os_vif_util [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.383 221324 DEBUG nova.network.os_vif_util [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.384 221324 DEBUG os_vif [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.385 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.386 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.390 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab24842b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.391 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab24842b-00, col_values=(('external_ids', {'iface-id': 'ab24842b-0045-41e6-b6dc-51b110b51829', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:a1:9e', 'vm-uuid': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.393 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.395 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:50:30 np0005603622 NetworkManager[49080]: <info>  [1769845830.3955] manager: (tapab24842b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.402 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.403 221324 INFO os_vif [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00')#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.406 221324 DEBUG nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 31 02:50:30 np0005603622 nova_compute[221301]: 2026-01-31 07:50:30.406 221324 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmr0jxxoo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='4b9eb10a-fc66-42c4-9e17-3b8136950b54'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 31 02:50:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:30.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:30.844 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:30.845 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:30.847 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:31.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:32 np0005603622 nova_compute[221301]: 2026-01-31 07:50:32.085 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:32 np0005603622 nova_compute[221301]: 2026-01-31 07:50:32.581 221324 INFO nova.virt.libvirt.driver [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Deleting instance files /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318_del#033[00m
Jan 31 02:50:32 np0005603622 nova_compute[221301]: 2026-01-31 07:50:32.582 221324 INFO nova.virt.libvirt.driver [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Deletion of /var/lib/nova/instances/745b2d82-d010-4faa-a90f-93dbf0cab318_del complete#033[00m
Jan 31 02:50:32 np0005603622 nova_compute[221301]: 2026-01-31 07:50:32.745 221324 INFO nova.compute.manager [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Took 3.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:50:32 np0005603622 nova_compute[221301]: 2026-01-31 07:50:32.746 221324 DEBUG oslo.service.loopingcall [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:50:32 np0005603622 nova_compute[221301]: 2026-01-31 07:50:32.746 221324 DEBUG nova.compute.manager [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:50:32 np0005603622 nova_compute[221301]: 2026-01-31 07:50:32.747 221324 DEBUG nova.network.neutron [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:50:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:32.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:33 np0005603622 nova_compute[221301]: 2026-01-31 07:50:33.574 221324 DEBUG nova.network.neutron [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:50:33 np0005603622 nova_compute[221301]: 2026-01-31 07:50:33.609 221324 DEBUG nova.network.neutron [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:33 np0005603622 nova_compute[221301]: 2026-01-31 07:50:33.636 221324 INFO nova.compute.manager [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Took 0.89 seconds to deallocate network for instance.#033[00m
Jan 31 02:50:33 np0005603622 nova_compute[221301]: 2026-01-31 07:50:33.701 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:33 np0005603622 nova_compute[221301]: 2026-01-31 07:50:33.702 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:33.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:33 np0005603622 nova_compute[221301]: 2026-01-31 07:50:33.937 221324 DEBUG oslo_concurrency.processutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.164 221324 DEBUG nova.network.neutron [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Port ab24842b-0045-41e6-b6dc-51b110b51829 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 31 02:50:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 31 02:50:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:50:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2923705469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.368 221324 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmr0jxxoo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='4b9eb10a-fc66-42c4-9e17-3b8136950b54'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.377 221324 DEBUG oslo_concurrency.processutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.382 221324 DEBUG nova.compute.provider_tree [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.397 221324 DEBUG nova.scheduler.client.report [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.453 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.497 221324 INFO nova.scheduler.client.report [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Deleted allocations for instance 745b2d82-d010-4faa-a90f-93dbf0cab318#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.646 221324 DEBUG oslo_concurrency.lockutils [None req-01048d50-45e5-459c-9daf-e5419658e419 f966a9f649044fefb548779aa0a57ade 3c305ec558a4499ca2f3874df3d0a71c - - default default] Lock "745b2d82-d010-4faa-a90f-93dbf0cab318" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:34 np0005603622 kernel: tapab24842b-00: entered promiscuous mode
Jan 31 02:50:34 np0005603622 NetworkManager[49080]: <info>  [1769845834.6525] manager: (tapab24842b-00): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.653 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:34 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:34Z|00070|binding|INFO|Claiming lport ab24842b-0045-41e6-b6dc-51b110b51829 for this additional chassis.
Jan 31 02:50:34 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:34Z|00071|binding|INFO|ab24842b-0045-41e6-b6dc-51b110b51829: Claiming fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:50:34 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:34Z|00072|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 ovn-installed in OVS
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.663 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:34 np0005603622 nova_compute[221301]: 2026-01-31 07:50:34.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:34 np0005603622 systemd-udevd[228982]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:50:34 np0005603622 systemd-machined[190406]: New machine qemu-10-instance-00000014.
Jan 31 02:50:34 np0005603622 NetworkManager[49080]: <info>  [1769845834.7050] device (tapab24842b-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:50:34 np0005603622 NetworkManager[49080]: <info>  [1769845834.7058] device (tapab24842b-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:50:34 np0005603622 systemd[1]: Started Virtual Machine qemu-10-instance-00000014.
Jan 31 02:50:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:34.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:35 np0005603622 nova_compute[221301]: 2026-01-31 07:50:35.393 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:35 np0005603622 nova_compute[221301]: 2026-01-31 07:50:35.521 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845835.5208018, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:35 np0005603622 nova_compute[221301]: 2026-01-31 07:50:35.521 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Started (Lifecycle Event)#033[00m
Jan 31 02:50:35 np0005603622 nova_compute[221301]: 2026-01-31 07:50:35.647 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:35.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:36 np0005603622 nova_compute[221301]: 2026-01-31 07:50:36.069 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845836.0692618, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:36 np0005603622 nova_compute[221301]: 2026-01-31 07:50:36.070 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:50:36 np0005603622 nova_compute[221301]: 2026-01-31 07:50:36.092 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:36 np0005603622 nova_compute[221301]: 2026-01-31 07:50:36.098 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:36 np0005603622 nova_compute[221301]: 2026-01-31 07:50:36.146 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 02:50:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:36.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:36.815 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:36 np0005603622 nova_compute[221301]: 2026-01-31 07:50:36.816 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:36.817 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:50:37 np0005603622 nova_compute[221301]: 2026-01-31 07:50:37.088 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:37.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:38.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:39 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:39Z|00073|binding|INFO|Claiming lport ab24842b-0045-41e6-b6dc-51b110b51829 for this chassis.
Jan 31 02:50:39 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:39Z|00074|binding|INFO|ab24842b-0045-41e6-b6dc-51b110b51829: Claiming fa:16:3e:b4:a1:9e 10.100.0.9
Jan 31 02:50:39 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:39Z|00075|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 up in Southbound
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.600 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.602 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d bound to our chassis#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.604 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.638 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[caf64c61-52fc-4ba6-b6fc-f26e1711f493]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.665 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[95590131-3121-424d-8851-624c0bd15dcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.669 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[680dd5b4-00a4-4548-8bea-f67fe8ddcfb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.690 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[19ff569b-a0ea-4d11-8e5b-0b531821c4b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.704 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[04a8d89a-be40-4184-8cf9-37163bcaa009]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499702, 'reachable_time': 37410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229038, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:39.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.719 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[20231b66-bb79-481c-99d8-2bd7b039ab2c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap60bb4bea-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499710, 'tstamp': 499710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229039, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap60bb4bea-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499712, 'tstamp': 499712}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229039, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.721 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:39 np0005603622 nova_compute[221301]: 2026-01-31 07:50:39.723 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.724 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60bb4bea-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.724 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.725 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap60bb4bea-d0, col_values=(('external_ids', {'iface-id': 'eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.725 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:39 np0005603622 nova_compute[221301]: 2026-01-31 07:50:39.769 221324 INFO nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Post operation of migration started#033[00m
Jan 31 02:50:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:39.819 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:40 np0005603622 nova_compute[221301]: 2026-01-31 07:50:40.352 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:40 np0005603622 nova_compute[221301]: 2026-01-31 07:50:40.353 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:40 np0005603622 nova_compute[221301]: 2026-01-31 07:50:40.353 221324 DEBUG nova.network.neutron [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:40 np0005603622 nova_compute[221301]: 2026-01-31 07:50:40.395 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:40.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:50:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:50:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:50:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:41.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:42 np0005603622 nova_compute[221301]: 2026-01-31 07:50:42.092 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:42.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 e152: 3 total, 3 up, 3 in
Jan 31 02:50:43 np0005603622 nova_compute[221301]: 2026-01-31 07:50:43.266 221324 DEBUG nova.network.neutron [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:43 np0005603622 nova_compute[221301]: 2026-01-31 07:50:43.287 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:43 np0005603622 nova_compute[221301]: 2026-01-31 07:50:43.308 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:43 np0005603622 nova_compute[221301]: 2026-01-31 07:50:43.308 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:43 np0005603622 nova_compute[221301]: 2026-01-31 07:50:43.309 221324 DEBUG oslo_concurrency.lockutils [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:43 np0005603622 nova_compute[221301]: 2026-01-31 07:50:43.313 221324 INFO nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 31 02:50:43 np0005603622 virtqemud[220925]: Domain id=10 name='instance-00000014' uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 is tainted: custom-monitor
Jan 31 02:50:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:43.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:44 np0005603622 nova_compute[221301]: 2026-01-31 07:50:44.321 221324 INFO nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 31 02:50:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:44.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:44 np0005603622 nova_compute[221301]: 2026-01-31 07:50:44.810 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845829.8095546, 745b2d82-d010-4faa-a90f-93dbf0cab318 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:44 np0005603622 nova_compute[221301]: 2026-01-31 07:50:44.811 221324 INFO nova.compute.manager [-] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:50:44 np0005603622 nova_compute[221301]: 2026-01-31 07:50:44.851 221324 DEBUG nova.compute.manager [None req-66008e66-a2d2-4273-9da8-e4fdf4571695 - - - - - -] [instance: 745b2d82-d010-4faa-a90f-93dbf0cab318] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:45 np0005603622 nova_compute[221301]: 2026-01-31 07:50:45.326 221324 INFO nova.virt.libvirt.driver [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 31 02:50:45 np0005603622 nova_compute[221301]: 2026-01-31 07:50:45.335 221324 DEBUG nova.compute.manager [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:45 np0005603622 nova_compute[221301]: 2026-01-31 07:50:45.369 221324 DEBUG nova.objects.instance [None req-3617e078-2792-4def-b0fe-7472ceae326c 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:50:45 np0005603622 nova_compute[221301]: 2026-01-31 07:50:45.398 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:45.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:46.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:47 np0005603622 nova_compute[221301]: 2026-01-31 07:50:47.094 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:47.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:48 np0005603622 nova_compute[221301]: 2026-01-31 07:50:48.307 221324 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkeqo2xgm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 31 02:50:48 np0005603622 nova_compute[221301]: 2026-01-31 07:50:48.339 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:48 np0005603622 nova_compute[221301]: 2026-01-31 07:50:48.340 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:48 np0005603622 nova_compute[221301]: 2026-01-31 07:50:48.340 221324 DEBUG nova.network.neutron [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:50:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:50:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:50:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:48.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:50:49 np0005603622 podman[229221]: 2026-01-31 07:50:49.1840543 +0000 UTC m=+0.102478641 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:50:49 np0005603622 nova_compute[221301]: 2026-01-31 07:50:49.211 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Check if temp file /var/lib/nova/instances/tmpwobuud1k exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 31 02:50:49 np0005603622 nova_compute[221301]: 2026-01-31 07:50:49.212 221324 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwobuud1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 31 02:50:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:49.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.131 221324 DEBUG nova.network.neutron [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.156 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.159 221324 DEBUG os_brick.utils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.160 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.170 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.171 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[5632f58e-4e3c-4795-abdd-fc53b18bfccc]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.172 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.181 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.181 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[72c0d559-b8a7-4205-a2a5-0f7825bd3f24]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.183 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.193 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.193 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[3788528d-4d97-4e76-83fc-6c0cb7ca4dbe]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.195 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a4a661-3e67-4a02-883e-1d97297a4d4d]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.196 221324 DEBUG oslo_concurrency.processutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.217 221324 DEBUG oslo_concurrency.processutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "nvme version" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.220 221324 DEBUG os_brick.initiator.connectors.lightos [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.221 221324 DEBUG os_brick.initiator.connectors.lightos [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.222 221324 DEBUG os_brick.initiator.connectors.lightos [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.222 221324 DEBUG os_brick.utils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] <== get_connector_properties: return (63ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 02:50:50 np0005603622 nova_compute[221301]: 2026-01-31 07:50:50.400 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:50.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.446 221324 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkeqo2xgm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='da773656-1631-4e7b-855d-ed146c908f6b'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.447 221324 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Creating instance directory: /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.448 221324 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Ensure instance console log exists: /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.448 221324 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.451 221324 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.452 221324 DEBUG nova.virt.libvirt.vif [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:04Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.453 221324 DEBUG nova.network.os_vif_util [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.453 221324 DEBUG nova.network.os_vif_util [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.454 221324 DEBUG os_vif [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.454 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.455 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.455 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.459 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31ab3c80-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.459 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31ab3c80-79, col_values=(('external_ids', {'iface-id': '31ab3c80-791f-418d-a70b-fcb0d523a037', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:11:0b', 'vm-uuid': '4b48cc05-9edd-4e4d-a58e-84564afb0612'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:51 np0005603622 NetworkManager[49080]: <info>  [1769845851.4627] manager: (tap31ab3c80-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.469 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.470 221324 INFO os_vif [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79')#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.473 221324 DEBUG nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 31 02:50:51 np0005603622 nova_compute[221301]: 2026-01-31 07:50:51.473 221324 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkeqo2xgm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='da773656-1631-4e7b-855d-ed146c908f6b'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 31 02:50:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:51.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:52 np0005603622 nova_compute[221301]: 2026-01-31 07:50:52.098 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:50:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:52.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:50:53 np0005603622 nova_compute[221301]: 2026-01-31 07:50:53.364 221324 DEBUG nova.network.neutron [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 31 02:50:53 np0005603622 nova_compute[221301]: 2026-01-31 07:50:53.539 221324 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkeqo2xgm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='da773656-1631-4e7b-855d-ed146c908f6b'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 31 02:50:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:53.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:53 np0005603622 kernel: tap31ab3c80-79: entered promiscuous mode
Jan 31 02:50:53 np0005603622 NetworkManager[49080]: <info>  [1769845853.8144] manager: (tap31ab3c80-79): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 31 02:50:53 np0005603622 nova_compute[221301]: 2026-01-31 07:50:53.816 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:53 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:53Z|00076|binding|INFO|Claiming lport 31ab3c80-791f-418d-a70b-fcb0d523a037 for this additional chassis.
Jan 31 02:50:53 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:53Z|00077|binding|INFO|31ab3c80-791f-418d-a70b-fcb0d523a037: Claiming fa:16:3e:8b:11:0b 10.100.0.6
Jan 31 02:50:53 np0005603622 nova_compute[221301]: 2026-01-31 07:50:53.822 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:53 np0005603622 nova_compute[221301]: 2026-01-31 07:50:53.839 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:53 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:53Z|00078|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 ovn-installed in OVS
Jan 31 02:50:53 np0005603622 systemd-udevd[229268]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:50:53 np0005603622 nova_compute[221301]: 2026-01-31 07:50:53.844 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:53 np0005603622 NetworkManager[49080]: <info>  [1769845853.8574] device (tap31ab3c80-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:50:53 np0005603622 NetworkManager[49080]: <info>  [1769845853.8585] device (tap31ab3c80-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:50:53 np0005603622 systemd-machined[190406]: New machine qemu-11-instance-00000013.
Jan 31 02:50:53 np0005603622 systemd[1]: Started Virtual Machine qemu-11-instance-00000013.
Jan 31 02:50:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:54.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:54 np0005603622 nova_compute[221301]: 2026-01-31 07:50:54.995 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845854.994744, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:54 np0005603622 nova_compute[221301]: 2026-01-31 07:50:54.995 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Started (Lifecycle Event)#033[00m
Jan 31 02:50:55 np0005603622 nova_compute[221301]: 2026-01-31 07:50:55.028 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:55 np0005603622 nova_compute[221301]: 2026-01-31 07:50:55.572 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845855.5721126, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:50:55 np0005603622 nova_compute[221301]: 2026-01-31 07:50:55.573 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:50:55 np0005603622 nova_compute[221301]: 2026-01-31 07:50:55.603 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:50:55 np0005603622 nova_compute[221301]: 2026-01-31 07:50:55.608 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:50:55 np0005603622 nova_compute[221301]: 2026-01-31 07:50:55.637 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 02:50:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:55.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:50:56 np0005603622 podman[229322]: 2026-01-31 07:50:56.178620146 +0000 UTC m=+0.093056909 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:50:56 np0005603622 nova_compute[221301]: 2026-01-31 07:50:56.463 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:56.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:57 np0005603622 nova_compute[221301]: 2026-01-31 07:50:57.100 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:57.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:58 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:58Z|00079|binding|INFO|Claiming lport 31ab3c80-791f-418d-a70b-fcb0d523a037 for this chassis.
Jan 31 02:50:58 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:58Z|00080|binding|INFO|31ab3c80-791f-418d-a70b-fcb0d523a037: Claiming fa:16:3e:8b:11:0b 10.100.0.6
Jan 31 02:50:58 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:58Z|00081|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 up in Southbound
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.139 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:11:0b 10.100.0.6'], port_security=['fa:16:3e:8b:11:0b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4b48cc05-9edd-4e4d-a58e-84564afb0612', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=31ab3c80-791f-418d-a70b-fcb0d523a037) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.140 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 bound to our chassis#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.141 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 850ad6ca-6166-4382-94bb-4b7c10d9a136#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.149 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d105e51e-fbc7-4ab5-bef5-a7e4bb2aa6d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.150 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap850ad6ca-61 in ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.152 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap850ad6ca-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.152 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d85c36bc-0e10-404c-b904-c7f6713cbed8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.154 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[66cd916e-7310-4d4c-a7e0-bbd407b7bf6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.168 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b086a5-ab46-4894-ac82-8186573691fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.182 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[72cd3f88-0ec7-4a87-8ff8-e7bc597b2384]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.213 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[710650f1-9d17-4c3e-98c5-cdf68d7641b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 NetworkManager[49080]: <info>  [1769845858.2229] manager: (tap850ad6ca-60): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.221 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[62e706bc-f26f-48fb-b2e5-d6e4d73e3ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.249 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e2095322-818d-426b-9dbd-8d96ca005f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.255 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[39381fa1-bd75-472f-bdc2-da77b1f6459f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 systemd-udevd[229350]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:50:58 np0005603622 NetworkManager[49080]: <info>  [1769845858.2903] device (tap850ad6ca-60): carrier: link connected
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.294 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6d501750-7675-4df4-813c-71cbb41bc753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.312 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe522c0b-ee0a-451a-864b-88a621d860c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510963, 'reachable_time': 33661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229369, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.331 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4203b4-0ffa-4be1-8311-8800040b3e0f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:996f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510963, 'tstamp': 510963}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229370, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.348 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1080b9-bd0d-4326-8cc9-2afde29e5f37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap850ad6ca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:99:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510963, 'reachable_time': 33661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229371, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.382 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6d7e34-2a3b-4e35-a796-287066edbd83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.433 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b90f87d1-bce8-41f5-a03f-c65b1cf94b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.435 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.435 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.435 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap850ad6ca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:58 np0005603622 kernel: tap850ad6ca-60: entered promiscuous mode
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603622 NetworkManager[49080]: <info>  [1769845858.4797] manager: (tap850ad6ca-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.482 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.483 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap850ad6ca-60, col_values=(('external_ids', {'iface-id': '61b6889f-b848-4873-9650-8b2715794d29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.485 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603622 ovn_controller[130109]: 2026-01-31T07:50:58Z|00082|binding|INFO|Releasing lport 61b6889f-b848-4873-9650-8b2715794d29 from this chassis (sb_readonly=0)
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.486 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.487 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.488 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[586e4968-c4b2-47a4-938b-db8777b931cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.489 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-850ad6ca-6166-4382-94bb-4b7c10d9a136
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/850ad6ca-6166-4382-94bb-4b7c10d9a136.pid.haproxy
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 850ad6ca-6166-4382-94bb-4b7c10d9a136
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:50:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:50:58.490 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'env', 'PROCESS_TAG=haproxy-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/850ad6ca-6166-4382-94bb-4b7c10d9a136.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
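The command logged above shows how the OVN metadata agent launches haproxy inside a per-network namespace via neutron-rootwrap. As a minimal sketch (the function name and parameters are illustrative, not neutron's actual API), the argv can be reconstructed like this:

```python
# Sketch (assumption): rebuilds the wrapper command visible in the log above.
# build_haproxy_cmd is a hypothetical helper, not neutron's real interface.

def build_haproxy_cmd(network_id: str,
                      cfg_dir: str = "/var/lib/neutron/ovn-metadata-proxy") -> list:
    """Return the argv used to launch haproxy inside the per-network netns."""
    netns = f"ovnmeta-{network_id}"       # namespace is named after the network UUID
    cfg = f"{cfg_dir}/{network_id}.conf"  # config file rendered by the agent
    return [
        "sudo", "neutron-rootwrap", "/etc/neutron/rootwrap.conf",
        "ip", "netns", "exec", netns,
        "env", f"PROCESS_TAG=haproxy-{network_id}",  # tag used to identify the process
        "haproxy", "-f", cfg,
    ]

cmd = build_haproxy_cmd("850ad6ca-6166-4382-94bb-4b7c10d9a136")
```

The resulting list matches the `create_process` line above token for token.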
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.490 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.538 221324 INFO nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Post operation of migration started#033[00m
Jan 31 02:50:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:50:58.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.806 221324 INFO nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Took 8.39 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.806 221324 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.863 221324 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwobuud1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79350fb7-3eed-4a3b-a7e9-f0ec90460ac3',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(c91a31e3-ac93-4843-9897-f5679755f4a7),old_vol_attachment_ids={317b1c6b-4f89-402c-94d1-f4852844f1e2='c20de688-0876-4e49-80fa-40bec74574ff'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.870 221324 DEBUG nova.objects.instance [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lazy-loading 'migration_context' on Instance uuid 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.872 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.873 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.873 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.904 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Find same serial number: pos=1, serial=317b1c6b-4f89-402c-94d1-f4852844f1e2 _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.905 221324 DEBUG nova.virt.libvirt.vif [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:45Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.905 221324 DEBUG nova.network.os_vif_util [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.906 221324 DEBUG nova.network.os_vif_util [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.906 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating guest XML with vif config: <interface type="ethernet">
Jan 31 02:50:58 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:b4:a1:9e"/>
Jan 31 02:50:58 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 02:50:58 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:50:58 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 02:50:58 np0005603622 nova_compute[221301]:  <target dev="tapab24842b-00"/>
Jan 31 02:50:58 np0005603622 nova_compute[221301]: </interface>
Jan 31 02:50:58 np0005603622 nova_compute[221301]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
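The `<interface>` fragment logged above can be reproduced with the standard library alone. This is a minimal sketch mirroring the logged output, not Nova's internal `_update_vif_xml` implementation; the helper name is illustrative:

```python
# Sketch (assumption): builds an <interface> element equivalent to the vif XML
# Nova logs above. vif_interface_xml is a hypothetical helper for illustration.
import xml.etree.ElementTree as ET

def vif_interface_xml(mac: str, dev: str, mtu: int) -> str:
    iface = ET.Element("interface", type="ethernet")
    ET.SubElement(iface, "mac", address=mac)                        # guest MAC
    ET.SubElement(iface, "model", type="virtio")                    # paravirt NIC model
    ET.SubElement(iface, "driver", name="vhost", rx_queue_size="512")
    ET.SubElement(iface, "mtu", size=str(mtu))                      # overlay-adjusted MTU
    ET.SubElement(iface, "target", dev=dev)                         # host-side tap device
    return ET.tostring(iface, encoding="unicode")

xml = vif_interface_xml("fa:16:3e:b4:a1:9e", "tapab24842b-00", 1442)
```

The 1442-byte MTU matches the Geneve-tunneled network metadata seen earlier in the VIF conversion lines.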
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.907 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 31 02:50:58 np0005603622 podman[229404]: 2026-01-31 07:50:58.864028043 +0000 UTC m=+0.031326841 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.960 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.961 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.961 221324 DEBUG nova.network.neutron [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.993 221324 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.995 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.995 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.996 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.996 221324 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.996 221324 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.997 221324 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.997 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.997 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.998 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.998 221324 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.998 221324 WARNING nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.998 221324 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-changed-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.999 221324 DEBUG nova.compute.manager [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Refreshing instance network info cache due to event network-changed-ab24842b-0045-41e6-b6dc-51b110b51829. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.999 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:50:58 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.999 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:50:59 np0005603622 nova_compute[221301]: 2026-01-31 07:50:58.999 221324 DEBUG nova.network.neutron [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Refreshing network info cache for port ab24842b-0045-41e6-b6dc-51b110b51829 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:50:59 np0005603622 podman[229404]: 2026-01-31 07:50:59.043946272 +0000 UTC m=+0.211245070 container create b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 02:50:59 np0005603622 systemd[1]: Started libpod-conmon-b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d.scope.
Jan 31 02:50:59 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:50:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f48d353b313b11b93c1869a3391dad6f8960ffe4507a363f2eab10a49f37257/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:50:59 np0005603622 podman[229404]: 2026-01-31 07:50:59.222439562 +0000 UTC m=+0.389738350 container init b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:50:59 np0005603622 podman[229404]: 2026-01-31 07:50:59.230742995 +0000 UTC m=+0.398041783 container start b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 02:50:59 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [NOTICE]   (229423) : New worker (229425) forked
Jan 31 02:50:59 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [NOTICE]   (229423) : Loading success.
Jan 31 02:50:59 np0005603622 nova_compute[221301]: 2026-01-31 07:50:59.376 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:50:59 np0005603622 nova_compute[221301]: 2026-01-31 07:50:59.378 221324 INFO nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
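The `update_downtime` lines above show libvirt's allowed migration downtime being widened on a fixed schedule. The logged step list can be regenerated as follows; this is a sketch assuming a 500 ms downtime cap, 10 steps, and an effective 150 s per-step delay, which together reproduce the schedule in the log (the function and parameter names are illustrative, not Nova's exact code):

```python
# Sketch (assumption): regenerates the downtime schedule visible in the log,
# [(0, 50), (150, 95), ..., (1500, 500)] -- (elapsed_seconds, downtime_ms).

def downtime_steps(max_downtime_ms: int = 500, steps: int = 10,
                   step_delay_s: int = 150):
    """Yield (elapsed_seconds, allowed_downtime_ms) pairs."""
    base = max_downtime_ms // steps             # 500/10 -> start at 50 ms
    offset = (max_downtime_ms - base) // steps  # add 45 ms per step
    for i in range(steps + 1):
        yield (step_delay_s * i, base + offset * i)

schedule = list(downtime_steps())
```

The first entry explains the "Increasing downtime to 50 ms after 0 sec elapsed time" INFO line; later entries only take effect if the migration is still running when their elapsed-time threshold passes.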
Jan 31 02:50:59 np0005603622 nova_compute[221301]: 2026-01-31 07:50:59.474 221324 INFO nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 31 02:50:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:50:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:50:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:50:59.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:50:59 np0005603622 nova_compute[221301]: 2026-01-31 07:50:59.977 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:50:59 np0005603622 nova_compute[221301]: 2026-01-31 07:50:59.978 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:00 np0005603622 nova_compute[221301]: 2026-01-31 07:51:00.480 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:00 np0005603622 nova_compute[221301]: 2026-01-31 07:51:00.480 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:00.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:00 np0005603622 nova_compute[221301]: 2026-01-31 07:51:00.984 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:00 np0005603622 nova_compute[221301]: 2026-01-31 07:51:00.985 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:01 np0005603622 nova_compute[221301]: 2026-01-31 07:51:01.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:01.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.104 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.159 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.160 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.496 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845862.4961307, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.496 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.515 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.519 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.541 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.664 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.664 221324 DEBUG nova.virt.libvirt.migration [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.670 221324 DEBUG nova.network.neutron [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.682 221324 DEBUG nova.network.neutron [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updated VIF entry in instance network info cache for port ab24842b-0045-41e6-b6dc-51b110b51829. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.683 221324 DEBUG nova.network.neutron [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Updating instance_info_cache with network_info: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.703 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.707 221324 DEBUG oslo_concurrency.lockutils [req-9e169319-8ed5-4bd9-873b-6915d57ec0fe req-6adb78af-8cd9-4d7a-8edf-bd7c43181873 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-79350fb7-3eed-4a3b-a7e9-f0ec90460ac3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.718 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.719 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.719 221324 DEBUG oslo_concurrency.lockutils [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:02 np0005603622 nova_compute[221301]: 2026-01-31 07:51:02.726 221324 INFO nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 31 02:51:02 np0005603622 virtqemud[220925]: Domain id=11 name='instance-00000013' uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612 is tainted: custom-monitor
Jan 31 02:51:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:02.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:03 np0005603622 nova_compute[221301]: 2026-01-31 07:51:03.738 221324 INFO nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 31 02:51:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:03.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:04 np0005603622 kernel: tapab24842b-00 (unregistering): left promiscuous mode
Jan 31 02:51:04 np0005603622 NetworkManager[49080]: <info>  [1769845864.2571] device (tapab24842b-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:51:04 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:04Z|00083|binding|INFO|Releasing lport ab24842b-0045-41e6-b6dc-51b110b51829 from this chassis (sb_readonly=0)
Jan 31 02:51:04 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:04Z|00084|binding|INFO|Setting lport ab24842b-0045-41e6-b6dc-51b110b51829 down in Southbound
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.266 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:04 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:04Z|00085|binding|INFO|Removing iface tapab24842b-00 ovn-installed in OVS
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.281 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.294 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a1:9e 10.100.0.9'], port_security=['fa:16:3e:b4:a1:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7ec8bf38-9571-4400-a85c-6bd5ac54bdf3'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '79350fb7-3eed-4a3b-a7e9-f0ec90460ac3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '20', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ab24842b-0045-41e6-b6dc-51b110b51829) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.296 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ab24842b-0045-41e6-b6dc-51b110b51829 in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d unbound from our chassis#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.297 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.312 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b4593530-ff91-443b-a0e2-4aff1a26e7cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:04 np0005603622 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 31 02:51:04 np0005603622 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000014.scope: Consumed 2.251s CPU time.
Jan 31 02:51:04 np0005603622 systemd-machined[190406]: Machine qemu-10-instance-00000014 terminated.
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.338 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba02670-5a6a-44e2-a7d8-ebf1161f5177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.341 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9bd421-eb68-42f9-bf33-118fe925bff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:04 np0005603622 virtqemud[220925]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-317b1c6b-4f89-402c-94d1-f4852844f1e2: No such file or directory
Jan 31 02:51:04 np0005603622 virtqemud[220925]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-317b1c6b-4f89-402c-94d1-f4852844f1e2: No such file or directory
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.359 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.363 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.365 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[dd026171-ae27-46f8-bcf1-eac5c61344c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.373 221324 DEBUG nova.virt.libvirt.guest [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.373 221324 INFO nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration operation has completed#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.373 221324 INFO nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] _post_live_migration() is started..#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.374 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.375 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.375 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.394 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fa731f82-ec99-4851-b8ee-4ceb98bde394]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap60bb4bea-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:b1:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499702, 'reachable_time': 37410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229459, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.412 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[43aa3d04-a04e-41c9-88a2-5de67c477944]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap60bb4bea-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499710, 'tstamp': 499710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229461, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap60bb4bea-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499712, 'tstamp': 499712}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229461, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.414 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.416 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.421 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60bb4bea-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.422 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.422 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap60bb4bea-d0, col_values=(('external_ids', {'iface-id': 'eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:04.422 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.743 221324 INFO nova.virt.libvirt.driver [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.748 221324 DEBUG nova.compute.manager [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:04.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:04 np0005603622 nova_compute[221301]: 2026-01-31 07:51:04.827 221324 DEBUG nova.objects.instance [None req-5201c8d7-3401-4a44-8a37-183a9725f31f 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.028 221324 DEBUG nova.compute.manager [req-9cbaec61-5f8d-4a6c-9413-9df9b04ffd4d req-be389fad-3f8d-4f2f-b028-060643517c44 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.029 221324 DEBUG oslo_concurrency.lockutils [req-9cbaec61-5f8d-4a6c-9413-9df9b04ffd4d req-be389fad-3f8d-4f2f-b028-060643517c44 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.029 221324 DEBUG oslo_concurrency.lockutils [req-9cbaec61-5f8d-4a6c-9413-9df9b04ffd4d req-be389fad-3f8d-4f2f-b028-060643517c44 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.029 221324 DEBUG oslo_concurrency.lockutils [req-9cbaec61-5f8d-4a6c-9413-9df9b04ffd4d req-be389fad-3f8d-4f2f-b028-060643517c44 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.029 221324 DEBUG nova.compute.manager [req-9cbaec61-5f8d-4a6c-9413-9df9b04ffd4d req-be389fad-3f8d-4f2f-b028-060643517c44 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.030 221324 DEBUG nova.compute.manager [req-9cbaec61-5f8d-4a6c-9413-9df9b04ffd4d req-be389fad-3f8d-4f2f-b028-060643517c44 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.684 221324 DEBUG nova.compute.manager [req-b282a436-ae21-42a8-99dc-424355168869 req-c0543bed-5673-47fe-a75a-eb900963cdcf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.686 221324 DEBUG oslo_concurrency.lockutils [req-b282a436-ae21-42a8-99dc-424355168869 req-c0543bed-5673-47fe-a75a-eb900963cdcf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.686 221324 DEBUG oslo_concurrency.lockutils [req-b282a436-ae21-42a8-99dc-424355168869 req-c0543bed-5673-47fe-a75a-eb900963cdcf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.687 221324 DEBUG oslo_concurrency.lockutils [req-b282a436-ae21-42a8-99dc-424355168869 req-c0543bed-5673-47fe-a75a-eb900963cdcf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.687 221324 DEBUG nova.compute.manager [req-b282a436-ae21-42a8-99dc-424355168869 req-c0543bed-5673-47fe-a75a-eb900963cdcf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.687 221324 DEBUG nova.compute.manager [req-b282a436-ae21-42a8-99dc-424355168869 req-c0543bed-5673-47fe-a75a-eb900963cdcf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-unplugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.726 221324 DEBUG nova.network.neutron [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Activated binding for port ab24842b-0045-41e6-b6dc-51b110b51829 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.727 221324 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.728 221324 DEBUG nova.virt.libvirt.vif [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1973231276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1973231276',id=20,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-kz6k0bwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:50:48Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=79350fb7-3eed-4a3b-a7e9-f0ec90460ac3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.728 221324 DEBUG nova.network.os_vif_util [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converting VIF {"id": "ab24842b-0045-41e6-b6dc-51b110b51829", "address": "fa:16:3e:b4:a1:9e", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab24842b-00", "ovs_interfaceid": "ab24842b-0045-41e6-b6dc-51b110b51829", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.729 221324 DEBUG nova.network.os_vif_util [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.729 221324 DEBUG os_vif [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.732 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab24842b-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.740 221324 INFO os_vif [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:a1:9e,bridge_name='br-int',has_traffic_filtering=True,id=ab24842b-0045-41e6-b6dc-51b110b51829,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab24842b-00')#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.741 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.741 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.741 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.741 221324 DEBUG nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.742 221324 INFO nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deleting instance files /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_del#033[00m
Jan 31 02:51:05 np0005603622 nova_compute[221301]: 2026-01-31 07:51:05.743 221324 INFO nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Deletion of /var/lib/nova/instances/79350fb7-3eed-4a3b-a7e9-f0ec90460ac3_del complete#033[00m
Jan 31 02:51:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:05.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:06.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.108 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.142 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.143 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.144 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.144 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.145 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.145 221324 WARNING nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.146 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.146 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.147 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.148 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.148 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.149 221324 WARNING nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.149 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.150 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.150 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.151 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.151 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.152 221324 WARNING nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.152 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.153 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.153 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.153 221324 DEBUG oslo_concurrency.lockutils [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.154 221324 DEBUG nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] No waiting events found dispatching network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:07 np0005603622 nova_compute[221301]: 2026-01-31 07:51:07.154 221324 WARNING nova.compute.manager [req-15ed1bd3-8c7f-4db0-941d-ae5809f79890 req-b30b1c8c-a6b8-4a3e-811c-396efeba79b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Received unexpected event network-vif-plugged-ab24842b-0045-41e6-b6dc-51b110b51829 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:07.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:08 np0005603622 nova_compute[221301]: 2026-01-31 07:51:08.342 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Check if temp file /var/lib/nova/instances/tmp5hfaw34r exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 31 02:51:08 np0005603622 nova_compute[221301]: 2026-01-31 07:51:08.342 221324 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5hfaw34r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 31 02:51:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:08.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:09 np0005603622 nova_compute[221301]: 2026-01-31 07:51:09.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:09 np0005603622 nova_compute[221301]: 2026-01-31 07:51:09.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:51:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:09.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.735 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.804 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.804 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.805 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "79350fb7-3eed-4a3b-a7e9-f0ec90460ac3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:10.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.837 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.838 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.838 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.838 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:51:10 np0005603622 nova_compute[221301]: 2026-01-31 07:51:10.839 221324 DEBUG oslo_concurrency.processutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2135461713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.582 221324 DEBUG oslo_concurrency.processutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.743s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.698 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.699 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.704 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.704 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.708 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.708 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:11.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.888 221324 WARNING nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.889 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4284MB free_disk=20.71560287475586GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.889 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.890 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:11 np0005603622 nova_compute[221301]: 2026-01-31 07:51:11.971 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration for instance 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.041 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.042 221324 INFO nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating resource usage from migration aad546b8-9239-4ce8-aa2b-899d02b2684e#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.078 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.079 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.080 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration c91a31e3-ac93-4843-9897-f5679755f4a7 is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.081 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Migration aad546b8-9239-4ce8-aa2b-899d02b2684e is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.081 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.082 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.110 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.212 221324 DEBUG oslo_concurrency.processutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.598 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.599 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.599 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.600 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:12.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.827 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.928 221324 DEBUG oslo_concurrency.processutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.716s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.934 221324 DEBUG nova.compute.provider_tree [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.950 221324 DEBUG nova.scheduler.client.report [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.980 221324 DEBUG nova.compute.resource_tracker [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.981 221324 DEBUG oslo_concurrency.lockutils [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:12 np0005603622 nova_compute[221301]: 2026-01-31 07:51:12.988 221324 INFO nova.compute.manager [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Migrating instance to compute-2.ctlplane.example.com finished successfully.#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.133 221324 INFO nova.scheduler.client.report [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] Deleted allocation for migration c91a31e3-ac93-4843-9897-f5679755f4a7#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.134 221324 DEBUG nova.virt.libvirt.driver [None req-2add2b49-7be7-49b6-be90-138f1d8a2694 2281b211104541868e559053ffafa8db af359d31615c49d28000f10b153d6e39 - - default default] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.414 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.632 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.632 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.633 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.634 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.634 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.660 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.661 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.661 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.662 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:51:13 np0005603622 nova_compute[221301]: 2026-01-31 07:51:13.662 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:13.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3112140315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.144 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.230 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.231 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.236 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.236 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.241 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.242 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.427 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.429 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4251MB free_disk=20.71546173095703GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.430 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.430 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.485 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating resource usage from migration aad546b8-9239-4ce8-aa2b-899d02b2684e#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.510 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.511 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.511 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration aad546b8-9239-4ce8-aa2b-899d02b2684e is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.511 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.511 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:51:14 np0005603622 nova_compute[221301]: 2026-01-31 07:51:14.581 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:14.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2943114883' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:15 np0005603622 nova_compute[221301]: 2026-01-31 07:51:15.011 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:15 np0005603622 nova_compute[221301]: 2026-01-31 07:51:15.016 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:51:15 np0005603622 nova_compute[221301]: 2026-01-31 07:51:15.053 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:51:15 np0005603622 nova_compute[221301]: 2026-01-31 07:51:15.055 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:51:15 np0005603622 nova_compute[221301]: 2026-01-31 07:51:15.055 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:15 np0005603622 nova_compute[221301]: 2026-01-31 07:51:15.738 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:15.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:15 np0005603622 nova_compute[221301]: 2026-01-31 07:51:15.817 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:51:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:16.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:17 np0005603622 nova_compute[221301]: 2026-01-31 07:51:17.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:17.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.101 221324 INFO nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Took 4.93 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.102 221324 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.128 221324 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5hfaw34r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4b48cc05-9edd-4e4d-a58e-84564afb0612',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(aad546b8-9239-4ce8-aa2b-899d02b2684e),old_vol_attachment_ids={74f8a6d0-259e-466b-a484-4c7bffded2e1='e6430b41-2bd9-4a2a-92ee-fef9061b8529'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.133 221324 DEBUG nova.objects.instance [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b48cc05-9edd-4e4d-a58e-84564afb0612 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.134 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.136 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.136 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.161 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Find same serial number: pos=1, serial=74f8a6d0-259e-466b-a484-4c7bffded2e1 _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.164 221324 DEBUG nova.virt.libvirt.vif [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:51:04Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.164 221324 DEBUG nova.network.os_vif_util [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.165 221324 DEBUG nova.network.os_vif_util [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.165 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating guest XML with vif config: <interface type="ethernet">
Jan 31 02:51:18 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:8b:11:0b"/>
Jan 31 02:51:18 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 02:51:18 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:51:18 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 02:51:18 np0005603622 nova_compute[221301]:  <target dev="tap31ab3c80-79"/>
Jan 31 02:51:18 np0005603622 nova_compute[221301]: </interface>
Jan 31 02:51:18 np0005603622 nova_compute[221301]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.166 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.639 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.639 221324 INFO nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.741 221324 INFO nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 31 02:51:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:18.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.978 221324 DEBUG nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.979 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.979 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.980 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.980 221324 DEBUG nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.980 221324 DEBUG nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.980 221324 DEBUG nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.980 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.981 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.981 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.981 221324 DEBUG nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.981 221324 WARNING nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.982 221324 DEBUG nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-changed-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.982 221324 DEBUG nova.compute.manager [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Refreshing instance network info cache due to event network-changed-31ab3c80-791f-418d-a70b-fcb0d523a037. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.982 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.982 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:51:18 np0005603622 nova_compute[221301]: 2026-01-31 07:51:18.982 221324 DEBUG nova.network.neutron [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Refreshing network info cache for port 31ab3c80-791f-418d-a70b-fcb0d523a037 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:51:19 np0005603622 nova_compute[221301]: 2026-01-31 07:51:19.243 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:19 np0005603622 nova_compute[221301]: 2026-01-31 07:51:19.244 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:19 np0005603622 nova_compute[221301]: 2026-01-31 07:51:19.373 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845864.3727553, 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:19 np0005603622 nova_compute[221301]: 2026-01-31 07:51:19.374 221324 INFO nova.compute.manager [-] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:51:19 np0005603622 nova_compute[221301]: 2026-01-31 07:51:19.395 221324 DEBUG nova.compute.manager [None req-c2c49698-8ecf-4373-9b7b-f851d00e1d46 - - - - - -] [instance: 79350fb7-3eed-4a3b-a7e9-f0ec90460ac3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:19 np0005603622 nova_compute[221301]: 2026-01-31 07:51:19.746 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:19 np0005603622 nova_compute[221301]: 2026-01-31 07:51:19.748 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:19.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:20 np0005603622 podman[229555]: 2026-01-31 07:51:20.206137925 +0000 UTC m=+0.110162297 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.253 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.254 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.286 221324 DEBUG nova.network.neutron [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updated VIF entry in instance network info cache for port 31ab3c80-791f-418d-a70b-fcb0d523a037. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.287 221324 DEBUG nova.network.neutron [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Updating instance_info_cache with network_info: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.310 221324 DEBUG oslo_concurrency.lockutils [req-081b9596-3cab-461c-b2f1-7ac3b7e822e3 req-866231ce-e62b-4421-af95-6b7209cd79a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4b48cc05-9edd-4e4d-a58e-84564afb0612" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.326 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845880.3259351, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.326 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.346 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.350 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.367 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.779 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.780 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 31 02:51:20 np0005603622 nova_compute[221301]: 2026-01-31 07:51:20.781 221324 DEBUG nova.virt.libvirt.migration [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 31 02:51:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:51:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:20.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:51:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:21.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:22 np0005603622 nova_compute[221301]: 2026-01-31 07:51:22.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:22.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:22 np0005603622 kernel: tap31ab3c80-79 (unregistering): left promiscuous mode
Jan 31 02:51:22 np0005603622 NetworkManager[49080]: <info>  [1769845882.9017] device (tap31ab3c80-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:51:22 np0005603622 nova_compute[221301]: 2026-01-31 07:51:22.955 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:22 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:22Z|00086|binding|INFO|Releasing lport 31ab3c80-791f-418d-a70b-fcb0d523a037 from this chassis (sb_readonly=0)
Jan 31 02:51:22 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:22Z|00087|binding|INFO|Setting lport 31ab3c80-791f-418d-a70b-fcb0d523a037 down in Southbound
Jan 31 02:51:22 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:22Z|00088|binding|INFO|Removing iface tap31ab3c80-79 ovn-installed in OVS
Jan 31 02:51:22 np0005603622 nova_compute[221301]: 2026-01-31 07:51:22.959 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:22 np0005603622 nova_compute[221301]: 2026-01-31 07:51:22.964 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:22.977 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:11:0b 10.100.0.6'], port_security=['fa:16:3e:8b:11:0b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7ec8bf38-9571-4400-a85c-6bd5ac54bdf3'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4b48cc05-9edd-4e4d-a58e-84564afb0612', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbdbb7a4b22a49b68feb3e028bb62fbb', 'neutron:revision_number': '17', 'neutron:security_group_ids': 'a60a5d2f-886d-4841-8ef6-f9e7838468dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f860fcac-4f6a-4e88-8005-0fd323fc8053, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=31ab3c80-791f-418d-a70b-fcb0d523a037) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:22.978 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 31ab3c80-791f-418d-a70b-fcb0d523a037 in datapath 850ad6ca-6166-4382-94bb-4b7c10d9a136 unbound from our chassis#033[00m
Jan 31 02:51:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:22.979 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 850ad6ca-6166-4382-94bb-4b7c10d9a136, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:51:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:22.981 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a43e9238-e4f5-4e45-b50d-77b8d6f6e0a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:22.981 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 namespace which is not needed anymore#033[00m
Jan 31 02:51:23 np0005603622 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 31 02:51:23 np0005603622 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000013.scope: Consumed 2.909s CPU time.
Jan 31 02:51:23 np0005603622 systemd-machined[190406]: Machine qemu-11-instance-00000013 terminated.
Jan 31 02:51:23 np0005603622 virtqemud[220925]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-74f8a6d0-259e-466b-a484-4c7bffded2e1: No such file or directory
Jan 31 02:51:23 np0005603622 virtqemud[220925]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-74f8a6d0-259e-466b-a484-4c7bffded2e1: No such file or directory
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.052 221324 DEBUG nova.virt.libvirt.guest [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.053 221324 INFO nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration operation has completed#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.054 221324 INFO nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] _post_live_migration() is started..#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.055 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.055 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.055 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.290 221324 DEBUG nova.compute.manager [req-c8e77a69-026e-402e-9ccf-2a9d4fbd2ce1 req-250aad08-c1d4-4b0d-a3b9-561c612eead5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.291 221324 DEBUG oslo_concurrency.lockutils [req-c8e77a69-026e-402e-9ccf-2a9d4fbd2ce1 req-250aad08-c1d4-4b0d-a3b9-561c612eead5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.291 221324 DEBUG oslo_concurrency.lockutils [req-c8e77a69-026e-402e-9ccf-2a9d4fbd2ce1 req-250aad08-c1d4-4b0d-a3b9-561c612eead5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.292 221324 DEBUG oslo_concurrency.lockutils [req-c8e77a69-026e-402e-9ccf-2a9d4fbd2ce1 req-250aad08-c1d4-4b0d-a3b9-561c612eead5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.292 221324 DEBUG nova.compute.manager [req-c8e77a69-026e-402e-9ccf-2a9d4fbd2ce1 req-250aad08-c1d4-4b0d-a3b9-561c612eead5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.293 221324 DEBUG nova.compute.manager [req-c8e77a69-026e-402e-9ccf-2a9d4fbd2ce1 req-250aad08-c1d4-4b0d-a3b9-561c612eead5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:23 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [NOTICE]   (229423) : haproxy version is 2.8.14-c23fe91
Jan 31 02:51:23 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [NOTICE]   (229423) : path to executable is /usr/sbin/haproxy
Jan 31 02:51:23 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [WARNING]  (229423) : Exiting Master process...
Jan 31 02:51:23 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [WARNING]  (229423) : Exiting Master process...
Jan 31 02:51:23 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [ALERT]    (229423) : Current worker (229425) exited with code 143 (Terminated)
Jan 31 02:51:23 np0005603622 neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136[229419]: [WARNING]  (229423) : All workers exited. Exiting... (0)
Jan 31 02:51:23 np0005603622 systemd[1]: libpod-b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d.scope: Deactivated successfully.
Jan 31 02:51:23 np0005603622 podman[229616]: 2026-01-31 07:51:23.684975178 +0000 UTC m=+0.619519107 container died b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:51:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:23.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.940 221324 DEBUG nova.compute.manager [req-f1014647-03b4-4f5c-bf87-85a961ca98eb req-f263fe84-5f2d-4d5c-b8ee-b13646f2717a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.940 221324 DEBUG oslo_concurrency.lockutils [req-f1014647-03b4-4f5c-bf87-85a961ca98eb req-f263fe84-5f2d-4d5c-b8ee-b13646f2717a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.941 221324 DEBUG oslo_concurrency.lockutils [req-f1014647-03b4-4f5c-bf87-85a961ca98eb req-f263fe84-5f2d-4d5c-b8ee-b13646f2717a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.942 221324 DEBUG oslo_concurrency.lockutils [req-f1014647-03b4-4f5c-bf87-85a961ca98eb req-f263fe84-5f2d-4d5c-b8ee-b13646f2717a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.942 221324 DEBUG nova.compute.manager [req-f1014647-03b4-4f5c-bf87-85a961ca98eb req-f263fe84-5f2d-4d5c-b8ee-b13646f2717a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:23 np0005603622 nova_compute[221301]: 2026-01-31 07:51:23.943 221324 DEBUG nova.compute.manager [req-f1014647-03b4-4f5c-bf87-85a961ca98eb req-f263fe84-5f2d-4d5c-b8ee-b13646f2717a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-unplugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.037 221324 DEBUG nova.network.neutron [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Activated binding for port 31ab3c80-791f-418d-a70b-fcb0d523a037 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.038 221324 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.039 221324 DEBUG nova.virt.libvirt.vif [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1009708622',display_name='tempest-LiveMigrationTest-server-1009708622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1009708622',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:50:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cbdbb7a4b22a49b68feb3e028bb62fbb',ramdisk_id='',reservation_id='r-878znybl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-126681982',owner_user_name='tempest-LiveMigrationTest-126681982-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:51:07Z,user_data=None,user_id='795c7f392cbc45f0885f081449883d42',uuid=4b48cc05-9edd-4e4d-a58e-84564afb0612,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.040 221324 DEBUG nova.network.os_vif_util [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converting VIF {"id": "31ab3c80-791f-418d-a70b-fcb0d523a037", "address": "fa:16:3e:8b:11:0b", "network": {"id": "850ad6ca-6166-4382-94bb-4b7c10d9a136", "bridge": "br-int", "label": "tempest-LiveMigrationTest-663920826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cbdbb7a4b22a49b68feb3e028bb62fbb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ab3c80-79", "ovs_interfaceid": "31ab3c80-791f-418d-a70b-fcb0d523a037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.041 221324 DEBUG nova.network.os_vif_util [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.042 221324 DEBUG os_vif [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.045 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.045 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31ab3c80-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.050 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.053 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.056 221324 INFO os_vif [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:11:0b,bridge_name='br-int',has_traffic_filtering=True,id=31ab3c80-791f-418d-a70b-fcb0d523a037,network=Network(850ad6ca-6166-4382-94bb-4b7c10d9a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ab3c80-79')#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.057 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.057 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.058 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.058 221324 DEBUG nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.059 221324 INFO nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deleting instance files /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612_del#033[00m
Jan 31 02:51:24 np0005603622 nova_compute[221301]: 2026-01-31 07:51:24.059 221324 INFO nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Deletion of /var/lib/nova/instances/4b48cc05-9edd-4e4d-a58e-84564afb0612_del complete#033[00m
Jan 31 02:51:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d-userdata-shm.mount: Deactivated successfully.
Jan 31 02:51:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0f48d353b313b11b93c1869a3391dad6f8960ffe4507a363f2eab10a49f37257-merged.mount: Deactivated successfully.
Jan 31 02:51:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:24.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.387 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.388 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.388 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.389 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.389 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.390 221324 WARNING nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.391 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.391 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.392 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.392 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.393 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.394 221324 WARNING nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.394 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.395 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.395 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.396 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.396 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.397 221324 WARNING nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.398 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.398 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.399 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.399 221324 DEBUG oslo_concurrency.lockutils [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.400 221324 DEBUG nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] No waiting events found dispatching network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:25 np0005603622 nova_compute[221301]: 2026-01-31 07:51:25.400 221324 WARNING nova.compute.manager [req-cb9b338e-ee64-46c5-a087-31c204d1f600 req-00358d9a-e1ea-4f76-afc8-03e2132decdb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Received unexpected event network-vif-plugged-31ab3c80-791f-418d-a70b-fcb0d523a037 for instance with vm_state active and task_state migrating.#033[00m
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.426003) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845885426068, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2455, "num_deletes": 254, "total_data_size": 5898711, "memory_usage": 5958736, "flush_reason": "Manual Compaction"}
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 31 02:51:25 np0005603622 podman[229616]: 2026-01-31 07:51:25.461118564 +0000 UTC m=+2.395662463 container cleanup b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 02:51:25 np0005603622 systemd[1]: libpod-conmon-b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d.scope: Deactivated successfully.
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845885574899, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3868424, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23114, "largest_seqno": 25564, "table_properties": {"data_size": 3858382, "index_size": 6344, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21257, "raw_average_key_size": 20, "raw_value_size": 3838263, "raw_average_value_size": 3744, "num_data_blocks": 278, "num_entries": 1025, "num_filter_entries": 1025, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845664, "oldest_key_time": 1769845664, "file_creation_time": 1769845885, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 148963 microseconds, and 11412 cpu microseconds.
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.574964) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3868424 bytes OK
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.574991) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.777203) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.777247) EVENT_LOG_v1 {"time_micros": 1769845885777239, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.777268) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5887920, prev total WAL file size 5887920, number of live WAL files 2.
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.778340) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3777KB)], [48(7320KB)]
Jan 31 02:51:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845885778455, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 11364530, "oldest_snapshot_seqno": -1}
Jan 31 02:51:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:25.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.002 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.004 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.004 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.005 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.005 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.007 221324 INFO nova.compute.manager [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Terminating instance#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.009 221324 DEBUG nova.compute.manager [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5000 keys, 9281533 bytes, temperature: kUnknown
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845886409117, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 9281533, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9247029, "index_size": 20890, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126347, "raw_average_key_size": 25, "raw_value_size": 9155829, "raw_average_value_size": 1831, "num_data_blocks": 850, "num_entries": 5000, "num_filter_entries": 5000, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769845885, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.409507) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 9281533 bytes
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.467923) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 18.0 rd, 14.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 7.1 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.3) write-amplify(2.4) OK, records in: 5525, records dropped: 525 output_compression: NoCompression
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.467970) EVENT_LOG_v1 {"time_micros": 1769845886467954, "job": 28, "event": "compaction_finished", "compaction_time_micros": 630802, "compaction_time_cpu_micros": 18274, "output_level": 6, "num_output_files": 1, "total_output_size": 9281533, "num_input_records": 5525, "num_output_records": 5000, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845886468591, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769845886469522, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:25.778184) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.469581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.469588) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.469590) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.469593) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:51:26.469596) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:51:26 np0005603622 podman[229647]: 2026-01-31 07:51:26.577624309 +0000 UTC m=+1.097404723 container remove b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.583 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb0a572-b613-4f54-b01e-bcaf0e3580f5]: (4, ('Sat Jan 31 07:51:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 (b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d)\nb10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d\nSat Jan 31 07:51:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 (b10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d)\nb10248d4a8f4e17dd01df2384368aaebf380d1cb66b78d6591e78e8215637b5d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.585 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d51bcf0f-bcc1-44e1-add0-f36f33b1282b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.587 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap850ad6ca-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.589 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 kernel: tap850ad6ca-60: left promiscuous mode
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.597 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.601 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ac70451c-80b1-498c-9edc-612d450fb209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.617 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[61f387a0-3c18-4e60-9596-b13ac8fded0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.619 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f82551-8ff6-4ead-a68e-6b2faf2b860b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 kernel: tap06448a4a-18 (unregistering): left promiscuous mode
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.636 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4f39b4-ee34-46a8-bdbd-ecddc670106a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510954, 'reachable_time': 42656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229667, 'error': None, 'target': 'ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 systemd[1]: run-netns-ovnmeta\x2d850ad6ca\x2d6166\x2d4382\x2d94bb\x2d4b7c10d9a136.mount: Deactivated successfully.
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.639 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-850ad6ca-6166-4382-94bb-4b7c10d9a136 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.639 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[baec99b1-b34a-4a8f-bdfa-e45c971b6bae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 NetworkManager[49080]: <info>  [1769845886.6464] device (tap06448a4a-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.643 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:26Z|00089|binding|INFO|Releasing lport 06448a4a-1828-42e2-810c-09e0ca21c35f from this chassis (sb_readonly=0)
Jan 31 02:51:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:26Z|00090|binding|INFO|Setting lport 06448a4a-1828-42e2-810c-09e0ca21c35f down in Southbound
Jan 31 02:51:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:26Z|00091|binding|INFO|Releasing lport 3162329c-f03f-465e-9f99-799a29d883a0 from this chassis (sb_readonly=0)
Jan 31 02:51:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:26Z|00092|binding|INFO|Setting lport 3162329c-f03f-465e-9f99-799a29d883a0 down in Southbound
Jan 31 02:51:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:26Z|00093|binding|INFO|Removing iface tap06448a4a-18 ovn-installed in OVS
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.653 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.657 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.671 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:f3:77 10.100.0.14'], port_security=['fa:16:3e:da:f3:77 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-116143868', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5e4f7ec6-bb38-4a62-88f4-5e5b869452f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-116143868', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227ca833-938d-48d2-86c8-5d09dd658c40, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=06448a4a-1828-42e2-810c-09e0ca21c35f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:26Z|00094|binding|INFO|Releasing lport eefb3f31-55e8-4b1d-a07a-d5c925fc9fd8 from this chassis (sb_readonly=0)
Jan 31 02:51:26 np0005603622 ovn_controller[130109]: 2026-01-31T07:51:26Z|00095|binding|INFO|Releasing lport 1e50dcb0-cab9-4443-a53d-151822a6eb9a from this chassis (sb_readonly=0)
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.679 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.680 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:c3:df 19.80.0.151'], port_security=['fa:16:3e:a4:c3:df 19.80.0.151'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['06448a4a-1828-42e2-810c-09e0ca21c35f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1877730947', 'neutron:cidrs': '19.80.0.151/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1877730947', 'neutron:project_id': 'e66a774f63ae4139a4e75c7973fbe077', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a4a96739-bb2f-4e95-bbe5-76a81d2aa557', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=0d46eecb-5024-425b-affd-165dd8eae0e4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3162329c-f03f-465e-9f99-799a29d883a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.682 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 06448a4a-1828-42e2-810c-09e0ca21c35f in datapath 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d unbound from our chassis#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.685 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.686 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d291be-717f-4678-a05e-cb6e9b96231e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:26.687 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d namespace which is not needed anymore#033[00m
Jan 31 02:51:26 np0005603622 podman[229661]: 2026-01-31 07:51:26.69466229 +0000 UTC m=+0.078788066 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 02:51:26 np0005603622 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 31 02:51:26 np0005603622 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Consumed 9.204s CPU time.
Jan 31 02:51:26 np0005603622 systemd-machined[190406]: Machine qemu-7-instance-00000010 terminated.
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.727 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:26.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.847 221324 INFO nova.virt.libvirt.driver [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Instance destroyed successfully.#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.847 221324 DEBUG nova.objects.instance [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lazy-loading 'resources' on Instance uuid 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.862 221324 DEBUG nova.virt.libvirt.vif [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T07:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1707336579',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1707336579',id=16,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:48:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e66a774f63ae4139a4e75c7973fbe077',ramdisk_id='',reservation_id='r-q29px744',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-2072827810',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-2072827810-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:49:09Z,user_data=None,user_id='37ed25cc14814a29867ac308b3cce8cf',uuid=5e4f7ec6-bb38-4a62-88f4-5e5b869452f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.863 221324 DEBUG nova.network.os_vif_util [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converting VIF {"id": "06448a4a-1828-42e2-810c-09e0ca21c35f", "address": "fa:16:3e:da:f3:77", "network": {"id": "60bb4bea-d9f0-41fc-9c0f-6fcd644c255d", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-19992648-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e66a774f63ae4139a4e75c7973fbe077", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06448a4a-18", "ovs_interfaceid": "06448a4a-1828-42e2-810c-09e0ca21c35f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.864 221324 DEBUG nova.network.os_vif_util [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.864 221324 DEBUG os_vif [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.866 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.867 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06448a4a-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.868 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.870 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:26 np0005603622 nova_compute[221301]: 2026-01-31 07:51:26.873 221324 INFO os_vif [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:f3:77,bridge_name='br-int',has_traffic_filtering=True,id=06448a4a-1828-42e2-810c-09e0ca21c35f,network=Network(60bb4bea-d9f0-41fc-9c0f-6fcd644c255d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06448a4a-18')#033[00m
Jan 31 02:51:26 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [NOTICE]   (227806) : haproxy version is 2.8.14-c23fe91
Jan 31 02:51:26 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [NOTICE]   (227806) : path to executable is /usr/sbin/haproxy
Jan 31 02:51:26 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [WARNING]  (227806) : Exiting Master process...
Jan 31 02:51:26 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [WARNING]  (227806) : Exiting Master process...
Jan 31 02:51:26 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [ALERT]    (227806) : Current worker (227808) exited with code 143 (Terminated)
Jan 31 02:51:26 np0005603622 neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d[227802]: [WARNING]  (227806) : All workers exited. Exiting... (0)
Jan 31 02:51:26 np0005603622 systemd[1]: libpod-a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8.scope: Deactivated successfully.
Jan 31 02:51:27 np0005603622 podman[229706]: 2026-01-31 07:51:27.003507388 +0000 UTC m=+0.236080227 container died a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:51:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e153 e153: 3 total, 3 up, 3 in
Jan 31 02:51:27 np0005603622 nova_compute[221301]: 2026-01-31 07:51:27.117 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:27 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8-userdata-shm.mount: Deactivated successfully.
Jan 31 02:51:27 np0005603622 systemd[1]: var-lib-containers-storage-overlay-6a6e43addc5cd709b5860f561fd40481c8ea93defd169cb5e93efc5ff73ae17f-merged.mount: Deactivated successfully.
Jan 31 02:51:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:27.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:27 np0005603622 podman[229706]: 2026-01-31 07:51:27.889303039 +0000 UTC m=+1.121875868 container cleanup a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:51:27 np0005603622 systemd[1]: libpod-conmon-a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8.scope: Deactivated successfully.
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.715 221324 DEBUG nova.compute.manager [req-e57f91a0-6c7c-4c58-a85b-9c3541a84309 req-0c3553a2-f05f-420e-89c0-fb5348d82493 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.716 221324 DEBUG oslo_concurrency.lockutils [req-e57f91a0-6c7c-4c58-a85b-9c3541a84309 req-0c3553a2-f05f-420e-89c0-fb5348d82493 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.716 221324 DEBUG oslo_concurrency.lockutils [req-e57f91a0-6c7c-4c58-a85b-9c3541a84309 req-0c3553a2-f05f-420e-89c0-fb5348d82493 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.717 221324 DEBUG oslo_concurrency.lockutils [req-e57f91a0-6c7c-4c58-a85b-9c3541a84309 req-0c3553a2-f05f-420e-89c0-fb5348d82493 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.717 221324 DEBUG nova.compute.manager [req-e57f91a0-6c7c-4c58-a85b-9c3541a84309 req-0c3553a2-f05f-420e-89c0-fb5348d82493 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.718 221324 DEBUG nova.compute.manager [req-e57f91a0-6c7c-4c58-a85b-9c3541a84309 req-0c3553a2-f05f-420e-89c0-fb5348d82493 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-unplugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:51:28 np0005603622 podman[229767]: 2026-01-31 07:51:28.780132337 +0000 UTC m=+0.870952345 container remove a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.785 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8c009dc9-1e20-4a03-960c-ef29d8123f4c]: (4, ('Sat Jan 31 07:51:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8)\na20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8\nSat Jan 31 07:51:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d (a20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8)\na20e9142d3d21f0ec0de1d053380750354e89b2dca11d59071d7c758efc7afb8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.788 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ae33bbfa-1c01-4f7b-b211-3800246ccf83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.789 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60bb4bea-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:28 np0005603622 kernel: tap60bb4bea-d0: left promiscuous mode
Jan 31 02:51:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:28.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.791 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.800 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.803 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3d684285-cd18-45a2-b012-c6691ea930c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 nova_compute[221301]: 2026-01-31 07:51:28.851 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.863 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd3eb02-ba9e-478d-ba4e-824738634d2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.864 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b7dce698-f054-4c43-8466-ca13824af7e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.881 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7a482d-57b6-4499-a492-3de026d7417a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499694, 'reachable_time': 19833, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229783, 'error': None, 'target': 'ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 systemd[1]: run-netns-ovnmeta\x2d60bb4bea\x2dd9f0\x2d41fc\x2d9c0f\x2d6fcd644c255d.mount: Deactivated successfully.
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.883 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-60bb4bea-d9f0-41fc-9c0f-6fcd644c255d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.883 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[74aa581a-0a6e-4478-84bf-60d017fe6ef8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.884 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3162329c-f03f-465e-9f99-799a29d883a0 in datapath 057fed11-d4e4-4c56-8ba3-81a6235ed1bf unbound from our chassis#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.887 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 057fed11-d4e4-4c56-8ba3-81a6235ed1bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.888 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[77317b43-ece6-4abf-8f81-e5f1eedba239]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:28.889 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf namespace which is not needed anymore#033[00m
Jan 31 02:51:29 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [NOTICE]   (227898) : haproxy version is 2.8.14-c23fe91
Jan 31 02:51:29 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [NOTICE]   (227898) : path to executable is /usr/sbin/haproxy
Jan 31 02:51:29 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [WARNING]  (227898) : Exiting Master process...
Jan 31 02:51:29 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [WARNING]  (227898) : Exiting Master process...
Jan 31 02:51:29 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [ALERT]    (227898) : Current worker (227900) exited with code 143 (Terminated)
Jan 31 02:51:29 np0005603622 neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf[227894]: [WARNING]  (227898) : All workers exited. Exiting... (0)
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.192 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.193 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.193 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "4b48cc05-9edd-4e4d-a58e-84564afb0612-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:29 np0005603622 systemd[1]: libpod-832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09.scope: Deactivated successfully.
Jan 31 02:51:29 np0005603622 podman[229801]: 2026-01-31 07:51:29.203932351 +0000 UTC m=+0.202219238 container died 832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.219 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.220 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.220 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.220 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.221 221324 DEBUG oslo_concurrency.processutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:29 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09-userdata-shm.mount: Deactivated successfully.
Jan 31 02:51:29 np0005603622 systemd[1]: var-lib-containers-storage-overlay-f5fd6e559bdff01cbd8e6f5c9eeb2b51f2123f2f13e65bda8f8b234fca728bf9-merged.mount: Deactivated successfully.
Jan 31 02:51:29 np0005603622 podman[229801]: 2026-01-31 07:51:29.498678581 +0000 UTC m=+0.496965428 container cleanup 832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 02:51:29 np0005603622 systemd[1]: libpod-conmon-832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09.scope: Deactivated successfully.
Jan 31 02:51:29 np0005603622 podman[229850]: 2026-01-31 07:51:29.63761618 +0000 UTC m=+0.120773032 container remove 832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.644 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f32b40-e4a8-48cf-9888-ed338b77a59e]: (4, ('Sat Jan 31 07:51:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf (832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09)\n832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09\nSat Jan 31 07:51:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf (832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09)\n832db2b8abbafdec938259ac6c1781e2af2f18e489b6ae99d2856c237b29ad09\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.648 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ff08295f-bea5-4a9d-9351-1dc4fba21649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.649 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap057fed11-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:51:29 np0005603622 kernel: tap057fed11-d0: left promiscuous mode
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.652 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.662 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8549a0a4-3182-4319-994d-bbdc91c0de56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/969941327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.675 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e1132f-15a0-4ddb-85b0-bc19076f5f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.676 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c5fc6db5-4c25-401b-b618-b908742b6571]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.684 221324 DEBUG oslo_concurrency.processutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.690 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[850526da-3e13-4b9b-9b8f-0b6e7b2e7c78]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499784, 'reachable_time': 17133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229865, 'error': None, 'target': 'ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:29 np0005603622 systemd[1]: run-netns-ovnmeta\x2d057fed11\x2dd4e4\x2d4c56\x2d8ba3\x2d81a6235ed1bf.mount: Deactivated successfully.
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.694 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-057fed11-d4e4-4c56-8ba3-81a6235ed1bf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:29.694 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb73efc-cf8e-43fe-a1c0-6c0d5a322b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.782 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.782 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:29.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.789 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.789 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.907 221324 WARNING nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.908 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4598MB free_disk=20.71487808227539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.908 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.908 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.959 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration for instance 4b48cc05-9edd-4e4d-a58e-84564afb0612 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 02:51:29 np0005603622 nova_compute[221301]: 2026-01-31 07:51:29.993 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.020 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.021 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.021 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Migration aad546b8-9239-4ce8-aa2b-899d02b2684e is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.022 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.022 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.133 221324 DEBUG oslo_concurrency.processutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1996861229' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.617 221324 DEBUG oslo_concurrency.processutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.622 221324 DEBUG nova.compute.provider_tree [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.635 221324 DEBUG nova.scheduler.client.report [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.660 221324 DEBUG nova.compute.resource_tracker [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.661 221324 DEBUG oslo_concurrency.lockutils [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.666 221324 INFO nova.compute.manager [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Migrating instance to compute-2.ctlplane.example.com finished successfully.#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.738 221324 INFO nova.scheduler.client.report [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] Deleted allocation for migration aad546b8-9239-4ce8-aa2b-899d02b2684e#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.738 221324 DEBUG nova.virt.libvirt.driver [None req-46f24317-3aa0-4786-bc6e-c68058bcc66c 3a97c6056ac04f13b6b8a954619eb4f3 ace85892fea146fea0409a7b19662de3 - - default default] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.801 221324 DEBUG nova.compute.manager [req-8283422c-fdbf-401d-bda7-8361a93eeb42 req-67f8a776-4323-44b3-acbd-4d7844613ec8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.801 221324 DEBUG oslo_concurrency.lockutils [req-8283422c-fdbf-401d-bda7-8361a93eeb42 req-67f8a776-4323-44b3-acbd-4d7844613ec8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.801 221324 DEBUG oslo_concurrency.lockutils [req-8283422c-fdbf-401d-bda7-8361a93eeb42 req-67f8a776-4323-44b3-acbd-4d7844613ec8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.801 221324 DEBUG oslo_concurrency.lockutils [req-8283422c-fdbf-401d-bda7-8361a93eeb42 req-67f8a776-4323-44b3-acbd-4d7844613ec8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.801 221324 DEBUG nova.compute.manager [req-8283422c-fdbf-401d-bda7-8361a93eeb42 req-67f8a776-4323-44b3-acbd-4d7844613ec8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] No waiting events found dispatching network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:51:30 np0005603622 nova_compute[221301]: 2026-01-31 07:51:30.802 221324 WARNING nova.compute.manager [req-8283422c-fdbf-401d-bda7-8361a93eeb42 req-67f8a776-4323-44b3-acbd-4d7844613ec8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Received unexpected event network-vif-plugged-06448a4a-1828-42e2-810c-09e0ca21c35f for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:51:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:51:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:30.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:51:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:30.845 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:30.846 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:30.846 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:31.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:31 np0005603622 nova_compute[221301]: 2026-01-31 07:51:31.870 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:32 np0005603622 nova_compute[221301]: 2026-01-31 07:51:32.119 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:32.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:33.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:34.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.168 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.169 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.198 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.285 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.285 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.294 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.295 221324 INFO nova.compute.claims [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.441 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:35.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1010967894' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.925 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.931 221324 DEBUG nova.compute.provider_tree [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.950 221324 DEBUG nova.scheduler.client.report [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.972 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:35 np0005603622 nova_compute[221301]: 2026-01-31 07:51:35.973 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.024 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.025 221324 DEBUG nova.network.neutron [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.059 221324 INFO nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.079 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.184 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.186 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.187 221324 INFO nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Creating image(s)
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.397 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.430 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.459 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.463 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "04f9862a6688592352a3324f53b6597ee300b813" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.463 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "04f9862a6688592352a3324f53b6597ee300b813" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.470 221324 DEBUG nova.policy [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '69d26c1015144fbda08ee4b9db0d7b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ea1f64b5b90d4a90a6cb764c4cdcca59', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.700 221324 DEBUG nova.virt.libvirt.imagebackend [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/e0fa43e0-9f79-4992-bc07-130eef6d5592/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/e0fa43e0-9f79-4992-bc07-130eef6d5592/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 31 02:51:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:36.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.875 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.976 221324 INFO nova.virt.libvirt.driver [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Deleting instance files /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_del
Jan 31 02:51:36 np0005603622 nova_compute[221301]: 2026-01-31 07:51:36.977 221324 INFO nova.virt.libvirt.driver [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Deletion of /var/lib/nova/instances/5e4f7ec6-bb38-4a62-88f4-5e5b869452f0_del complete
Jan 31 02:51:37 np0005603622 nova_compute[221301]: 2026-01-31 07:51:37.041 221324 INFO nova.compute.manager [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Took 11.03 seconds to destroy the instance on the hypervisor.
Jan 31 02:51:37 np0005603622 nova_compute[221301]: 2026-01-31 07:51:37.042 221324 DEBUG oslo.service.loopingcall [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 02:51:37 np0005603622 nova_compute[221301]: 2026-01-31 07:51:37.042 221324 DEBUG nova.compute.manager [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 02:51:37 np0005603622 nova_compute[221301]: 2026-01-31 07:51:37.042 221324 DEBUG nova.network.neutron [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 02:51:37 np0005603622 nova_compute[221301]: 2026-01-31 07:51:37.121 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:37 np0005603622 nova_compute[221301]: 2026-01-31 07:51:37.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:37.173 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:51:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:37.175 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 02:51:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:37.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:37 np0005603622 nova_compute[221301]: 2026-01-31 07:51:37.971 221324 DEBUG nova.network.neutron [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Successfully created port: 7aad9859-08d5-41b1-aa3a-9879376fb833 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.053 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845883.0520983, 4b48cc05-9edd-4e4d-a58e-84564afb0612 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.054 221324 INFO nova.compute.manager [-] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] VM Stopped (Lifecycle Event)
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.077 221324 DEBUG nova.compute.manager [None req-54fbb9b5-68b5-4bb7-8753-df13f435acf1 - - - - - -] [instance: 4b48cc05-9edd-4e4d-a58e-84564afb0612] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:51:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e154 e154: 3 total, 3 up, 3 in
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.287 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.369 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.part --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.371 221324 DEBUG nova.virt.images [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] e0fa43e0-9f79-4992-bc07-130eef6d5592 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.373 221324 DEBUG nova.privsep.utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 31 02:51:38 np0005603622 nova_compute[221301]: 2026-01-31 07:51:38.374 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.part /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:51:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:38.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.631 221324 DEBUG nova.network.neutron [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Successfully updated port: 7aad9859-08d5-41b1-aa3a-9879376fb833 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.646 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.647 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquired lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.647 221324 DEBUG nova.network.neutron [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.730 221324 DEBUG nova.compute.manager [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-changed-7aad9859-08d5-41b1-aa3a-9879376fb833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.730 221324 DEBUG nova.compute.manager [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Refreshing instance network info cache due to event network-changed-7aad9859-08d5-41b1-aa3a-9879376fb833. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.731 221324 DEBUG oslo_concurrency.lockutils [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 02:51:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:39.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:39 np0005603622 nova_compute[221301]: 2026-01-31 07:51:39.904 221324 DEBUG nova.network.neutron [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 02:51:40 np0005603622 nova_compute[221301]: 2026-01-31 07:51:40.274 221324 DEBUG nova.network.neutron [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:51:40 np0005603622 nova_compute[221301]: 2026-01-31 07:51:40.367 221324 INFO nova.compute.manager [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Took 3.32 seconds to deallocate network for instance.
Jan 31 02:51:40 np0005603622 nova_compute[221301]: 2026-01-31 07:51:40.469 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:51:40 np0005603622 nova_compute[221301]: 2026-01-31 07:51:40.470 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:51:40 np0005603622 nova_compute[221301]: 2026-01-31 07:51:40.550 221324 DEBUG oslo_concurrency.processutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:51:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:40.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:40 np0005603622 nova_compute[221301]: 2026-01-31 07:51:40.995 221324 DEBUG nova.network.neutron [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Updating instance_info_cache with network_info: [{"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.026 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Releasing lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.026 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Instance network_info: |[{"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.027 221324 DEBUG oslo_concurrency.lockutils [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.027 221324 DEBUG nova.network.neutron [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Refreshing network info cache for port 7aad9859-08d5-41b1-aa3a-9879376fb833 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 02:51:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:41.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.845 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845886.844125, 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.846 221324 INFO nova.compute.manager [-] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] VM Stopped (Lifecycle Event)
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.870 221324 DEBUG nova.compute.manager [None req-5e8e20c1-9242-4c58-b540-b34d97c19021 - - - - - -] [instance: 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:51:41 np0005603622 nova_compute[221301]: 2026-01-31 07:51:41.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:42 np0005603622 nova_compute[221301]: 2026-01-31 07:51:42.161 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:51:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:42.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:42 np0005603622 nova_compute[221301]: 2026-01-31 07:51:42.995 221324 DEBUG nova.network.neutron [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Updated VIF entry in instance network info cache for port 7aad9859-08d5-41b1-aa3a-9879376fb833. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 02:51:42 np0005603622 nova_compute[221301]: 2026-01-31 07:51:42.996 221324 DEBUG nova.network.neutron [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Updating instance_info_cache with network_info: [{"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:51:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:51:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4068086355' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.020 221324 DEBUG oslo_concurrency.lockutils [req-de252b51-f499-48ca-a68d-a292399c5ae7 req-61e81691-65f7-44f2-a700-b3b779def42e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.041 221324 DEBUG oslo_concurrency.processutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.049 221324 DEBUG nova.compute.provider_tree [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.071 221324 DEBUG nova.scheduler.client.report [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.125 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.163 221324 INFO nova.scheduler.client.report [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Deleted allocations for instance 5e4f7ec6-bb38-4a62-88f4-5e5b869452f0
Jan 31 02:51:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:51:43.177 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.257 221324 DEBUG oslo_concurrency.lockutils [None req-50c13203-8e3a-4a62-8778-a76404ffb97b 37ed25cc14814a29867ac308b3cce8cf e66a774f63ae4139a4e75c7973fbe077 - - default default] Lock "5e4f7ec6-bb38-4a62-88f4-5e5b869452f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 17.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:51:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:51:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:43.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.874 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.part /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.converted" returned: 0 in 5.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.880 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.933 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813.converted --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.935 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "04f9862a6688592352a3324f53b6597ee300b813" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 7.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.973 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:51:43 np0005603622 nova_compute[221301]: 2026-01-31 07:51:43.977 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813 104c67e8-fcdb-4437-9060-8d66250d4c46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:51:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:44.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:45.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:46.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:46 np0005603622 nova_compute[221301]: 2026-01-31 07:51:46.882 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:47 np0005603622 nova_compute[221301]: 2026-01-31 07:51:47.163 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:47.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:48 np0005603622 nova_compute[221301]: 2026-01-31 07:51:48.770 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813 104c67e8-fcdb-4437-9060-8d66250d4c46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.793s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:48.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:48 np0005603622 nova_compute[221301]: 2026-01-31 07:51:48.897 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] resizing rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:51:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 e155: 3 total, 3 up, 3 in
Jan 31 02:51:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:51:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/337087150' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:51:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:51:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/337087150' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:51:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:49.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:50.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:51 np0005603622 podman[230350]: 2026-01-31 07:51:51.192204264 +0000 UTC m=+0.107935267 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 02:51:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:51.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.863 221324 DEBUG nova.objects.instance [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lazy-loading 'migration_context' on Instance uuid 104c67e8-fcdb-4437-9060-8d66250d4c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.878 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.879 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Ensure instance console log exists: /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.880 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.880 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.881 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.885 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Start _get_guest_xml network_info=[{"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:51:17Z,direct_url=<?>,disk_format='qcow2',id=e0fa43e0-9f79-4992-bc07-130eef6d5592,min_disk=0,min_ram=0,name='',owner='18a55c4784fc43908a027752a1a5fc3c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:51:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/sda', 'disk_bus': 'scsi', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': 'e0fa43e0-9f79-4992-bc07-130eef6d5592'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.886 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.891 221324 WARNING nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.901 221324 DEBUG nova.virt.libvirt.host [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.902 221324 DEBUG nova.virt.libvirt.host [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.910 221324 DEBUG nova.virt.libvirt.host [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.910 221324 DEBUG nova.virt.libvirt.host [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.912 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.912 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:51:17Z,direct_url=<?>,disk_format='qcow2',id=e0fa43e0-9f79-4992-bc07-130eef6d5592,min_disk=0,min_ram=0,name='',owner='18a55c4784fc43908a027752a1a5fc3c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:51:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.913 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.914 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.914 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.914 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.915 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.915 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.916 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.916 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.916 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.917 221324 DEBUG nova.virt.hardware [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:51:51 np0005603622 nova_compute[221301]: 2026-01-31 07:51:51.921 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:52 np0005603622 nova_compute[221301]: 2026-01-31 07:51:52.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:51:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:51:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:51:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:51:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1213237818' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:51:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:52.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:52 np0005603622 nova_compute[221301]: 2026-01-31 07:51:52.949 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:52 np0005603622 nova_compute[221301]: 2026-01-31 07:51:52.984 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:51:52 np0005603622 nova_compute[221301]: 2026-01-31 07:51:52.989 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:51:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:51:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3424656159' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:51:53 np0005603622 nova_compute[221301]: 2026-01-31 07:51:53.455 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:51:53 np0005603622 nova_compute[221301]: 2026-01-31 07:51:53.456 221324 DEBUG nova.virt.libvirt.vif [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:51:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1338764688',display_name='tempest-AttachSCSIVolumeTestJSON-server-1338764688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-1338764688',id=23,image_ref='e0fa43e0-9f79-4992-bc07-130eef6d5592',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLEAYGAQGZPVYr3zBFF3S3S+ffBjem+DIIT0q0YvmJf/PpnnYZ/mpMrcMLuITZe4Vw6fqp9NHNWE7S1vV97FsPcL/y3psuRr5A22EUZj130xa5C4EXScW+eZYfLJZ4+Xdw==',key_name='tempest-keypair-485795967',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea1f64b5b90d4a90a6cb764c4cdcca59',ramdisk_id='',reservation_id='r-8xrgtxlo',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e0fa43e0-9f79-4992-bc07-130eef6d5592',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='q35',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-948431203',owner_user_name='tempest-AttachSCSIVolumeTestJSON-948431203-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:51:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='69d26c1015144fbda08ee4b9db0d7b70',uuid=104c67e8-fcdb-4437-9060-8d66250d4c46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:51:53 np0005603622 nova_compute[221301]: 2026-01-31 07:51:53.457 221324 DEBUG nova.network.os_vif_util [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Converting VIF {"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:51:53 np0005603622 nova_compute[221301]: 2026-01-31 07:51:53.458 221324 DEBUG nova.network.os_vif_util [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:f7:da,bridge_name='br-int',has_traffic_filtering=True,id=7aad9859-08d5-41b1-aa3a-9879376fb833,network=Network(545204cc-4005-4efa-9f61-78402c013259),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad9859-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:51:53 np0005603622 nova_compute[221301]: 2026-01-31 07:51:53.459 221324 DEBUG nova.objects.instance [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lazy-loading 'pci_devices' on Instance uuid 104c67e8-fcdb-4437-9060-8d66250d4c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:51:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:53.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:54.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:55.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:51:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:56.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:51:56 np0005603622 nova_compute[221301]: 2026-01-31 07:51:56.887 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:57 np0005603622 podman[230457]: 2026-01-31 07:51:57.126835134 +0000 UTC m=+0.052028017 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:51:57 np0005603622 nova_compute[221301]: 2026-01-31 07:51:57.168 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:51:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:57.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:51:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:51:58.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:51:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:51:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:51:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:51:59.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:00.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.361 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <uuid>104c67e8-fcdb-4437-9060-8d66250d4c46</uuid>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <name>instance-00000017</name>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachSCSIVolumeTestJSON-server-1338764688</nova:name>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:51:51</nova:creationTime>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:user uuid="69d26c1015144fbda08ee4b9db0d7b70">tempest-AttachSCSIVolumeTestJSON-948431203-project-member</nova:user>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:project uuid="ea1f64b5b90d4a90a6cb764c4cdcca59">tempest-AttachSCSIVolumeTestJSON-948431203</nova:project>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="e0fa43e0-9f79-4992-bc07-130eef6d5592"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <nova:port uuid="7aad9859-08d5-41b1-aa3a-9879376fb833">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <entry name="serial">104c67e8-fcdb-4437-9060-8d66250d4c46</entry>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <entry name="uuid">104c67e8-fcdb-4437-9060-8d66250d4c46</entry>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/104c67e8-fcdb-4437-9060-8d66250d4c46_disk">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <target dev="sda" bus="scsi"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <address type="drive" controller="0" unit="0"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/104c67e8-fcdb-4437-9060-8d66250d4c46_disk.config">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <target dev="sdb" bus="scsi"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <address type="drive" controller="0" unit="1"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="scsi" index="0" model="virtio-scsi"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:a7:f7:da"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <target dev="tap7aad9859-08"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/console.log" append="off"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:52:01 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:52:01 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:52:01 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:52:01 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.362 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Preparing to wait for external event network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.362 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.363 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.363 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.364 221324 DEBUG nova.virt.libvirt.vif [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:51:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1338764688',display_name='tempest-AttachSCSIVolumeTestJSON-server-1338764688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-1338764688',id=23,image_ref='e0fa43e0-9f79-4992-bc07-130eef6d5592',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLEAYGAQGZPVYr3zBFF3S3S+ffBjem+DIIT0q0YvmJf/PpnnYZ/mpMrcMLuITZe4Vw6fqp9NHNWE7S1vV97FsPcL/y3psuRr5A22EUZj130xa5C4EXScW+eZYfLJZ4+Xdw==',key_name='tempest-keypair-485795967',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea1f64b5b90d4a90a6cb764c4cdcca59',ramdisk_id='',reservation_id='r-8xrgtxlo',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e0fa43e0-9f79-4992-bc07-130eef6d5592',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='q35',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-948431203',owner_user_name='tempest-AttachSCSIVolumeTestJSON-948431203-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:51:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='69d26c1015144fbda08ee4b9db0d7b70',uuid=104c67e8-fcdb-4437-9060-8d66250d4c46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.364 221324 DEBUG nova.network.os_vif_util [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Converting VIF {"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.365 221324 DEBUG nova.network.os_vif_util [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:f7:da,bridge_name='br-int',has_traffic_filtering=True,id=7aad9859-08d5-41b1-aa3a-9879376fb833,network=Network(545204cc-4005-4efa-9f61-78402c013259),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad9859-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.366 221324 DEBUG os_vif [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:f7:da,bridge_name='br-int',has_traffic_filtering=True,id=7aad9859-08d5-41b1-aa3a-9879376fb833,network=Network(545204cc-4005-4efa-9f61-78402c013259),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad9859-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.366 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.367 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.367 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.371 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.371 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7aad9859-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.372 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7aad9859-08, col_values=(('external_ids', {'iface-id': '7aad9859-08d5-41b1-aa3a-9879376fb833', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:f7:da', 'vm-uuid': '104c67e8-fcdb-4437-9060-8d66250d4c46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.373 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:01 np0005603622 NetworkManager[49080]: <info>  [1769845921.3747] manager: (tap7aad9859-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.376 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.379 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.380 221324 INFO os_vif [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:f7:da,bridge_name='br-int',has_traffic_filtering=True,id=7aad9859-08d5-41b1-aa3a-9879376fb833,network=Network(545204cc-4005-4efa-9f61-78402c013259),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad9859-08')
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.473 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.474 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.475 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] No VIF found with MAC fa:16:3e:a7:f7:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.475 221324 INFO nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Using config drive
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.509 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.832 221324 INFO nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Creating config drive at /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/disk.config
Jan 31 02:52:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:01.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.837 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpdy6bkbv6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:52:01 np0005603622 nova_compute[221301]: 2026-01-31 07:52:01.961 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpdy6bkbv6" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:52:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:52:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111]
  ** DB Stats **
  Uptime(secs): 1800.1 total, 600.0 interval
  Cumulative writes: 11K writes, 48K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
  Cumulative WAL: 11K writes, 3067 syncs, 3.77 writes per sync, written: 0.04 GB, 0.02 MB/s
  Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
  Interval writes: 5402 writes, 22K keys, 5402 commit groups, 1.0 writes per commit group, ingest: 23.73 MB, 0.04 MB/s
  Interval WAL: 5401 writes, 1971 syncs, 2.74 writes per sync, written: 0.02 GB, 0.04 MB/s
  Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 02:52:02 np0005603622 nova_compute[221301]: 2026-01-31 07:52:02.369 221324 DEBUG nova.storage.rbd_utils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] rbd image 104c67e8-fcdb-4437-9060-8d66250d4c46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:52:02 np0005603622 nova_compute[221301]: 2026-01-31 07:52:02.374 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/disk.config 104c67e8-fcdb-4437-9060-8d66250d4c46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:52:02 np0005603622 nova_compute[221301]: 2026-01-31 07:52:02.394 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:02.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:03.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:04.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:05.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:06 np0005603622 nova_compute[221301]: 2026-01-31 07:52:06.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:06.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:07 np0005603622 nova_compute[221301]: 2026-01-31 07:52:07.174 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:07.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:08.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:09 np0005603622 nova_compute[221301]: 2026-01-31 07:52:09.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:52:09 np0005603622 nova_compute[221301]: 2026-01-31 07:52:09.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 02:52:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:09.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:10 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 02:52:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:10.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:11 np0005603622 nova_compute[221301]: 2026-01-31 07:52:11.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:52:11 np0005603622 nova_compute[221301]: 2026-01-31 07:52:11.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:52:11 np0005603622 nova_compute[221301]: 2026-01-31 07:52:11.437 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:11.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:12 np0005603622 nova_compute[221301]: 2026-01-31 07:52:12.223 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:12 np0005603622 nova_compute[221301]: 2026-01-31 07:52:12.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:52:12 np0005603622 nova_compute[221301]: 2026-01-31 07:52:12.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 02:52:12 np0005603622 nova_compute[221301]: 2026-01-31 07:52:12.561 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 02:52:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:12.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:13 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for kv_commit, latency = 7.992090702s
Jan 31 02:52:13 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for kv_sync, latency = 7.992090702s
Jan 31 02:52:13 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 7.992351532s, txc = 0x55bb28278900
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:52:13 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 7.995981216s, txc = 0x55bb27f4f200
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.460 221324 DEBUG oslo_concurrency.processutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/disk.config 104c67e8-fcdb-4437-9060-8d66250d4c46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 11.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.460 221324 INFO nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Deleting local config drive /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46/disk.config because it was imported into RBD.
Jan 31 02:52:13 np0005603622 kernel: tap7aad9859-08: entered promiscuous mode
Jan 31 02:52:13 np0005603622 NetworkManager[49080]: <info>  [1769845933.5127] manager: (tap7aad9859-08): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Jan 31 02:52:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:13Z|00096|binding|INFO|Claiming lport 7aad9859-08d5-41b1-aa3a-9879376fb833 for this chassis.
Jan 31 02:52:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:13Z|00097|binding|INFO|7aad9859-08d5-41b1-aa3a-9879376fb833: Claiming fa:16:3e:a7:f7:da 10.100.0.11
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.520 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:13 np0005603622 systemd-udevd[230548]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.536 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:f7:da 10.100.0.11'], port_security=['fa:16:3e:a7:f7:da 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '104c67e8-fcdb-4437-9060-8d66250d4c46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-545204cc-4005-4efa-9f61-78402c013259', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea1f64b5b90d4a90a6cb764c4cdcca59', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ba228104-4430-4a80-9d82-20035205851a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0ca3a0-be06-4a81-8971-bf69a1384288, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=7aad9859-08d5-41b1-aa3a-9879376fb833) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.537 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 7aad9859-08d5-41b1-aa3a-9879376fb833 in datapath 545204cc-4005-4efa-9f61-78402c013259 bound to our chassis
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.539 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 545204cc-4005-4efa-9f61-78402c013259
Jan 31 02:52:13 np0005603622 NetworkManager[49080]: <info>  [1769845933.5459] device (tap7aad9859-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:52:13 np0005603622 NetworkManager[49080]: <info>  [1769845933.5470] device (tap7aad9859-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:52:13 np0005603622 systemd-machined[190406]: New machine qemu-12-instance-00000017.
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.550 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.551 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d2925a46-b28c-450d-8d63-b66892c3248c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.553 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap545204cc-41 in ovnmeta-545204cc-4005-4efa-9f61-78402c013259 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 02:52:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:13Z|00098|binding|INFO|Setting lport 7aad9859-08d5-41b1-aa3a-9879376fb833 ovn-installed in OVS
Jan 31 02:52:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:13Z|00099|binding|INFO|Setting lport 7aad9859-08d5-41b1-aa3a-9879376fb833 up in Southbound
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.555 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.556 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap545204cc-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.556 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d4db26-5deb-4da8-bde0-ec87b803f503]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 systemd[1]: Started Virtual Machine qemu-12-instance-00000017.
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.559 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5f30db-3909-44ca-b377-0cc3fb7ddc59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.571 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[993d81a9-001a-4c2c-a8ab-d19b84e20328]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.593 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b3f8b8-2bef-4415-9db7-ec6d218afc25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.618 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f8611a05-6681-45cc-9052-17af08dff1f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 NetworkManager[49080]: <info>  [1769845933.6247] manager: (tap545204cc-40): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.625 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2737dfb8-ec6e-4ae1-bc8c-b40af86efae2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.650 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb5d477-de9a-446f-a26f-e1d55d4f8d24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.652 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[19caee60-acc7-4d5e-8d85-e7379a5692b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 NetworkManager[49080]: <info>  [1769845933.6750] device (tap545204cc-40): carrier: link connected
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.681 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6e52de52-e1e5-494d-b88c-48c07be5b34c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.693 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[66eb7bb2-3dee-4fc7-8f3b-dfb68097d09f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap545204cc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:b8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518501, 'reachable_time': 37452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230584, 'error': None, 'target': 'ovnmeta-545204cc-4005-4efa-9f61-78402c013259', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.704 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2da0d7-a7ee-4e99-923c-b807a6166f45]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:b899'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518501, 'tstamp': 518501}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230586, 'error': None, 'target': 'ovnmeta-545204cc-4005-4efa-9f61-78402c013259', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.714 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74b68683-a1f0-476b-89e2-8e03f31bddfd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap545204cc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:b8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518501, 'reachable_time': 37452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230587, 'error': None, 'target': 'ovnmeta-545204cc-4005-4efa-9f61-78402c013259', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.734 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe69dbb-9367-40ce-ab37-afbdd7576c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.769 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a9fd2eb5-065e-46f8-9e2f-27723b74c9d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.770 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap545204cc-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.771 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.771 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap545204cc-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:13 np0005603622 kernel: tap545204cc-40: entered promiscuous mode
Jan 31 02:52:13 np0005603622 NetworkManager[49080]: <info>  [1769845933.7734] manager: (tap545204cc-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.772 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.779 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap545204cc-40, col_values=(('external_ids', {'iface-id': 'f5863ff4-e876-4327-ae9c-e2daeaf0d0ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.780 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:13Z|00100|binding|INFO|Releasing lport f5863ff4-e876-4327-ae9c-e2daeaf0d0ff from this chassis (sb_readonly=0)
Jan 31 02:52:13 np0005603622 nova_compute[221301]: 2026-01-31 07:52:13.785 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.785 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/545204cc-4005-4efa-9f61-78402c013259.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/545204cc-4005-4efa-9f61-78402c013259.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.786 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb529f0-f4fa-4aac-8569-1be8ffa2d203]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.787 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-545204cc-4005-4efa-9f61-78402c013259
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/545204cc-4005-4efa-9f61-78402c013259.pid.haproxy
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 545204cc-4005-4efa-9f61-78402c013259
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:52:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:13.787 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-545204cc-4005-4efa-9f61-78402c013259', 'env', 'PROCESS_TAG=haproxy-545204cc-4005-4efa-9f61-78402c013259', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/545204cc-4005-4efa-9f61-78402c013259.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:52:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:13.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:14 np0005603622 podman[230637]: 2026-01-31 07:52:14.071318016 +0000 UTC m=+0.022290239 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.192 221324 DEBUG nova.compute.manager [req-767b47cd-09c5-4c83-8431-c97b72ea9fd4 req-f65b4ee8-2c15-4e96-9533-57005df83746 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.194 221324 DEBUG oslo_concurrency.lockutils [req-767b47cd-09c5-4c83-8431-c97b72ea9fd4 req-f65b4ee8-2c15-4e96-9533-57005df83746 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.194 221324 DEBUG oslo_concurrency.lockutils [req-767b47cd-09c5-4c83-8431-c97b72ea9fd4 req-f65b4ee8-2c15-4e96-9533-57005df83746 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.195 221324 DEBUG oslo_concurrency.lockutils [req-767b47cd-09c5-4c83-8431-c97b72ea9fd4 req-f65b4ee8-2c15-4e96-9533-57005df83746 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.195 221324 DEBUG nova.compute.manager [req-767b47cd-09c5-4c83-8431-c97b72ea9fd4 req-f65b4ee8-2c15-4e96-9533-57005df83746 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Processing event network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:52:14 np0005603622 podman[230637]: 2026-01-31 07:52:14.266796222 +0000 UTC m=+0.217768395 container create a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:52:14 np0005603622 systemd[1]: Started libpod-conmon-a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153.scope.
Jan 31 02:52:14 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:52:14 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ba67b07a94eb75559be6a4ba1429823a9482bbbc04867c8ff431e9148fccf1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:14 np0005603622 podman[230637]: 2026-01-31 07:52:14.42956024 +0000 UTC m=+0.380532433 container init a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 02:52:14 np0005603622 podman[230637]: 2026-01-31 07:52:14.43775414 +0000 UTC m=+0.388726353 container start a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.440 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.440 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.442 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845934.442212, 104c67e8-fcdb-4437-9060-8d66250d4c46 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.443 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] VM Started (Lifecycle Event)#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.445 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.450 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:52:14 np0005603622 neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259[230675]: [NOTICE]   (230680) : New worker (230682) forked
Jan 31 02:52:14 np0005603622 neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259[230675]: [NOTICE]   (230680) : Loading success.
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.457 221324 INFO nova.virt.libvirt.driver [-] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Instance spawned successfully.#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.457 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.462 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.463 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.463 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.464 221324 DEBUG nova.virt.libvirt.driver [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.517 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.518 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.518 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.518 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.518 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.539 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.545 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.595 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.595 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845934.4424736, 104c67e8-fcdb-4437-9060-8d66250d4c46 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.596 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:52:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:52:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/9255750' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:52:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:52:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/9255750' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.633 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.638 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769845934.4490783, 104c67e8-fcdb-4437-9060-8d66250d4c46 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.639 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.706 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.710 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.716 221324 INFO nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Took 38.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.717 221324 DEBUG nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.754 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.867 221324 INFO nova.compute.manager [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Took 39.60 seconds to build instance.#033[00m
Jan 31 02:52:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:14.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:14 np0005603622 nova_compute[221301]: 2026-01-31 07:52:14.921 221324 DEBUG oslo_concurrency.lockutils [None req-491ad0f3-acda-4926-b978-b0cbcd9098a8 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 39.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:52:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2517363108' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.006 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.154 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000017 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.155 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000017 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.160 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.161 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.342 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.344 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4505MB free_disk=20.83074951171875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.345 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.345 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:52:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.452 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.453 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 104c67e8-fcdb-4437-9060-8d66250d4c46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.453 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.453 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:52:15 np0005603622 nova_compute[221301]: 2026-01-31 07:52:15.587 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:15.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:52:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/814200090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.055 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.062 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.096 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.174 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.174 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.893 221324 DEBUG nova.compute.manager [req-1abfc5d8-a0ba-489e-adc2-d78833d88df8 req-ece13b90-bbcd-4272-ac30-8389d559aa24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.894 221324 DEBUG oslo_concurrency.lockutils [req-1abfc5d8-a0ba-489e-adc2-d78833d88df8 req-ece13b90-bbcd-4272-ac30-8389d559aa24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.895 221324 DEBUG oslo_concurrency.lockutils [req-1abfc5d8-a0ba-489e-adc2-d78833d88df8 req-ece13b90-bbcd-4272-ac30-8389d559aa24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.895 221324 DEBUG oslo_concurrency.lockutils [req-1abfc5d8-a0ba-489e-adc2-d78833d88df8 req-ece13b90-bbcd-4272-ac30-8389d559aa24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.895 221324 DEBUG nova.compute.manager [req-1abfc5d8-a0ba-489e-adc2-d78833d88df8 req-ece13b90-bbcd-4272-ac30-8389d559aa24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] No waiting events found dispatching network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:52:16 np0005603622 nova_compute[221301]: 2026-01-31 07:52:16.896 221324 WARNING nova.compute.manager [req-1abfc5d8-a0ba-489e-adc2-d78833d88df8 req-ece13b90-bbcd-4272-ac30-8389d559aa24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received unexpected event network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 for instance with vm_state active and task_state None.#033[00m
Jan 31 02:52:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:16.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:17 np0005603622 nova_compute[221301]: 2026-01-31 07:52:17.130 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:52:17 np0005603622 nova_compute[221301]: 2026-01-31 07:52:17.231 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:17.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:18.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:19.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:20.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:21 np0005603622 nova_compute[221301]: 2026-01-31 07:52:21.218 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2222] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/49)
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2230] device (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <warn>  [1769845941.2232] device (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2240] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/50)
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2245] device (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <warn>  [1769845941.2246] device (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2252] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2264] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2271] device (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 02:52:21 np0005603622 NetworkManager[49080]: <info>  [1769845941.2275] device (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 31 02:52:21 np0005603622 nova_compute[221301]: 2026-01-31 07:52:21.286 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:21 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:21Z|00101|binding|INFO|Releasing lport f5863ff4-e876-4327-ae9c-e2daeaf0d0ff from this chassis (sb_readonly=0)
Jan 31 02:52:21 np0005603622 nova_compute[221301]: 2026-01-31 07:52:21.302 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:21 np0005603622 nova_compute[221301]: 2026-01-31 07:52:21.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:21 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:21Z|00102|binding|INFO|Releasing lport f5863ff4-e876-4327-ae9c-e2daeaf0d0ff from this chassis (sb_readonly=0)
Jan 31 02:52:21 np0005603622 nova_compute[221301]: 2026-01-31 07:52:21.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:21.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:22 np0005603622 podman[230787]: 2026-01-31 07:52:22.182526757 +0000 UTC m=+0.098760281 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 02:52:22 np0005603622 nova_compute[221301]: 2026-01-31 07:52:22.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:22 np0005603622 nova_compute[221301]: 2026-01-31 07:52:22.324 221324 DEBUG nova.compute.manager [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-changed-7aad9859-08d5-41b1-aa3a-9879376fb833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:52:22 np0005603622 nova_compute[221301]: 2026-01-31 07:52:22.325 221324 DEBUG nova.compute.manager [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Refreshing instance network info cache due to event network-changed-7aad9859-08d5-41b1-aa3a-9879376fb833. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:52:22 np0005603622 nova_compute[221301]: 2026-01-31 07:52:22.326 221324 DEBUG oslo_concurrency.lockutils [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:52:22 np0005603622 nova_compute[221301]: 2026-01-31 07:52:22.326 221324 DEBUG oslo_concurrency.lockutils [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:52:22 np0005603622 nova_compute[221301]: 2026-01-31 07:52:22.327 221324 DEBUG nova.network.neutron [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Refreshing network info cache for port 7aad9859-08d5-41b1-aa3a-9879376fb833 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:52:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:22.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:23 np0005603622 nova_compute[221301]: 2026-01-31 07:52:23.530 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:23.530 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:52:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:23.531 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:52:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:23.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:24.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:25.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:26 np0005603622 nova_compute[221301]: 2026-01-31 07:52:26.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:26.535 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:26.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:26 np0005603622 nova_compute[221301]: 2026-01-31 07:52:26.938 221324 DEBUG nova.network.neutron [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Updated VIF entry in instance network info cache for port 7aad9859-08d5-41b1-aa3a-9879376fb833. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:52:26 np0005603622 nova_compute[221301]: 2026-01-31 07:52:26.938 221324 DEBUG nova.network.neutron [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Updating instance_info_cache with network_info: [{"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:52:26 np0005603622 nova_compute[221301]: 2026-01-31 07:52:26.998 221324 DEBUG oslo_concurrency.lockutils [req-59347f31-2913-4475-90ca-e5819f055fdb req-16ed4745-e718-4e4f-acc6-b42fe06b1c79 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-104c67e8-fcdb-4437-9060-8d66250d4c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:52:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 02:52:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 4982 writes, 26K keys, 4982 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 4982 writes, 4982 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1489 writes, 7194 keys, 1489 commit groups, 1.0 writes per commit group, ingest: 15.37 MB, 0.03 MB/s#012Interval WAL: 1489 writes, 1489 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     44.6      0.68              0.08        14    0.049       0      0       0.0       0.0#012  L6      1/0    8.85 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.6     71.5     59.7      1.82              0.31        13    0.140     61K   6812       0.0       0.0#012 Sum      1/0    8.85 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.6     52.0     55.6      2.51              0.40        27    0.093     61K   6812       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   5.2     34.5     35.4      1.48              0.15        10    0.148     26K   2530       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     71.5     59.7      1.82              0.31        13    0.140     61K   6812       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     44.7      0.68              0.08        13    0.052       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.030, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.08 MB/s write, 0.13 GB read, 0.07 MB/s read, 2.5 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 1.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 12.30 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000214 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(711,11.80 MB,3.88317%) FilterBlock(27,177.42 KB,0.0569946%) IndexBlock(27,334.36 KB,0.107409%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 02:52:27 np0005603622 nova_compute[221301]: 2026-01-31 07:52:27.267 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:27.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:28 np0005603622 podman[230814]: 2026-01-31 07:52:28.175341639 +0000 UTC m=+0.088105345 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:52:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:28 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:28Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:f7:da 10.100.0.11
Jan 31 02:52:28 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:28Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:f7:da 10.100.0.11
Jan 31 02:52:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:28.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:29.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:30.845 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:30.847 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:30.849 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:30.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:31 np0005603622 nova_compute[221301]: 2026-01-31 07:52:31.452 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:31.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:32 np0005603622 nova_compute[221301]: 2026-01-31 07:52:32.272 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:32.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:52:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:33.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:52:34 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:34Z|00103|binding|INFO|Releasing lport f5863ff4-e876-4327-ae9c-e2daeaf0d0ff from this chassis (sb_readonly=0)
Jan 31 02:52:34 np0005603622 nova_compute[221301]: 2026-01-31 07:52:34.478 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:34.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.870 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.870 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.871 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.871 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.871 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.874 221324 INFO nova.compute.manager [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Terminating instance#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.875 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.875 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquired lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:52:35 np0005603622 nova_compute[221301]: 2026-01-31 07:52:35.876 221324 DEBUG nova.network.neutron [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:52:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:35.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:36 np0005603622 nova_compute[221301]: 2026-01-31 07:52:36.373 221324 DEBUG nova.network.neutron [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:52:36 np0005603622 nova_compute[221301]: 2026-01-31 07:52:36.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:36.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:37 np0005603622 nova_compute[221301]: 2026-01-31 07:52:37.270 221324 DEBUG nova.network.neutron [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:52:37 np0005603622 nova_compute[221301]: 2026-01-31 07:52:37.275 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:37 np0005603622 nova_compute[221301]: 2026-01-31 07:52:37.295 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Releasing lock "refresh_cache-39aa59fc-0e1c-4a01-860c-a7ff643e442f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:52:37 np0005603622 nova_compute[221301]: 2026-01-31 07:52:37.298 221324 DEBUG nova.compute.manager [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:52:37 np0005603622 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 31 02:52:37 np0005603622 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Consumed 21.378s CPU time.
Jan 31 02:52:37 np0005603622 systemd-machined[190406]: Machine qemu-8-instance-0000000f terminated.
Jan 31 02:52:37 np0005603622 nova_compute[221301]: 2026-01-31 07:52:37.727 221324 INFO nova.virt.libvirt.driver [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance destroyed successfully.#033[00m
Jan 31 02:52:37 np0005603622 nova_compute[221301]: 2026-01-31 07:52:37.729 221324 DEBUG nova.objects.instance [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lazy-loading 'resources' on Instance uuid 39aa59fc-0e1c-4a01-860c-a7ff643e442f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:52:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:37.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:38.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.544 221324 INFO nova.virt.libvirt.driver [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Deleting instance files /var/lib/nova/instances/39aa59fc-0e1c-4a01-860c-a7ff643e442f_del#033[00m
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.545 221324 INFO nova.virt.libvirt.driver [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Deletion of /var/lib/nova/instances/39aa59fc-0e1c-4a01-860c-a7ff643e442f_del complete#033[00m
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.633 221324 INFO nova.compute.manager [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Took 2.33 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.633 221324 DEBUG oslo.service.loopingcall [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.634 221324 DEBUG nova.compute.manager [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.634 221324 DEBUG nova.network.neutron [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:52:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:39.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.949 221324 DEBUG nova.network.neutron [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.967 221324 DEBUG nova.network.neutron [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:52:39 np0005603622 nova_compute[221301]: 2026-01-31 07:52:39.991 221324 INFO nova.compute.manager [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Took 0.36 seconds to deallocate network for instance.#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.149 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.150 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.256 221324 DEBUG oslo_concurrency.processutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.634 221324 DEBUG oslo_concurrency.lockutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.635 221324 DEBUG oslo_concurrency.lockutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:52:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1621179879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.675 221324 DEBUG nova.objects.instance [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lazy-loading 'flavor' on Instance uuid 104c67e8-fcdb-4437-9060-8d66250d4c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.679 221324 DEBUG oslo_concurrency.processutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.687 221324 DEBUG nova.compute.provider_tree [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.733 221324 DEBUG nova.scheduler.client.report [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.779 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.816 221324 DEBUG oslo_concurrency.lockutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:40 np0005603622 nova_compute[221301]: 2026-01-31 07:52:40.865 221324 INFO nova.scheduler.client.report [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Deleted allocations for instance 39aa59fc-0e1c-4a01-860c-a7ff643e442f#033[00m
Jan 31 02:52:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:40.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:41 np0005603622 nova_compute[221301]: 2026-01-31 07:52:41.137 221324 DEBUG oslo_concurrency.lockutils [None req-247d7628-eb1d-4d15-8f33-b594563adebb 8a59efd78e244f44a1c70650f82a2c50 1627a71b855b4032b51e234e44a9d570 - - default default] Lock "39aa59fc-0e1c-4a01-860c-a7ff643e442f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:41 np0005603622 nova_compute[221301]: 2026-01-31 07:52:41.498 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:41 np0005603622 nova_compute[221301]: 2026-01-31 07:52:41.738 221324 DEBUG oslo_concurrency.lockutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:41 np0005603622 nova_compute[221301]: 2026-01-31 07:52:41.739 221324 DEBUG oslo_concurrency.lockutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:41 np0005603622 nova_compute[221301]: 2026-01-31 07:52:41.739 221324 INFO nova.compute.manager [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Attaching volume 7df70e23-e53e-4df4-8aa9-3917f141bf82 to /dev/sdc#033[00m
Jan 31 02:52:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:41.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.194 221324 DEBUG os_brick.utils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.197 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.211 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.211 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[13513c7a-dde9-46a0-acea-89279b59daf3]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.213 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.221 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.222 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4aa277-d4e7-4b00-915f-a9fa4e38ca82]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.223 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.235 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.235 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccca702-75be-44f3-87d7-cfaaa2cf2915]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.237 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca5c29d-4484-42fa-91c2-8b29e4a6813d]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.237 221324 DEBUG oslo_concurrency.processutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.260 221324 DEBUG oslo_concurrency.processutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.264 221324 DEBUG os_brick.initiator.connectors.lightos [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.264 221324 DEBUG os_brick.initiator.connectors.lightos [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.265 221324 DEBUG os_brick.initiator.connectors.lightos [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.265 221324 DEBUG os_brick.utils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] <== get_connector_properties: return (71ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.266 221324 DEBUG nova.virt.block_device [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Updating existing volume attachment record: bfa23ad9-7293-4402-b518-12c0a1ae25da _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 02:52:42 np0005603622 nova_compute[221301]: 2026-01-31 07:52:42.274 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:52:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:42.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:52:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:43 np0005603622 nova_compute[221301]: 2026-01-31 07:52:43.528 221324 DEBUG nova.objects.instance [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lazy-loading 'flavor' on Instance uuid 104c67e8-fcdb-4437-9060-8d66250d4c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:52:43 np0005603622 nova_compute[221301]: 2026-01-31 07:52:43.609 221324 DEBUG nova.virt.libvirt.guest [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-7df70e23-e53e-4df4-8aa9-3917f141bf82">
Jan 31 02:52:43 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  </source>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 02:52:43 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  </auth>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  <target dev="sdc" bus="scsi"/>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  <serial>7df70e23-e53e-4df4-8aa9-3917f141bf82</serial>
Jan 31 02:52:43 np0005603622 nova_compute[221301]:  <address type="drive" controller="0" unit="2"/>
Jan 31 02:52:43 np0005603622 nova_compute[221301]: </disk>
Jan 31 02:52:43 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 02:52:43 np0005603622 nova_compute[221301]: 2026-01-31 07:52:43.783 221324 DEBUG nova.virt.libvirt.driver [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:52:43 np0005603622 nova_compute[221301]: 2026-01-31 07:52:43.784 221324 DEBUG nova.virt.libvirt.driver [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:52:43 np0005603622 nova_compute[221301]: 2026-01-31 07:52:43.784 221324 DEBUG nova.virt.libvirt.driver [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] No BDM found with device name sdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:52:43 np0005603622 nova_compute[221301]: 2026-01-31 07:52:43.784 221324 DEBUG nova.virt.libvirt.driver [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] No VIF found with MAC fa:16:3e:a7:f7:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:52:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:43.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:44.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:45 np0005603622 nova_compute[221301]: 2026-01-31 07:52:45.101 221324 DEBUG oslo_concurrency.lockutils [None req-880b7074-f814-4985-84a1-7fbc136d0902 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:45.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:46 np0005603622 nova_compute[221301]: 2026-01-31 07:52:46.502 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:46.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.228 221324 DEBUG oslo_concurrency.lockutils [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.228 221324 DEBUG oslo_concurrency.lockutils [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.254 221324 INFO nova.compute.manager [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Detaching volume 7df70e23-e53e-4df4-8aa9-3917f141bf82#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.308 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.546 221324 INFO nova.virt.block_device [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Attempting to driver detach volume 7df70e23-e53e-4df4-8aa9-3917f141bf82 from mountpoint /dev/sdc#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.557 221324 DEBUG nova.virt.libvirt.driver [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Attempting to detach device sdc from instance 104c67e8-fcdb-4437-9060-8d66250d4c46 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.558 221324 DEBUG nova.virt.libvirt.guest [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-7df70e23-e53e-4df4-8aa9-3917f141bf82">
Jan 31 02:52:47 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  </source>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <target dev="sdc" bus="scsi"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <serial>7df70e23-e53e-4df4-8aa9-3917f141bf82</serial>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <address type="drive" controller="0" bus="0" target="0" unit="2"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]: </disk>
Jan 31 02:52:47 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.591 221324 INFO nova.virt.libvirt.driver [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Successfully detached device sdc from instance 104c67e8-fcdb-4437-9060-8d66250d4c46 from the persistent domain config.#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.594 221324 DEBUG nova.virt.libvirt.driver [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] (1/8): Attempting to detach device sdc with device alias scsi0-0-0-2 from instance 104c67e8-fcdb-4437-9060-8d66250d4c46 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.595 221324 DEBUG nova.virt.libvirt.guest [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-7df70e23-e53e-4df4-8aa9-3917f141bf82">
Jan 31 02:52:47 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  </source>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <target dev="sdc" bus="scsi"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <serial>7df70e23-e53e-4df4-8aa9-3917f141bf82</serial>
Jan 31 02:52:47 np0005603622 nova_compute[221301]:  <address type="drive" controller="0" bus="0" target="0" unit="2"/>
Jan 31 02:52:47 np0005603622 nova_compute[221301]: </disk>
Jan 31 02:52:47 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.721 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769845967.7207994, 104c67e8-fcdb-4437-9060-8d66250d4c46 => scsi0-0-0-2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.722 221324 DEBUG nova.virt.libvirt.driver [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Start waiting for the detach event from libvirt for device sdc with device alias scsi0-0-0-2 for instance 104c67e8-fcdb-4437-9060-8d66250d4c46 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 02:52:47 np0005603622 nova_compute[221301]: 2026-01-31 07:52:47.725 221324 INFO nova.virt.libvirt.driver [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Successfully detached device sdc from instance 104c67e8-fcdb-4437-9060-8d66250d4c46 from the live domain config.#033[00m
Jan 31 02:52:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:47.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:48 np0005603622 nova_compute[221301]: 2026-01-31 07:52:48.454 221324 DEBUG nova.objects.instance [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lazy-loading 'flavor' on Instance uuid 104c67e8-fcdb-4437-9060-8d66250d4c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:52:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:48 np0005603622 nova_compute[221301]: 2026-01-31 07:52:48.524 221324 DEBUG oslo_concurrency.lockutils [None req-788724ac-d81d-4b24-bdd1-2b20ab077bdd 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:48.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:49.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:50.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.380 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.381 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.381 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.381 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.381 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.382 221324 INFO nova.compute.manager [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Terminating instance#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.383 221324 DEBUG nova.compute.manager [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.506 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:51 np0005603622 kernel: tap7aad9859-08 (unregistering): left promiscuous mode
Jan 31 02:52:51 np0005603622 NetworkManager[49080]: <info>  [1769845971.7740] device (tap7aad9859-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.773 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.784 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:51 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:51Z|00104|binding|INFO|Releasing lport 7aad9859-08d5-41b1-aa3a-9879376fb833 from this chassis (sb_readonly=0)
Jan 31 02:52:51 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:51Z|00105|binding|INFO|Setting lport 7aad9859-08d5-41b1-aa3a-9879376fb833 down in Southbound
Jan 31 02:52:51 np0005603622 ovn_controller[130109]: 2026-01-31T07:52:51Z|00106|binding|INFO|Removing iface tap7aad9859-08 ovn-installed in OVS
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.787 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:51 np0005603622 nova_compute[221301]: 2026-01-31 07:52:51.791 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:51.800 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:f7:da 10.100.0.11'], port_security=['fa:16:3e:a7:f7:da 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '104c67e8-fcdb-4437-9060-8d66250d4c46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-545204cc-4005-4efa-9f61-78402c013259', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea1f64b5b90d4a90a6cb764c4cdcca59', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba228104-4430-4a80-9d82-20035205851a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0ca3a0-be06-4a81-8971-bf69a1384288, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=7aad9859-08d5-41b1-aa3a-9879376fb833) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:52:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:51.801 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 7aad9859-08d5-41b1-aa3a-9879376fb833 in datapath 545204cc-4005-4efa-9f61-78402c013259 unbound from our chassis#033[00m
Jan 31 02:52:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:51.803 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 545204cc-4005-4efa-9f61-78402c013259, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:52:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:51.804 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b324608b-ac7d-4fb5-9e99-c7d901ffcb8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:51.805 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-545204cc-4005-4efa-9f61-78402c013259 namespace which is not needed anymore#033[00m
Jan 31 02:52:51 np0005603622 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 31 02:52:51 np0005603622 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Consumed 14.965s CPU time.
Jan 31 02:52:51 np0005603622 systemd-machined[190406]: Machine qemu-12-instance-00000017 terminated.
Jan 31 02:52:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:51.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:51 np0005603622 neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259[230675]: [NOTICE]   (230680) : haproxy version is 2.8.14-c23fe91
Jan 31 02:52:51 np0005603622 neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259[230675]: [NOTICE]   (230680) : path to executable is /usr/sbin/haproxy
Jan 31 02:52:51 np0005603622 neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259[230675]: [WARNING]  (230680) : Exiting Master process...
Jan 31 02:52:51 np0005603622 neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259[230675]: [ALERT]    (230680) : Current worker (230682) exited with code 143 (Terminated)
Jan 31 02:52:51 np0005603622 neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259[230675]: [WARNING]  (230680) : All workers exited. Exiting... (0)
Jan 31 02:52:51 np0005603622 systemd[1]: libpod-a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153.scope: Deactivated successfully.
Jan 31 02:52:51 np0005603622 podman[230930]: 2026-01-31 07:52:51.99022369 +0000 UTC m=+0.102049498 container died a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.019 221324 INFO nova.virt.libvirt.driver [-] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Instance destroyed successfully.#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.021 221324 DEBUG nova.objects.instance [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lazy-loading 'resources' on Instance uuid 104c67e8-fcdb-4437-9060-8d66250d4c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.043 221324 DEBUG nova.virt.libvirt.vif [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:51:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1338764688',display_name='tempest-AttachSCSIVolumeTestJSON-server-1338764688',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-1338764688',id=23,image_ref='e0fa43e0-9f79-4992-bc07-130eef6d5592',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLEAYGAQGZPVYr3zBFF3S3S+ffBjem+DIIT0q0YvmJf/PpnnYZ/mpMrcMLuITZe4Vw6fqp9NHNWE7S1vV97FsPcL/y3psuRr5A22EUZj130xa5C4EXScW+eZYfLJZ4+Xdw==',key_name='tempest-keypair-485795967',keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:52:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ea1f64b5b90d4a90a6cb764c4cdcca59',ramdisk_id='',reservation_id='r-8xrgtxlo',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e0fa43e0-9f79-4992-bc07-130eef6d5592',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-948431203',owner_user_name='tempest-AttachSCSIVolumeTestJSON-948431203-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:52:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='69d26c1015144fbda08ee4b9db0d7b70',uuid=104c67e8-fcdb-4437-9060-8d66250d4c46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.044 221324 DEBUG nova.network.os_vif_util [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Converting VIF {"id": "7aad9859-08d5-41b1-aa3a-9879376fb833", "address": "fa:16:3e:a7:f7:da", "network": {"id": "545204cc-4005-4efa-9f61-78402c013259", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1252039387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea1f64b5b90d4a90a6cb764c4cdcca59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad9859-08", "ovs_interfaceid": "7aad9859-08d5-41b1-aa3a-9879376fb833", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.045 221324 DEBUG nova.network.os_vif_util [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:f7:da,bridge_name='br-int',has_traffic_filtering=True,id=7aad9859-08d5-41b1-aa3a-9879376fb833,network=Network(545204cc-4005-4efa-9f61-78402c013259),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad9859-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.045 221324 DEBUG os_vif [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:f7:da,bridge_name='br-int',has_traffic_filtering=True,id=7aad9859-08d5-41b1-aa3a-9879376fb833,network=Network(545204cc-4005-4efa-9f61-78402c013259),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad9859-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.048 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.048 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7aad9859-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.051 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.057 221324 INFO os_vif [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:f7:da,bridge_name='br-int',has_traffic_filtering=True,id=7aad9859-08d5-41b1-aa3a-9879376fb833,network=Network(545204cc-4005-4efa-9f61-78402c013259),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad9859-08')#033[00m
Jan 31 02:52:52 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153-userdata-shm.mount: Deactivated successfully.
Jan 31 02:52:52 np0005603622 systemd[1]: var-lib-containers-storage-overlay-f9ba67b07a94eb75559be6a4ba1429823a9482bbbc04867c8ff431e9148fccf1-merged.mount: Deactivated successfully.
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.310 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:52 np0005603622 podman[230930]: 2026-01-31 07:52:52.349756122 +0000 UTC m=+0.461581940 container cleanup a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Jan 31 02:52:52 np0005603622 systemd[1]: libpod-conmon-a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153.scope: Deactivated successfully.
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.475 221324 DEBUG nova.compute.manager [req-82f32b40-9c7b-4880-8bcf-2e6be5115c98 req-dc121ca2-9cf4-4453-abb1-17d4125988a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-vif-unplugged-7aad9859-08d5-41b1-aa3a-9879376fb833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.476 221324 DEBUG oslo_concurrency.lockutils [req-82f32b40-9c7b-4880-8bcf-2e6be5115c98 req-dc121ca2-9cf4-4453-abb1-17d4125988a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.476 221324 DEBUG oslo_concurrency.lockutils [req-82f32b40-9c7b-4880-8bcf-2e6be5115c98 req-dc121ca2-9cf4-4453-abb1-17d4125988a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.477 221324 DEBUG oslo_concurrency.lockutils [req-82f32b40-9c7b-4880-8bcf-2e6be5115c98 req-dc121ca2-9cf4-4453-abb1-17d4125988a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.477 221324 DEBUG nova.compute.manager [req-82f32b40-9c7b-4880-8bcf-2e6be5115c98 req-dc121ca2-9cf4-4453-abb1-17d4125988a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] No waiting events found dispatching network-vif-unplugged-7aad9859-08d5-41b1-aa3a-9879376fb833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.477 221324 DEBUG nova.compute.manager [req-82f32b40-9c7b-4880-8bcf-2e6be5115c98 req-dc121ca2-9cf4-4453-abb1-17d4125988a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-vif-unplugged-7aad9859-08d5-41b1-aa3a-9879376fb833 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.725 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845957.7238321, 39aa59fc-0e1c-4a01-860c-a7ff643e442f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.726 221324 INFO nova.compute.manager [-] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.797 221324 DEBUG nova.compute.manager [None req-b24e02e5-781c-4470-a770-62d8550c18f0 - - - - - -] [instance: 39aa59fc-0e1c-4a01-860c-a7ff643e442f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:52:52 np0005603622 podman[230987]: 2026-01-31 07:52:52.927079074 +0000 UTC m=+0.556652109 container remove a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.932 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3d97c6d2-9364-4d7a-b56e-5e64804f92f2]: (4, ('Sat Jan 31 07:52:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259 (a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153)\na3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153\nSat Jan 31 07:52:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-545204cc-4005-4efa-9f61-78402c013259 (a3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153)\na3b798ceed91424aa09414bb1690b98fcb4931db87128999ecfcd3ad175a5153\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.935 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5cad2778-e176-4a30-bba1-b0256a5344b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.936 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap545204cc-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:52 np0005603622 kernel: tap545204cc-40: left promiscuous mode
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.944 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8f523a40-fc9e-4264-955a-2e918e3028fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:52 np0005603622 nova_compute[221301]: 2026-01-31 07:52:52.949 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.960 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[16e87aa2-05a3-42bd-9d78-4539950f19ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.962 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d05fa381-5e9c-4faf-a759-914e5635da54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:52.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.978 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[035d7b0a-a637-4042-ae24-6d6d049b3720]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518495, 'reachable_time': 40110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231015, 'error': None, 'target': 'ovnmeta-545204cc-4005-4efa-9f61-78402c013259', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.981 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-545204cc-4005-4efa-9f61-78402c013259 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:52:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:52:52.981 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[a09a0f8b-3213-4ef3-af37-cfe28e08a437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:52:52 np0005603622 systemd[1]: run-netns-ovnmeta\x2d545204cc\x2d4005\x2d4efa\x2d9f61\x2d78402c013259.mount: Deactivated successfully.
Jan 31 02:52:53 np0005603622 podman[230988]: 2026-01-31 07:52:53.021127377 +0000 UTC m=+0.637493467 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:52:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:52:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:53.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:52:54 np0005603622 nova_compute[221301]: 2026-01-31 07:52:54.704 221324 DEBUG nova.compute.manager [req-7afa0cc8-0d7a-4ebf-a72f-e4d09173d367 req-107ccf28-5c04-488c-b536-e1cbe9225a4a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:52:54 np0005603622 nova_compute[221301]: 2026-01-31 07:52:54.705 221324 DEBUG oslo_concurrency.lockutils [req-7afa0cc8-0d7a-4ebf-a72f-e4d09173d367 req-107ccf28-5c04-488c-b536-e1cbe9225a4a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:52:54 np0005603622 nova_compute[221301]: 2026-01-31 07:52:54.705 221324 DEBUG oslo_concurrency.lockutils [req-7afa0cc8-0d7a-4ebf-a72f-e4d09173d367 req-107ccf28-5c04-488c-b536-e1cbe9225a4a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:52:54 np0005603622 nova_compute[221301]: 2026-01-31 07:52:54.705 221324 DEBUG oslo_concurrency.lockutils [req-7afa0cc8-0d7a-4ebf-a72f-e4d09173d367 req-107ccf28-5c04-488c-b536-e1cbe9225a4a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:52:54 np0005603622 nova_compute[221301]: 2026-01-31 07:52:54.706 221324 DEBUG nova.compute.manager [req-7afa0cc8-0d7a-4ebf-a72f-e4d09173d367 req-107ccf28-5c04-488c-b536-e1cbe9225a4a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] No waiting events found dispatching network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:52:54 np0005603622 nova_compute[221301]: 2026-01-31 07:52:54.706 221324 WARNING nova.compute.manager [req-7afa0cc8-0d7a-4ebf-a72f-e4d09173d367 req-107ccf28-5c04-488c-b536-e1cbe9225a4a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received unexpected event network-vif-plugged-7aad9859-08d5-41b1-aa3a-9879376fb833 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:52:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:52:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:54.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:52:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:55.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:56 np0005603622 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 31 02:52:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:56.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:57 np0005603622 nova_compute[221301]: 2026-01-31 07:52:57.052 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:57 np0005603622 nova_compute[221301]: 2026-01-31 07:52:57.313 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:52:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:57.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:52:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:52:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:52:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:52:58.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:52:59 np0005603622 podman[231031]: 2026-01-31 07:52:59.144205405 +0000 UTC m=+0.066083794 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 02:52:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:52:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:52:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:52:59.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:00.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.057 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.108 221324 INFO nova.virt.libvirt.driver [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Deleting instance files /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46_del
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.109 221324 INFO nova.virt.libvirt.driver [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Deletion of /var/lib/nova/instances/104c67e8-fcdb-4437-9060-8d66250d4c46_del complete
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.187 221324 INFO nova.compute.manager [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Took 10.80 seconds to destroy the instance on the hypervisor.
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.187 221324 DEBUG oslo.service.loopingcall [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.187 221324 DEBUG nova.compute.manager [-] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.188 221324 DEBUG nova.network.neutron [-] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 02:53:02 np0005603622 nova_compute[221301]: 2026-01-31 07:53:02.315 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:02.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:03.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:04 np0005603622 nova_compute[221301]: 2026-01-31 07:53:04.591 221324 DEBUG nova.network.neutron [-] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 02:53:04 np0005603622 nova_compute[221301]: 2026-01-31 07:53:04.633 221324 INFO nova.compute.manager [-] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Took 2.45 seconds to deallocate network for instance.
Jan 31 02:53:04 np0005603622 nova_compute[221301]: 2026-01-31 07:53:04.715 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:53:04 np0005603622 nova_compute[221301]: 2026-01-31 07:53:04.715 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:53:04 np0005603622 nova_compute[221301]: 2026-01-31 07:53:04.739 221324 DEBUG nova.compute.manager [req-2274b9db-5664-4c07-a721-8e5e43a846b7 req-01b4923d-72eb-437e-86bb-c8e6fbb1a391 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Received event network-vif-deleted-7aad9859-08d5-41b1-aa3a-9879376fb833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 02:53:04 np0005603622 nova_compute[221301]: 2026-01-31 07:53:04.898 221324 DEBUG oslo_concurrency.processutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:04.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3561225786' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:05 np0005603622 nova_compute[221301]: 2026-01-31 07:53:05.403 221324 DEBUG oslo_concurrency.processutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:05 np0005603622 nova_compute[221301]: 2026-01-31 07:53:05.409 221324 DEBUG nova.compute.provider_tree [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:53:05 np0005603622 nova_compute[221301]: 2026-01-31 07:53:05.439 221324 DEBUG nova.scheduler.client.report [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:53:05 np0005603622 nova_compute[221301]: 2026-01-31 07:53:05.482 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:05 np0005603622 nova_compute[221301]: 2026-01-31 07:53:05.535 221324 INFO nova.scheduler.client.report [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Deleted allocations for instance 104c67e8-fcdb-4437-9060-8d66250d4c46
Jan 31 02:53:05 np0005603622 nova_compute[221301]: 2026-01-31 07:53:05.650 221324 DEBUG oslo_concurrency.lockutils [None req-a945984d-a6b3-41c9-bbaa-6ca26ee72372 69d26c1015144fbda08ee4b9db0d7b70 ea1f64b5b90d4a90a6cb764c4cdcca59 - - default default] Lock "104c67e8-fcdb-4437-9060-8d66250d4c46" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 14.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:06.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:07 np0005603622 nova_compute[221301]: 2026-01-31 07:53:07.016 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769845972.0144148, 104c67e8-fcdb-4437-9060-8d66250d4c46 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:53:07 np0005603622 nova_compute[221301]: 2026-01-31 07:53:07.016 221324 INFO nova.compute.manager [-] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] VM Stopped (Lifecycle Event)
Jan 31 02:53:07 np0005603622 nova_compute[221301]: 2026-01-31 07:53:07.096 221324 DEBUG nova.compute.manager [None req-56a8f38a-8c5c-4ea2-8d42-ac4c27fb98e1 - - - - - -] [instance: 104c67e8-fcdb-4437-9060-8d66250d4c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:53:07 np0005603622 nova_compute[221301]: 2026-01-31 07:53:07.096 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:07 np0005603622 nova_compute[221301]: 2026-01-31 07:53:07.318 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:08.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:09.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e156 e156: 3 total, 3 up, 3 in
Jan 31 02:53:10 np0005603622 nova_compute[221301]: 2026-01-31 07:53:10.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:10 np0005603622 nova_compute[221301]: 2026-01-31 07:53:10.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 02:53:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:10.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:11.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:12 np0005603622 nova_compute[221301]: 2026-01-31 07:53:12.101 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:12 np0005603622 nova_compute[221301]: 2026-01-31 07:53:12.320 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:12 np0005603622 nova_compute[221301]: 2026-01-31 07:53:12.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:12.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:13 np0005603622 nova_compute[221301]: 2026-01-31 07:53:13.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:13 np0005603622 nova_compute[221301]: 2026-01-31 07:53:13.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:13 np0005603622 nova_compute[221301]: 2026-01-31 07:53:13.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:13.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:14 np0005603622 nova_compute[221301]: 2026-01-31 07:53:14.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:14 np0005603622 nova_compute[221301]: 2026-01-31 07:53:14.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 02:53:14 np0005603622 nova_compute[221301]: 2026-01-31 07:53:14.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 02:53:14 np0005603622 nova_compute[221301]: 2026-01-31 07:53:14.462 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 02:53:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:14.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:15 np0005603622 nova_compute[221301]: 2026-01-31 07:53:15.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:15 np0005603622 nova_compute[221301]: 2026-01-31 07:53:15.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:53:15.820 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 02:53:15 np0005603622 nova_compute[221301]: 2026-01-31 07:53:15.820 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:53:15.822 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 02:53:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:15.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:16 np0005603622 nova_compute[221301]: 2026-01-31 07:53:16.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:53:16 np0005603622 nova_compute[221301]: 2026-01-31 07:53:16.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:53:16 np0005603622 nova_compute[221301]: 2026-01-31 07:53:16.428 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:53:16 np0005603622 nova_compute[221301]: 2026-01-31 07:53:16.428 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:16 np0005603622 nova_compute[221301]: 2026-01-31 07:53:16.428 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 02:53:16 np0005603622 nova_compute[221301]: 2026-01-31 07:53:16.429 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1737104502' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:16 np0005603622 nova_compute[221301]: 2026-01-31 07:53:16.879 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:17.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.079 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.081 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4852MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.081 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.081 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.105 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.161 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.161 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.188 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:53:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1936486195' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.649 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.655 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.672 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.720 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 02:53:17 np0005603622 nova_compute[221301]: 2026-01-31 07:53:17.721 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:53:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:17.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:53:18.825 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:53:18 np0005603622 podman[231512]: 2026-01-31 07:53:18.753059512 +0000 UTC m=+0.030049107 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:53:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:19.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:19 np0005603622 podman[231512]: 2026-01-31 07:53:19.09347422 +0000 UTC m=+0.370463795 container create 1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_cannon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:53:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 e157: 3 total, 3 up, 3 in
Jan 31 02:53:19 np0005603622 systemd[1]: Started libpod-conmon-1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b.scope.
Jan 31 02:53:19 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:53:19 np0005603622 podman[231512]: 2026-01-31 07:53:19.260642994 +0000 UTC m=+0.537632589 container init 1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_cannon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:53:19 np0005603622 podman[231512]: 2026-01-31 07:53:19.268258898 +0000 UTC m=+0.545248473 container start 1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_cannon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:53:19 np0005603622 podman[231512]: 2026-01-31 07:53:19.273970071 +0000 UTC m=+0.550959726 container attach 1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_cannon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 02:53:19 np0005603622 sleepy_cannon[231528]: 167 167
Jan 31 02:53:19 np0005603622 systemd[1]: libpod-1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b.scope: Deactivated successfully.
Jan 31 02:53:19 np0005603622 podman[231512]: 2026-01-31 07:53:19.286914029 +0000 UTC m=+0.563903614 container died 1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_cannon, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 31 02:53:19 np0005603622 systemd[1]: var-lib-containers-storage-overlay-280b5b01f8f62284fa0a39c6cbbaec67d33e95362c0ab1b085918638ddcbb238-merged.mount: Deactivated successfully.
Jan 31 02:53:19 np0005603622 podman[231512]: 2026-01-31 07:53:19.329176382 +0000 UTC m=+0.606165957 container remove 1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_cannon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 31 02:53:19 np0005603622 systemd[1]: libpod-conmon-1ff3df5aa4b257a00ed14415e889fa22e7e6484740404153720a3131bda9275b.scope: Deactivated successfully.
Jan 31 02:53:19 np0005603622 podman[231552]: 2026-01-31 07:53:19.474268533 +0000 UTC m=+0.050727251 container create ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_lumiere, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:53:19 np0005603622 systemd[1]: Started libpod-conmon-ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9.scope.
Jan 31 02:53:19 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:53:19 np0005603622 podman[231552]: 2026-01-31 07:53:19.454337479 +0000 UTC m=+0.030796197 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 02:53:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50883f489a0771b74093b4c0d9694fc80062d43a72fab0484c800b04f5481732/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50883f489a0771b74093b4c0d9694fc80062d43a72fab0484c800b04f5481732/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50883f489a0771b74093b4c0d9694fc80062d43a72fab0484c800b04f5481732/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50883f489a0771b74093b4c0d9694fc80062d43a72fab0484c800b04f5481732/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 02:53:19 np0005603622 podman[231552]: 2026-01-31 07:53:19.57034232 +0000 UTC m=+0.146801048 container init ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_lumiere, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 31 02:53:19 np0005603622 podman[231552]: 2026-01-31 07:53:19.57669305 +0000 UTC m=+0.153151748 container start ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_lumiere, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 31 02:53:19 np0005603622 podman[231552]: 2026-01-31 07:53:19.581374585 +0000 UTC m=+0.157833283 container attach ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_lumiere, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 02:53:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:19.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:53:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/893751708' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:53:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:53:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/893751708' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]: [
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:    {
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "available": false,
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "ceph_device": false,
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "lsm_data": {},
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "lvs": [],
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "path": "/dev/sr0",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "rejected_reasons": [
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "Insufficient space (<5GB)",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "Has a FileSystem"
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        ],
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        "sys_api": {
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "actuators": null,
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "device_nodes": "sr0",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "devname": "sr0",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "human_readable_size": "482.00 KB",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "id_bus": "ata",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "model": "QEMU DVD-ROM",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "nr_requests": "2",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "parent": "/dev/sr0",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "partitions": {},
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "path": "/dev/sr0",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "removable": "1",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "rev": "2.5+",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "ro": "0",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "rotational": "1",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "sas_address": "",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "sas_device_handle": "",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "scheduler_mode": "mq-deadline",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "sectors": 0,
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "sectorsize": "2048",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "size": 493568.0,
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "support_discard": "2048",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "type": "disk",
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:            "vendor": "QEMU"
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:        }
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]:    }
Jan 31 02:53:20 np0005603622 nice_lumiere[231569]: ]
Jan 31 02:53:20 np0005603622 systemd[1]: libpod-ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9.scope: Deactivated successfully.
Jan 31 02:53:20 np0005603622 systemd[1]: libpod-ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9.scope: Consumed 1.284s CPU time.
Jan 31 02:53:20 np0005603622 podman[232829]: 2026-01-31 07:53:20.918535896 +0000 UTC m=+0.026586784 container died ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_lumiere, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 02:53:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:21.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:21 np0005603622 systemd[1]: var-lib-containers-storage-overlay-50883f489a0771b74093b4c0d9694fc80062d43a72fab0484c800b04f5481732-merged.mount: Deactivated successfully.
Jan 31 02:53:21 np0005603622 podman[232829]: 2026-01-31 07:53:21.10290714 +0000 UTC m=+0.210958028 container remove ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_lumiere, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 02:53:21 np0005603622 systemd[1]: libpod-conmon-ac62616c2bdb083290ccf5a551b557aafb84afccecb98620fb9f4637654580e9.scope: Deactivated successfully.
Jan 31 02:53:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:21.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:22 np0005603622 nova_compute[221301]: 2026-01-31 07:53:22.110 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:22 np0005603622 nova_compute[221301]: 2026-01-31 07:53:22.326 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:23.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:23 np0005603622 podman[232844]: 2026-01-31 07:53:23.231437212 +0000 UTC m=+0.144050284 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 02:53:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 02:53:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:23.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:25.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:25 np0005603622 nova_compute[221301]: 2026-01-31 07:53:25.744 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:25 np0005603622 nova_compute[221301]: 2026-01-31 07:53:25.806 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 02:53:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:25.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 02:53:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:53:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:53:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:27.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:27 np0005603622 nova_compute[221301]: 2026-01-31 07:53:27.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:27 np0005603622 nova_compute[221301]: 2026-01-31 07:53:27.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:27.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:28 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 31 02:53:28 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 31 02:53:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:29.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:29.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:30 np0005603622 podman[232871]: 2026-01-31 07:53:30.168418647 +0000 UTC m=+0.088800333 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 02:53:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:53:30.846 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:53:30.847 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:53:30.847 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:30 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 31 02:53:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.003000080s ======
Jan 31 02:53:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:31.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 31 02:53:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:31.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:32 np0005603622 nova_compute[221301]: 2026-01-31 07:53:32.118 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:32 np0005603622 nova_compute[221301]: 2026-01-31 07:53:32.329 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:33.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:33.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:35.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:53:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:35.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:36 np0005603622 nova_compute[221301]: 2026-01-31 07:53:36.713 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:36 np0005603622 nova_compute[221301]: 2026-01-31 07:53:36.714 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:36 np0005603622 nova_compute[221301]: 2026-01-31 07:53:36.740 221324 DEBUG nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:53:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:37.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.121 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.236 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.237 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.247 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.248 221324 INFO nova.compute.claims [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.361 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.438 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:53:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2613458752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.912 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.921 221324 DEBUG nova.compute.provider_tree [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.951 221324 DEBUG nova.scheduler.client.report [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:53:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:37.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.987 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:37 np0005603622 nova_compute[221301]: 2026-01-31 07:53:37.989 221324 DEBUG nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.132 221324 DEBUG nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.158 221324 INFO nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.178 221324 DEBUG nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.286 221324 DEBUG nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.288 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.288 221324 INFO nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Creating image(s)#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.328 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.380 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.429 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.434 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.506 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.507 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.508 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.508 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.540 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:38 np0005603622 nova_compute[221301]: 2026-01-31 07:53:38.544 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:39.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:39.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:40 np0005603622 nova_compute[221301]: 2026-01-31 07:53:40.513 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.969s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:40 np0005603622 nova_compute[221301]: 2026-01-31 07:53:40.590 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] resizing rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:53:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:41.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.606 221324 DEBUG nova.objects.instance [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lazy-loading 'migration_context' on Instance uuid 10bcd00c-3fce-4785-94ad-d6dde3c9cee9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.621 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.622 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Ensure instance console log exists: /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.623 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.623 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.623 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.625 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.630 221324 WARNING nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.640 221324 DEBUG nova.virt.libvirt.host [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.641 221324 DEBUG nova.virt.libvirt.host [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.644 221324 DEBUG nova.virt.libvirt.host [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.645 221324 DEBUG nova.virt.libvirt.host [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.646 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.646 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.647 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.647 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.647 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.648 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.648 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.648 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.648 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.649 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.649 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.649 221324 DEBUG nova.virt.hardware [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:53:41 np0005603622 nova_compute[221301]: 2026-01-31 07:53:41.652 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:41.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:53:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/951659092' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.099 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.135 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.141 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.160 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.364 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:53:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3800256277' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.573 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.575 221324 DEBUG nova.objects.instance [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lazy-loading 'pci_devices' on Instance uuid 10bcd00c-3fce-4785-94ad-d6dde3c9cee9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.620 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <uuid>10bcd00c-3fce-4785-94ad-d6dde3c9cee9</uuid>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <name>instance-0000001a</name>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerDiagnosticsV248Test-server-842541918</nova:name>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:53:41</nova:creationTime>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <nova:user uuid="784fa1eae9554e8ba1780d9a15d86b07">tempest-ServerDiagnosticsV248Test-1630997622-project-member</nova:user>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <nova:project uuid="4eaf18b9bc8c4afd86bc06639fac4764">tempest-ServerDiagnosticsV248Test-1630997622</nova:project>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <entry name="serial">10bcd00c-3fce-4785-94ad-d6dde3c9cee9</entry>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <entry name="uuid">10bcd00c-3fce-4785-94ad-d6dde3c9cee9</entry>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk.config">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/console.log" append="off"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:53:42 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:53:42 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:53:42 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:53:42 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.692 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.693 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.694 221324 INFO nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Using config drive#033[00m
Jan 31 02:53:42 np0005603622 nova_compute[221301]: 2026-01-31 07:53:42.721 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:43.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:43 np0005603622 nova_compute[221301]: 2026-01-31 07:53:43.145 221324 INFO nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Creating config drive at /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/disk.config#033[00m
Jan 31 02:53:43 np0005603622 nova_compute[221301]: 2026-01-31 07:53:43.151 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9f2picns execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:43 np0005603622 nova_compute[221301]: 2026-01-31 07:53:43.286 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9f2picns" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:43 np0005603622 nova_compute[221301]: 2026-01-31 07:53:43.328 221324 DEBUG nova.storage.rbd_utils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] rbd image 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:53:43 np0005603622 nova_compute[221301]: 2026-01-31 07:53:43.334 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/disk.config 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:53:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:43 np0005603622 nova_compute[221301]: 2026-01-31 07:53:43.560 221324 DEBUG oslo_concurrency.processutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/disk.config 10bcd00c-3fce-4785-94ad-d6dde3c9cee9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:53:43 np0005603622 nova_compute[221301]: 2026-01-31 07:53:43.561 221324 INFO nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Deleting local config drive /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9/disk.config because it was imported into RBD.#033[00m
Jan 31 02:53:43 np0005603622 systemd-machined[190406]: New machine qemu-13-instance-0000001a.
Jan 31 02:53:43 np0005603622 systemd[1]: Started Virtual Machine qemu-13-instance-0000001a.
Jan 31 02:53:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:43.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.366 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846024.3657029, 10bcd00c-3fce-4785-94ad-d6dde3c9cee9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.369 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.374 221324 DEBUG nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.374 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.381 221324 INFO nova.virt.libvirt.driver [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Instance spawned successfully.#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.382 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.402 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.412 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.418 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.419 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.420 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.420 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.421 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.422 221324 DEBUG nova.virt.libvirt.driver [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.465 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.466 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846024.3674076, 10bcd00c-3fce-4785-94ad-d6dde3c9cee9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.466 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] VM Started (Lifecycle Event)#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.510 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.516 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.552 221324 INFO nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Took 6.27 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.553 221324 DEBUG nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.556 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.639 221324 INFO nova.compute.manager [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Took 7.83 seconds to build instance.#033[00m
Jan 31 02:53:44 np0005603622 nova_compute[221301]: 2026-01-31 07:53:44.659 221324 DEBUG oslo_concurrency.lockutils [None req-6878b460-0c32-476a-a0f8-7ace09e6eec6 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:45.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:45.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:46 np0005603622 nova_compute[221301]: 2026-01-31 07:53:46.648 221324 DEBUG nova.compute.manager [None req-75581676-2dfc-4d1c-9037-39c744312129 b0ff126eb9b74deea0033ae0badab500 a5275ac0df7242498feffe6147cb00e4 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:53:46 np0005603622 nova_compute[221301]: 2026-01-31 07:53:46.652 221324 INFO nova.compute.manager [None req-75581676-2dfc-4d1c-9037-39c744312129 b0ff126eb9b74deea0033ae0badab500 a5275ac0df7242498feffe6147cb00e4 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Retrieving diagnostics#033[00m
Jan 31 02:53:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:53:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:47.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:53:47 np0005603622 nova_compute[221301]: 2026-01-31 07:53:47.165 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:47 np0005603622 nova_compute[221301]: 2026-01-31 07:53:47.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:47.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:49.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:49.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:51.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:52.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:52 np0005603622 nova_compute[221301]: 2026-01-31 07:53:52.170 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:52 np0005603622 nova_compute[221301]: 2026-01-31 07:53:52.367 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:53.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:54.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:54 np0005603622 podman[233309]: 2026-01-31 07:53:54.209039155 +0000 UTC m=+0.116997227 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:53:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:55.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:56.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:57.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.091 221324 DEBUG nova.compute.manager [None req-9c698bbd-c137-4b5d-bc3d-6914869b9e2c b0ff126eb9b74deea0033ae0badab500 a5275ac0df7242498feffe6147cb00e4 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.097 221324 INFO nova.compute.manager [None req-9c698bbd-c137-4b5d-bc3d-6914869b9e2c b0ff126eb9b74deea0033ae0badab500 a5275ac0df7242498feffe6147cb00e4 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Retrieving diagnostics#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.174 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.370 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.578 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.579 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.580 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.580 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.581 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.583 221324 INFO nova.compute.manager [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Terminating instance#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.585 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "refresh_cache-10bcd00c-3fce-4785-94ad-d6dde3c9cee9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.585 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquired lock "refresh_cache-10bcd00c-3fce-4785-94ad-d6dde3c9cee9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.586 221324 DEBUG nova.network.neutron [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:53:57 np0005603622 nova_compute[221301]: 2026-01-31 07:53:57.810 221324 DEBUG nova.network.neutron [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:53:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:53:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:53:58.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:53:58 np0005603622 nova_compute[221301]: 2026-01-31 07:53:58.164 221324 DEBUG nova.network.neutron [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:53:58 np0005603622 nova_compute[221301]: 2026-01-31 07:53:58.186 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Releasing lock "refresh_cache-10bcd00c-3fce-4785-94ad-d6dde3c9cee9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:53:58 np0005603622 nova_compute[221301]: 2026-01-31 07:53:58.187 221324 DEBUG nova.compute.manager [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:53:58 np0005603622 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 31 02:53:58 np0005603622 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Consumed 12.812s CPU time.
Jan 31 02:53:58 np0005603622 systemd-machined[190406]: Machine qemu-13-instance-0000001a terminated.
Jan 31 02:53:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:53:58 np0005603622 nova_compute[221301]: 2026-01-31 07:53:58.614 221324 INFO nova.virt.libvirt.driver [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Instance destroyed successfully.#033[00m
Jan 31 02:53:58 np0005603622 nova_compute[221301]: 2026-01-31 07:53:58.614 221324 DEBUG nova.objects.instance [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lazy-loading 'resources' on Instance uuid 10bcd00c-3fce-4785-94ad-d6dde3c9cee9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:53:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:53:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:53:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:53:59.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:00.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:01.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:01 np0005603622 podman[233359]: 2026-01-31 07:54:01.158243259 +0000 UTC m=+0.073838862 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:54:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:02.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:02 np0005603622 nova_compute[221301]: 2026-01-31 07:54:02.180 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:02 np0005603622 nova_compute[221301]: 2026-01-31 07:54:02.375 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:02 np0005603622 nova_compute[221301]: 2026-01-31 07:54:02.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:02.888 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:54:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:02.890 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:54:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:03.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:04.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:04.892 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:05.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:06.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:07.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:07 np0005603622 nova_compute[221301]: 2026-01-31 07:54:07.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:07 np0005603622 nova_compute[221301]: 2026-01-31 07:54:07.377 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:08.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 02:54:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:09.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 02:54:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:10.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:10 np0005603622 ceph-osd[79107]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 31 02:54:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:11.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:11 np0005603622 nova_compute[221301]: 2026-01-31 07:54:11.721 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:11 np0005603622 nova_compute[221301]: 2026-01-31 07:54:11.722 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:54:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:12.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:12 np0005603622 nova_compute[221301]: 2026-01-31 07:54:12.188 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:12 np0005603622 nova_compute[221301]: 2026-01-31 07:54:12.379 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:13.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:13 np0005603622 nova_compute[221301]: 2026-01-31 07:54:13.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:13 np0005603622 nova_compute[221301]: 2026-01-31 07:54:13.611 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846038.6098406, 10bcd00c-3fce-4785-94ad-d6dde3c9cee9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:54:13 np0005603622 nova_compute[221301]: 2026-01-31 07:54:13.611 221324 INFO nova.compute.manager [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:54:13 np0005603622 nova_compute[221301]: 2026-01-31 07:54:13.641 221324 DEBUG nova.compute.manager [None req-cefcccf8-4c31-4d53-bdd1-dc8e95c9a031 - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:54:13 np0005603622 nova_compute[221301]: 2026-01-31 07:54:13.647 221324 DEBUG nova.compute.manager [None req-cefcccf8-4c31-4d53-bdd1-dc8e95c9a031 - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:54:13 np0005603622 nova_compute[221301]: 2026-01-31 07:54:13.668 221324 INFO nova.compute.manager [None req-cefcccf8-4c31-4d53-bdd1-dc8e95c9a031 - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 02:54:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:14.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:14 np0005603622 nova_compute[221301]: 2026-01-31 07:54:14.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:14 np0005603622 nova_compute[221301]: 2026-01-31 07:54:14.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:14 np0005603622 nova_compute[221301]: 2026-01-31 07:54:14.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:54:14 np0005603622 nova_compute[221301]: 2026-01-31 07:54:14.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:54:14 np0005603622 nova_compute[221301]: 2026-01-31 07:54:14.428 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 02:54:14 np0005603622 nova_compute[221301]: 2026-01-31 07:54:14.428 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:54:14 np0005603622 nova_compute[221301]: 2026-01-31 07:54:14.428 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:15.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:15 np0005603622 nova_compute[221301]: 2026-01-31 07:54:15.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:15 np0005603622 nova_compute[221301]: 2026-01-31 07:54:15.437 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:16.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:16 np0005603622 ovn_controller[130109]: 2026-01-31T07:54:16Z|00107|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 31 02:54:16 np0005603622 nova_compute[221301]: 2026-01-31 07:54:16.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:16 np0005603622 nova_compute[221301]: 2026-01-31 07:54:16.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:17.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:17 np0005603622 nova_compute[221301]: 2026-01-31 07:54:17.192 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:17 np0005603622 nova_compute[221301]: 2026-01-31 07:54:17.382 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:18.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.428 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.428 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.429 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.863 221324 INFO nova.virt.libvirt.driver [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Deleting instance files /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9_del#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.865 221324 INFO nova.virt.libvirt.driver [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Deletion of /var/lib/nova/instances/10bcd00c-3fce-4785-94ad-d6dde3c9cee9_del complete#033[00m
Jan 31 02:54:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1201664491' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.900 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.934 221324 INFO nova.compute.manager [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Took 20.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.934 221324 DEBUG oslo.service.loopingcall [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.935 221324 DEBUG nova.compute.manager [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.935 221324 DEBUG nova.network.neutron [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:54:18 np0005603622 nova_compute[221301]: 2026-01-31 07:54:18.990 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Error from libvirt while getting description of instance-0000001a: [Error Code 42] Domain not found: no domain with matching uuid '10bcd00c-3fce-4785-94ad-d6dde3c9cee9' (instance-0000001a): libvirt.libvirtError: Domain not found: no domain with matching uuid '10bcd00c-3fce-4785-94ad-d6dde3c9cee9' (instance-0000001a)#033[00m
Jan 31 02:54:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:19.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.184 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.185 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4876MB free_disk=20.90453338623047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.185 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.186 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.227 221324 DEBUG nova.network.neutron [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.247 221324 DEBUG nova.network.neutron [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.266 221324 INFO nova.compute.manager [-] [instance: 10bcd00c-3fce-4785-94ad-d6dde3c9cee9] Took 0.33 seconds to deallocate network for instance.#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.291 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 10bcd00c-3fce-4785-94ad-d6dde3c9cee9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.292 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.292 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.329 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.341 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:19 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3624675907' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.753 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.760 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.794 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.897 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.898 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:19 np0005603622 nova_compute[221301]: 2026-01-31 07:54:19.898 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:20 np0005603622 nova_compute[221301]: 2026-01-31 07:54:20.000 221324 DEBUG oslo_concurrency.processutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:20.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/757201793' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:20 np0005603622 nova_compute[221301]: 2026-01-31 07:54:20.440 221324 DEBUG oslo_concurrency.processutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:20 np0005603622 nova_compute[221301]: 2026-01-31 07:54:20.448 221324 DEBUG nova.compute.provider_tree [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:54:20 np0005603622 nova_compute[221301]: 2026-01-31 07:54:20.496 221324 DEBUG nova.scheduler.client.report [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:54:20 np0005603622 nova_compute[221301]: 2026-01-31 07:54:20.551 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:20 np0005603622 nova_compute[221301]: 2026-01-31 07:54:20.588 221324 INFO nova.scheduler.client.report [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Deleted allocations for instance 10bcd00c-3fce-4785-94ad-d6dde3c9cee9#033[00m
Jan 31 02:54:20 np0005603622 nova_compute[221301]: 2026-01-31 07:54:20.699 221324 DEBUG oslo_concurrency.lockutils [None req-adb1da43-e999-42f1-a1c8-48e94b65fc90 784fa1eae9554e8ba1780d9a15d86b07 4eaf18b9bc8c4afd86bc06639fac4764 - - default default] Lock "10bcd00c-3fce-4785-94ad-d6dde3c9cee9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 23.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:21.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:22.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:22 np0005603622 nova_compute[221301]: 2026-01-31 07:54:22.196 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:22 np0005603622 nova_compute[221301]: 2026-01-31 07:54:22.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:23.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:24.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:54:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:25.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:54:25 np0005603622 podman[233447]: 2026-01-31 07:54:25.185010115 +0000 UTC m=+0.100297291 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 02:54:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:26.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:27.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:27 np0005603622 nova_compute[221301]: 2026-01-31 07:54:27.200 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:27 np0005603622 nova_compute[221301]: 2026-01-31 07:54:27.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:28.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:54:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:29.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:54:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:30.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:30.847 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:30.848 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:30.848 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:54:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:31.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:54:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:32.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:32 np0005603622 podman[233473]: 2026-01-31 07:54:32.152031825 +0000 UTC m=+0.071281002 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 02:54:32 np0005603622 nova_compute[221301]: 2026-01-31 07:54:32.205 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:32 np0005603622 nova_compute[221301]: 2026-01-31 07:54:32.386 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:33.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:54:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:34.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:54:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:35.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:36.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:54:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:37.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:37 np0005603622 nova_compute[221301]: 2026-01-31 07:54:37.209 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:37 np0005603622 nova_compute[221301]: 2026-01-31 07:54:37.389 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:38.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:54:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:54:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:39.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:40.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:41.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:42.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:42.152 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:54:42 np0005603622 nova_compute[221301]: 2026-01-31 07:54:42.153 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:42.154 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:54:42 np0005603622 nova_compute[221301]: 2026-01-31 07:54:42.253 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:42 np0005603622 nova_compute[221301]: 2026-01-31 07:54:42.391 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:43.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:44.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:45.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:46.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.260000) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086260127, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2259, "num_deletes": 256, "total_data_size": 5674236, "memory_usage": 5744864, "flush_reason": "Manual Compaction"}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086288565, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3661247, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25569, "largest_seqno": 27823, "table_properties": {"data_size": 3651829, "index_size": 5912, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19635, "raw_average_key_size": 20, "raw_value_size": 3632825, "raw_average_value_size": 3741, "num_data_blocks": 261, "num_entries": 971, "num_filter_entries": 971, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769845886, "oldest_key_time": 1769845886, "file_creation_time": 1769846086, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 28625 microseconds, and 6955 cpu microseconds.
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.288642) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3661247 bytes OK
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.288667) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.290866) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.290880) EVENT_LOG_v1 {"time_micros": 1769846086290875, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.290903) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5664071, prev total WAL file size 5664071, number of live WAL files 2.
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.291886) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353032' seq:72057594037927935, type:22 .. '6C6F676D00373533' seq:0, type:0; will stop at (end)
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3575KB)], [51(9063KB)]
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086291977, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 12942780, "oldest_snapshot_seqno": -1}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5438 keys, 12818090 bytes, temperature: kUnknown
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086387310, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 12818090, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12777062, "index_size": 26326, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 136929, "raw_average_key_size": 25, "raw_value_size": 12674787, "raw_average_value_size": 2330, "num_data_blocks": 1085, "num_entries": 5438, "num_filter_entries": 5438, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846086, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.387646) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 12818090 bytes
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.392096) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.8 rd, 134.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.9 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(7.0) write-amplify(3.5) OK, records in: 5971, records dropped: 533 output_compression: NoCompression
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.392112) EVENT_LOG_v1 {"time_micros": 1769846086392104, "job": 30, "event": "compaction_finished", "compaction_time_micros": 95340, "compaction_time_cpu_micros": 24371, "output_level": 6, "num_output_files": 1, "total_output_size": 12818090, "num_input_records": 5971, "num_output_records": 5438, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086392611, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846086393765, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.291785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.393897) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.393909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.393911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.393913) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:54:46.393915) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:54:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:54:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:47.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:54:47 np0005603622 nova_compute[221301]: 2026-01-31 07:54:47.257 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:47 np0005603622 nova_compute[221301]: 2026-01-31 07:54:47.394 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:49.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:50.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:54:50.156 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:54:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:51.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:52.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:52 np0005603622 nova_compute[221301]: 2026-01-31 07:54:52.261 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:52 np0005603622 nova_compute[221301]: 2026-01-31 07:54:52.396 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:53.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:54.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:54:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:55.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:54:55 np0005603622 podman[233673]: 2026-01-31 07:54:55.404428874 +0000 UTC m=+0.135683670 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:54:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:54:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:54:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:56.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:56 np0005603622 nova_compute[221301]: 2026-01-31 07:54:56.280 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "31c35869-1596-49c7-8380-fb1f3716c194" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:56 np0005603622 nova_compute[221301]: 2026-01-31 07:54:56.280 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "31c35869-1596-49c7-8380-fb1f3716c194" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:56 np0005603622 nova_compute[221301]: 2026-01-31 07:54:56.524 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:54:56 np0005603622 nova_compute[221301]: 2026-01-31 07:54:56.771 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:56 np0005603622 nova_compute[221301]: 2026-01-31 07:54:56.772 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:56 np0005603622 nova_compute[221301]: 2026-01-31 07:54:56.786 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:54:56 np0005603622 nova_compute[221301]: 2026-01-31 07:54:56.787 221324 INFO nova.compute.claims [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:54:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:57.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:57 np0005603622 nova_compute[221301]: 2026-01-31 07:54:57.264 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:57 np0005603622 nova_compute[221301]: 2026-01-31 07:54:57.399 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:54:57 np0005603622 nova_compute[221301]: 2026-01-31 07:54:57.442 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:54:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3393888021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:54:57 np0005603622 nova_compute[221301]: 2026-01-31 07:54:57.890 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:57 np0005603622 nova_compute[221301]: 2026-01-31 07:54:57.896 221324 DEBUG nova.compute.provider_tree [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:54:57 np0005603622 nova_compute[221301]: 2026-01-31 07:54:57.946 221324 DEBUG nova.scheduler.client.report [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.022 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.024 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:54:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:54:58.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.174 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.175 221324 DEBUG nova.network.neutron [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.318 221324 INFO nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.359 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:54:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.564 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.566 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.566 221324 INFO nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Creating image(s)#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.605 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.649 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.685 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.690 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.776 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.778 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.778 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.779 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.813 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.818 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 31c35869-1596-49c7-8380-fb1f3716c194_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.841 221324 DEBUG nova.network.neutron [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:54:58 np0005603622 nova_compute[221301]: 2026-01-31 07:54:58.841 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:54:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:54:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:54:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:54:59.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:00.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:00 np0005603622 nova_compute[221301]: 2026-01-31 07:55:00.892 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 31c35869-1596-49c7-8380-fb1f3716c194_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:00 np0005603622 nova_compute[221301]: 2026-01-31 07:55:00.987 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] resizing rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:55:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:01.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.689 221324 DEBUG nova.objects.instance [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'migration_context' on Instance uuid 31c35869-1596-49c7-8380-fb1f3716c194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.769 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.770 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Ensure instance console log exists: /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.770 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.771 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.772 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.774 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.781 221324 WARNING nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.788 221324 DEBUG nova.virt.libvirt.host [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.790 221324 DEBUG nova.virt.libvirt.host [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.794 221324 DEBUG nova.virt.libvirt.host [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.795 221324 DEBUG nova.virt.libvirt.host [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.797 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.797 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.798 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.799 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.799 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.800 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.800 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.800 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.801 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.801 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.802 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.802 221324 DEBUG nova.virt.hardware [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 02:55:01 np0005603622 nova_compute[221301]: 2026-01-31 07:55:01.807 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:55:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:02.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1569674306' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.263 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.294 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.298 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.316 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.400 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/789596162' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.769 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.772 221324 DEBUG nova.objects.instance [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 31c35869-1596-49c7-8380-fb1f3716c194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 02:55:02 np0005603622 nova_compute[221301]: 2026-01-31 07:55:02.964 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <uuid>31c35869-1596-49c7-8380-fb1f3716c194</uuid>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <name>instance-0000001f</name>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1838366758</nova:name>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:55:01</nova:creationTime>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <nova:user uuid="d4307bc8a2224140b78ba248cecefe55">tempest-ServersOnMultiNodesTest-1827677275-project-member</nova:user>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <nova:project uuid="b6dca32431594e2682c5d2acb448bbf4">tempest-ServersOnMultiNodesTest-1827677275</nova:project>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <entry name="serial">31c35869-1596-49c7-8380-fb1f3716c194</entry>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <entry name="uuid">31c35869-1596-49c7-8380-fb1f3716c194</entry>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/31c35869-1596-49c7-8380-fb1f3716c194_disk">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/31c35869-1596-49c7-8380-fb1f3716c194_disk.config">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/console.log" append="off"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:55:02 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:55:02 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:55:02 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:55:02 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 02:55:03 np0005603622 podman[233950]: 2026-01-31 07:55:03.150323903 +0000 UTC m=+0.065425846 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 02:55:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:03.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:03 np0005603622 nova_compute[221301]: 2026-01-31 07:55:03.180 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:55:03 np0005603622 nova_compute[221301]: 2026-01-31 07:55:03.180 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 02:55:03 np0005603622 nova_compute[221301]: 2026-01-31 07:55:03.181 221324 INFO nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Using config drive
Jan 31 02:55:03 np0005603622 nova_compute[221301]: 2026-01-31 07:55:03.208 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:55:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:03 np0005603622 nova_compute[221301]: 2026-01-31 07:55:03.925 221324 INFO nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Creating config drive at /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/disk.config
Jan 31 02:55:03 np0005603622 nova_compute[221301]: 2026-01-31 07:55:03.932 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3_hsxv3c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:55:04 np0005603622 nova_compute[221301]: 2026-01-31 07:55:04.065 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3_hsxv3c" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:55:04 np0005603622 nova_compute[221301]: 2026-01-31 07:55:04.093 221324 DEBUG nova.storage.rbd_utils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 31c35869-1596-49c7-8380-fb1f3716c194_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 02:55:04 np0005603622 nova_compute[221301]: 2026-01-31 07:55:04.098 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/disk.config 31c35869-1596-49c7-8380-fb1f3716c194_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:55:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:04.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:05 np0005603622 nova_compute[221301]: 2026-01-31 07:55:05.646 221324 DEBUG oslo_concurrency.processutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/disk.config 31c35869-1596-49c7-8380-fb1f3716c194_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:55:05 np0005603622 nova_compute[221301]: 2026-01-31 07:55:05.647 221324 INFO nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Deleting local config drive /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194/disk.config because it was imported into RBD.
Jan 31 02:55:05 np0005603622 systemd-machined[190406]: New machine qemu-14-instance-0000001f.
Jan 31 02:55:05 np0005603622 systemd[1]: Started Virtual Machine qemu-14-instance-0000001f.
Jan 31 02:55:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:06.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:06 np0005603622 nova_compute[221301]: 2026-01-31 07:55:06.980 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846106.9800394, 31c35869-1596-49c7-8380-fb1f3716c194 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:55:06 np0005603622 nova_compute[221301]: 2026-01-31 07:55:06.982 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] VM Resumed (Lifecycle Event)
Jan 31 02:55:06 np0005603622 nova_compute[221301]: 2026-01-31 07:55:06.986 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 02:55:06 np0005603622 nova_compute[221301]: 2026-01-31 07:55:06.986 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 02:55:06 np0005603622 nova_compute[221301]: 2026-01-31 07:55:06.991 221324 INFO nova.virt.libvirt.driver [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance spawned successfully.
Jan 31 02:55:06 np0005603622 nova_compute[221301]: 2026-01-31 07:55:06.992 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.100 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.105 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.106 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.106 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.107 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.107 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.107 221324 DEBUG nova.virt.libvirt.driver [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.111 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:55:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:07.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.321 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.403 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.620 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.621 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846106.980256, 31c35869-1596-49c7-8380-fb1f3716c194 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 02:55:07 np0005603622 nova_compute[221301]: 2026-01-31 07:55:07.621 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] VM Started (Lifecycle Event)
Jan 31 02:55:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:08.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:08 np0005603622 nova_compute[221301]: 2026-01-31 07:55:08.342 221324 INFO nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Took 9.78 seconds to spawn the instance on the hypervisor.
Jan 31 02:55:08 np0005603622 nova_compute[221301]: 2026-01-31 07:55:08.345 221324 DEBUG nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:55:08 np0005603622 nova_compute[221301]: 2026-01-31 07:55:08.347 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 02:55:08 np0005603622 nova_compute[221301]: 2026-01-31 07:55:08.356 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 02:55:08 np0005603622 nova_compute[221301]: 2026-01-31 07:55:08.426 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 02:55:08 np0005603622 nova_compute[221301]: 2026-01-31 07:55:08.509 221324 INFO nova.compute.manager [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Took 11.78 seconds to build instance.
Jan 31 02:55:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:08 np0005603622 nova_compute[221301]: 2026-01-31 07:55:08.753 221324 DEBUG oslo_concurrency.lockutils [None req-daaf1168-51ce-4acf-bbec-ac61dfe3c210 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "31c35869-1596-49c7-8380-fb1f3716c194" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:55:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:55:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:09.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.585980) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109586040, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 474, "num_deletes": 251, "total_data_size": 721120, "memory_usage": 731352, "flush_reason": "Manual Compaction"}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109657142, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 476388, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27828, "largest_seqno": 28297, "table_properties": {"data_size": 473718, "index_size": 770, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6271, "raw_average_key_size": 18, "raw_value_size": 468442, "raw_average_value_size": 1415, "num_data_blocks": 34, "num_entries": 331, "num_filter_entries": 331, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846087, "oldest_key_time": 1769846087, "file_creation_time": 1769846109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 71245 microseconds, and 2855 cpu microseconds.
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.657226) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 476388 bytes OK
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.657253) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.692651) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.692711) EVENT_LOG_v1 {"time_micros": 1769846109692699, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.692739) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 718247, prev total WAL file size 718247, number of live WAL files 2.
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.693422) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(465KB)], [54(12MB)]
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109693504, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 13294478, "oldest_snapshot_seqno": -1}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5256 keys, 11325484 bytes, temperature: kUnknown
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109825926, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 11325484, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11286949, "index_size": 24281, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13189, "raw_key_size": 133813, "raw_average_key_size": 25, "raw_value_size": 11189032, "raw_average_value_size": 2128, "num_data_blocks": 995, "num_entries": 5256, "num_filter_entries": 5256, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.826229) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 11325484 bytes
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.830241) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 100.3 rd, 85.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 12.2 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(51.7) write-amplify(23.8) OK, records in: 5769, records dropped: 513 output_compression: NoCompression
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.830307) EVENT_LOG_v1 {"time_micros": 1769846109830269, "job": 32, "event": "compaction_finished", "compaction_time_micros": 132534, "compaction_time_cpu_micros": 20234, "output_level": 6, "num_output_files": 1, "total_output_size": 11325484, "num_input_records": 5769, "num_output_records": 5256, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109830577, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846109831624, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.693289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.831710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.831717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.831718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.831720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:55:09.831722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:55:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:10.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:10 np0005603622 nova_compute[221301]: 2026-01-31 07:55:10.475 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:55:10 np0005603622 nova_compute[221301]: 2026-01-31 07:55:10.476 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 02:55:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:11.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:12.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:12 np0005603622 nova_compute[221301]: 2026-01-31 07:55:12.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:12 np0005603622 nova_compute[221301]: 2026-01-31 07:55:12.405 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:12 np0005603622 nova_compute[221301]: 2026-01-31 07:55:12.426 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:55:12 np0005603622 nova_compute[221301]: 2026-01-31 07:55:12.427 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:55:12 np0005603622 nova_compute[221301]: 2026-01-31 07:55:12.661 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 02:55:13 np0005603622 nova_compute[221301]: 2026-01-31 07:55:13.139 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 02:55:13 np0005603622 nova_compute[221301]: 2026-01-31 07:55:13.140 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 02:55:13 np0005603622 nova_compute[221301]: 2026-01-31 07:55:13.149 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 02:55:13 np0005603622 nova_compute[221301]: 2026-01-31 07:55:13.149 221324 INFO nova.compute.claims [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Claim successful on node compute-1.ctlplane.example.com
Jan 31 02:55:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:13.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:13 np0005603622 nova_compute[221301]: 2026-01-31 07:55:13.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:55:13 np0005603622 nova_compute[221301]: 2026-01-31 07:55:13.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:55:13 np0005603622 nova_compute[221301]: 2026-01-31 07:55:13.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 31 02:55:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:55:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:14.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:55:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:15.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:16.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:16 np0005603622 nova_compute[221301]: 2026-01-31 07:55:16.341 221324 DEBUG nova.scheduler.client.report [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 31 02:55:16 np0005603622 nova_compute[221301]: 2026-01-31 07:55:16.359 221324 DEBUG nova.scheduler.client.report [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 31 02:55:16 np0005603622 nova_compute[221301]: 2026-01-31 07:55:16.360 221324 DEBUG nova.compute.provider_tree [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 31 02:55:16 np0005603622 nova_compute[221301]: 2026-01-31 07:55:16.372 221324 DEBUG nova.scheduler.client.report [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 31 02:55:16 np0005603622 nova_compute[221301]: 2026-01-31 07:55:16.395 221324 DEBUG nova.scheduler.client.report [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 31 02:55:16 np0005603622 nova_compute[221301]: 2026-01-31 07:55:16.450 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 02:55:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3813634236' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.188 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.737s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.193 221324 DEBUG nova.compute.provider_tree [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 02:55:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.343 221324 DEBUG nova.scheduler.client.report [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.369 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.409 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.623 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.624 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.625 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 02:55:17 np0005603622 nova_compute[221301]: 2026-01-31 07:55:17.625 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 02:55:18 np0005603622 nova_compute[221301]: 2026-01-31 07:55:18.132 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 4.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 02:55:18 np0005603622 nova_compute[221301]: 2026-01-31 07:55:18.134 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 02:55:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:18.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:18 np0005603622 nova_compute[221301]: 2026-01-31 07:55:18.252 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 02:55:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:19.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:19 np0005603622 nova_compute[221301]: 2026-01-31 07:55:19.300 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:55:19 np0005603622 nova_compute[221301]: 2026-01-31 07:55:19.301 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:55:19 np0005603622 nova_compute[221301]: 2026-01-31 07:55:19.301 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:55:19 np0005603622 nova_compute[221301]: 2026-01-31 07:55:19.301 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 31c35869-1596-49c7-8380-fb1f3716c194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:20.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:20 np0005603622 nova_compute[221301]: 2026-01-31 07:55:20.376 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:55:20 np0005603622 nova_compute[221301]: 2026-01-31 07:55:20.377 221324 DEBUG nova.network.neutron [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:55:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:21.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:22.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:22 np0005603622 nova_compute[221301]: 2026-01-31 07:55:22.374 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:22 np0005603622 nova_compute[221301]: 2026-01-31 07:55:22.412 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:23.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:24.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:24 np0005603622 nova_compute[221301]: 2026-01-31 07:55:24.315 221324 INFO nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:55:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:25.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:25 np0005603622 nova_compute[221301]: 2026-01-31 07:55:25.490 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:55:25 np0005603622 nova_compute[221301]: 2026-01-31 07:55:25.789 221324 DEBUG nova.policy [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93973daeb08c453e90372a79b54b9ede', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8033316fc42c4926bfd1f8a34b02fa97', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:55:25 np0005603622 nova_compute[221301]: 2026-01-31 07:55:25.994 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.048 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.049 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.050 221324 INFO nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Creating image(s)#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.081 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.112 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.149 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.154 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:26.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:26 np0005603622 podman[234108]: 2026-01-31 07:55:26.181697915 +0000 UTC m=+0.099560321 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.232 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.234 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.235 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.236 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.272 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:26 np0005603622 nova_compute[221301]: 2026-01-31 07:55:26.278 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:55:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:27.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.304 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.376 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.415 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.419 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.542 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] resizing rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.677 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.678 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.679 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.679 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.680 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.680 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:27 np0005603622 nova_compute[221301]: 2026-01-31 07:55:27.680 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.047 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.047 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.048 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.048 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.048 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.147 221324 DEBUG nova.objects.instance [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lazy-loading 'migration_context' on Instance uuid ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:28.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.202 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.203 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Ensure instance console log exists: /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.204 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.204 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.205 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.308 221324 DEBUG nova.network.neutron [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Successfully created port: 91caa716-ddb3-41b4-a4d3-b18e5c81423b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:55:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4072539440' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.513 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.668 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000001f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.669 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000001f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.808 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.810 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4610MB free_disk=20.880840301513672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.810 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.810 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.973 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 31c35869-1596-49c7-8380-fb1f3716c194 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.973 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.974 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:55:28 np0005603622 nova_compute[221301]: 2026-01-31 07:55:28.974 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:55:29 np0005603622 nova_compute[221301]: 2026-01-31 07:55:29.100 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:29.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:55:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2070933811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:55:29 np0005603622 nova_compute[221301]: 2026-01-31 07:55:29.559 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:29 np0005603622 nova_compute[221301]: 2026-01-31 07:55:29.565 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:55:29 np0005603622 nova_compute[221301]: 2026-01-31 07:55:29.751 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:55:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:30.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:30.848 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:30.849 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:30.850 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:31 np0005603622 nova_compute[221301]: 2026-01-31 07:55:31.154 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:55:31 np0005603622 nova_compute[221301]: 2026-01-31 07:55:31.155 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:31 np0005603622 nova_compute[221301]: 2026-01-31 07:55:31.155 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:55:31 np0005603622 nova_compute[221301]: 2026-01-31 07:55:31.155 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 02:55:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:31.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:32.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:32 np0005603622 nova_compute[221301]: 2026-01-31 07:55:32.288 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 02:55:32 np0005603622 nova_compute[221301]: 2026-01-31 07:55:32.380 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:32 np0005603622 nova_compute[221301]: 2026-01-31 07:55:32.417 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:33.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:34 np0005603622 podman[234342]: 2026-01-31 07:55:34.123342153 +0000 UTC m=+0.053761193 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 02:55:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:34.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:35 np0005603622 nova_compute[221301]: 2026-01-31 07:55:35.110 221324 DEBUG nova.network.neutron [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Successfully updated port: 91caa716-ddb3-41b4-a4d3-b18e5c81423b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:55:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:35.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:35 np0005603622 nova_compute[221301]: 2026-01-31 07:55:35.235 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:55:35 np0005603622 nova_compute[221301]: 2026-01-31 07:55:35.235 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquired lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:55:35 np0005603622 nova_compute[221301]: 2026-01-31 07:55:35.235 221324 DEBUG nova.network.neutron [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:55:36 np0005603622 nova_compute[221301]: 2026-01-31 07:55:36.097 221324 DEBUG nova.network.neutron [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:55:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:36.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:36 np0005603622 nova_compute[221301]: 2026-01-31 07:55:36.426 221324 DEBUG nova.compute.manager [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received event network-changed-91caa716-ddb3-41b4-a4d3-b18e5c81423b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:55:36 np0005603622 nova_compute[221301]: 2026-01-31 07:55:36.426 221324 DEBUG nova.compute.manager [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Refreshing instance network info cache due to event network-changed-91caa716-ddb3-41b4-a4d3-b18e5c81423b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:55:36 np0005603622 nova_compute[221301]: 2026-01-31 07:55:36.426 221324 DEBUG oslo_concurrency.lockutils [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:55:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:37.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:37 np0005603622 nova_compute[221301]: 2026-01-31 07:55:37.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:37 np0005603622 nova_compute[221301]: 2026-01-31 07:55:37.419 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:38.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.440 221324 DEBUG nova.network.neutron [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Updating instance_info_cache with network_info: [{"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.961 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Releasing lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.962 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Instance network_info: |[{"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.963 221324 DEBUG oslo_concurrency.lockutils [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.963 221324 DEBUG nova.network.neutron [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Refreshing network info cache for port 91caa716-ddb3-41b4-a4d3-b18e5c81423b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.966 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Start _get_guest_xml network_info=[{"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.970 221324 WARNING nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.975 221324 DEBUG nova.virt.libvirt.host [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.976 221324 DEBUG nova.virt.libvirt.host [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.979 221324 DEBUG nova.virt.libvirt.host [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.979 221324 DEBUG nova.virt.libvirt.host [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.980 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.981 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.981 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.981 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.981 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.982 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.982 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.982 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.982 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.982 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.983 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.983 221324 DEBUG nova.virt.hardware [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:55:38 np0005603622 nova_compute[221301]: 2026-01-31 07:55:38.986 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:39.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/817533676' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:39 np0005603622 nova_compute[221301]: 2026-01-31 07:55:39.593 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:39 np0005603622 nova_compute[221301]: 2026-01-31 07:55:39.630 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:39 np0005603622 nova_compute[221301]: 2026-01-31 07:55:39.638 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:55:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3992097462' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.066 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.068 221324 DEBUG nova.virt.libvirt.vif [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:55:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1694929068',display_name='tempest-ServersAdminTestJSON-server-1694929068',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1694929068',id=33,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8033316fc42c4926bfd1f8a34b02fa97',ramdisk_id='',reservation_id='r-vbdmhf1d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-784933461',owner_user_name='tempest-ServersAdminTestJSON-784933461-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:55:25Z,user_data=None,user_id='93973daeb08c453e90372a79b54b9ede',uuid=ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.069 221324 DEBUG nova.network.os_vif_util [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converting VIF {"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.070 221324 DEBUG nova.network.os_vif_util [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ac:eb,bridge_name='br-int',has_traffic_filtering=True,id=91caa716-ddb3-41b4-a4d3-b18e5c81423b,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91caa716-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.072 221324 DEBUG nova.objects.instance [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lazy-loading 'pci_devices' on Instance uuid ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:55:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:40.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.464 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <uuid>ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0</uuid>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <name>instance-00000021</name>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServersAdminTestJSON-server-1694929068</nova:name>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:55:38</nova:creationTime>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:user uuid="93973daeb08c453e90372a79b54b9ede">tempest-ServersAdminTestJSON-784933461-project-member</nova:user>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:project uuid="8033316fc42c4926bfd1f8a34b02fa97">tempest-ServersAdminTestJSON-784933461</nova:project>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <nova:port uuid="91caa716-ddb3-41b4-a4d3-b18e5c81423b">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <entry name="serial">ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0</entry>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <entry name="uuid">ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0</entry>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk.config">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:7c:ac:eb"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <target dev="tap91caa716-dd"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/console.log" append="off"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:55:40 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:55:40 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:55:40 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:55:40 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.466 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Preparing to wait for external event network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.466 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.467 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.468 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.470 221324 DEBUG nova.virt.libvirt.vif [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:55:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1694929068',display_name='tempest-ServersAdminTestJSON-server-1694929068',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1694929068',id=33,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8033316fc42c4926bfd1f8a34b02fa97',ramdisk_id='',reservation_id='r-vbdmhf1d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-784933461',owner_user_name='tempest-ServersAdminTestJSON-784933461-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:55:25Z,user_data=None,user_id='93973daeb08c453e90372a79b54b9ede',uuid=ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.471 221324 DEBUG nova.network.os_vif_util [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converting VIF {"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.472 221324 DEBUG nova.network.os_vif_util [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ac:eb,bridge_name='br-int',has_traffic_filtering=True,id=91caa716-ddb3-41b4-a4d3-b18e5c81423b,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91caa716-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.472 221324 DEBUG os_vif [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ac:eb,bridge_name='br-int',has_traffic_filtering=True,id=91caa716-ddb3-41b4-a4d3-b18e5c81423b,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91caa716-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.474 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.474 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.480 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91caa716-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.480 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap91caa716-dd, col_values=(('external_ids', {'iface-id': '91caa716-ddb3-41b4-a4d3-b18e5c81423b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:ac:eb', 'vm-uuid': 'ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.482 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.485 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:55:40 np0005603622 NetworkManager[49080]: <info>  [1769846140.4862] manager: (tap91caa716-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.493 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.496 221324 INFO os_vif [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ac:eb,bridge_name='br-int',has_traffic_filtering=True,id=91caa716-ddb3-41b4-a4d3-b18e5c81423b,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91caa716-dd')#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.733 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.734 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.734 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] No VIF found with MAC fa:16:3e:7c:ac:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.734 221324 INFO nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Using config drive#033[00m
Jan 31 02:55:40 np0005603622 nova_compute[221301]: 2026-01-31 07:55:40.757 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:55:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:41.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.296 221324 INFO nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Creating config drive at /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/disk.config#033[00m
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.301 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvuu4lvh5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.428 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvuu4lvh5" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.463 221324 DEBUG nova.storage.rbd_utils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] rbd image ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.467 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/disk.config ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.685 221324 DEBUG nova.network.neutron [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Updated VIF entry in instance network info cache for port 91caa716-ddb3-41b4-a4d3-b18e5c81423b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.686 221324 DEBUG nova.network.neutron [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Updating instance_info_cache with network_info: [{"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:55:41 np0005603622 nova_compute[221301]: 2026-01-31 07:55:41.707 221324 DEBUG oslo_concurrency.lockutils [req-e1d4552d-bb3a-4ded-9a3c-98d3f79b88c3 req-ee1348f5-307c-42ff-a8f6-94fdd5e062f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:55:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:42.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:42 np0005603622 nova_compute[221301]: 2026-01-31 07:55:42.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:43.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:44.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:45.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:45 np0005603622 nova_compute[221301]: 2026-01-31 07:55:45.482 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:46.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:47.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:47 np0005603622 nova_compute[221301]: 2026-01-31 07:55:47.258 221324 DEBUG oslo_concurrency.processutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/disk.config ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.791s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:55:47 np0005603622 nova_compute[221301]: 2026-01-31 07:55:47.259 221324 INFO nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Deleting local config drive /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0/disk.config because it was imported into RBD.#033[00m
Jan 31 02:55:47 np0005603622 kernel: tap91caa716-dd: entered promiscuous mode
Jan 31 02:55:47 np0005603622 NetworkManager[49080]: <info>  [1769846147.3335] manager: (tap91caa716-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Jan 31 02:55:47 np0005603622 nova_compute[221301]: 2026-01-31 07:55:47.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:47 np0005603622 ovn_controller[130109]: 2026-01-31T07:55:47Z|00108|binding|INFO|Claiming lport 91caa716-ddb3-41b4-a4d3-b18e5c81423b for this chassis.
Jan 31 02:55:47 np0005603622 ovn_controller[130109]: 2026-01-31T07:55:47Z|00109|binding|INFO|91caa716-ddb3-41b4-a4d3-b18e5c81423b: Claiming fa:16:3e:7c:ac:eb 10.100.0.7
Jan 31 02:55:47 np0005603622 nova_compute[221301]: 2026-01-31 07:55:47.336 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:47 np0005603622 nova_compute[221301]: 2026-01-31 07:55:47.359 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:47 np0005603622 ovn_controller[130109]: 2026-01-31T07:55:47Z|00110|binding|INFO|Setting lport 91caa716-ddb3-41b4-a4d3-b18e5c81423b ovn-installed in OVS
Jan 31 02:55:47 np0005603622 nova_compute[221301]: 2026-01-31 07:55:47.401 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:47 np0005603622 systemd-machined[190406]: New machine qemu-15-instance-00000021.
Jan 31 02:55:47 np0005603622 systemd-udevd[234496]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:55:47 np0005603622 systemd[1]: Started Virtual Machine qemu-15-instance-00000021.
Jan 31 02:55:47 np0005603622 NetworkManager[49080]: <info>  [1769846147.4206] device (tap91caa716-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:55:47 np0005603622 NetworkManager[49080]: <info>  [1769846147.4214] device (tap91caa716-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:55:47 np0005603622 nova_compute[221301]: 2026-01-31 07:55:47.424 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:48.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:48 np0005603622 nova_compute[221301]: 2026-01-31 07:55:48.291 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846148.2902656, ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:48 np0005603622 nova_compute[221301]: 2026-01-31 07:55:48.292 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] VM Started (Lifecycle Event)#033[00m
Jan 31 02:55:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:55:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:49.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:55:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e158 e158: 3 total, 3 up, 3 in
Jan 31 02:55:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:50.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:50 np0005603622 nova_compute[221301]: 2026-01-31 07:55:50.484 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:50 np0005603622 ovn_controller[130109]: 2026-01-31T07:55:50Z|00111|binding|INFO|Setting lport 91caa716-ddb3-41b4-a4d3-b18e5c81423b up in Southbound
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.635 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:ac:eb 10.100.0.7'], port_security=['fa:16:3e:7c:ac:eb 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c58eaedf-202a-428a-acfb-f0b1291517f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8033316fc42c4926bfd1f8a34b02fa97', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4b3d9baf-bd3e-457e-a5c2-9addbc71d588', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189b55ef-8e14-4c6c-870a-5dba85715c4a, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=91caa716-ddb3-41b4-a4d3-b18e5c81423b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.638 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 91caa716-ddb3-41b4-a4d3-b18e5c81423b in datapath c58eaedf-202a-428a-acfb-f0b1291517f1 bound to our chassis#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.640 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c58eaedf-202a-428a-acfb-f0b1291517f1#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.655 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ddef8a98-7276-499f-835f-27534c4f3d0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.657 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc58eaedf-21 in ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.660 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc58eaedf-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.660 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3b1072-91c7-4a6b-8b2c-a2717d1bf925]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.661 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[439fc0d5-212f-4903-8884-d4904787a951]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.674 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[32d2252e-7c87-4f9f-8be9-51dfca086b43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.689 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8114c582-061a-429a-bcc0-a8b4b8339a9a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.723 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[080ca47f-34ca-4fa4-b346-265eb5067720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.730 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[22bb4872-7ce8-41bf-9cc3-fe40a8940f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 NetworkManager[49080]: <info>  [1769846150.7317] manager: (tapc58eaedf-20): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Jan 31 02:55:50 np0005603622 systemd-udevd[234554]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.758 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7e6fca-49d7-4b31-ad1b-5c4f9d9cc851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.762 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[76e5092d-7352-491d-b83c-e15afa299e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 NetworkManager[49080]: <info>  [1769846150.7851] device (tapc58eaedf-20): carrier: link connected
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.788 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6db8332f-19ae-4c9f-852e-e8bfcbb5f53b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.806 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a3bcae7d-506a-455c-80bc-0a8cfba2aded]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc58eaedf-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:11:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540212, 'reachable_time': 36725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234573, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.823 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f31a5117-59b6-4fde-ad0b-ca46fe67ed0b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:11bf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540212, 'tstamp': 540212}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234574, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.837 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[42477120-6578-44cb-8a93-d7b139381163]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc58eaedf-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:11:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540212, 'reachable_time': 36725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234575, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.867 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f0dc9f9c-1552-41b1-a402-691f60e6b182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.922 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ff63340c-8184-4839-b8f0-92545aba57ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.924 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc58eaedf-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.924 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.925 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc58eaedf-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:50 np0005603622 nova_compute[221301]: 2026-01-31 07:55:50.957 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:50 np0005603622 NetworkManager[49080]: <info>  [1769846150.9581] manager: (tapc58eaedf-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 31 02:55:50 np0005603622 kernel: tapc58eaedf-20: entered promiscuous mode
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.961 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc58eaedf-20, col_values=(('external_ids', {'iface-id': '8c531a0f-deeb-4de0-880b-b07ec1cf9103'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:55:50 np0005603622 ovn_controller[130109]: 2026-01-31T07:55:50Z|00112|binding|INFO|Releasing lport 8c531a0f-deeb-4de0-880b-b07ec1cf9103 from this chassis (sb_readonly=1)
Jan 31 02:55:50 np0005603622 nova_compute[221301]: 2026-01-31 07:55:50.962 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.964 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c58eaedf-202a-428a-acfb-f0b1291517f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c58eaedf-202a-428a-acfb-f0b1291517f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.965 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc3b48e-304d-43b9-8e71-be1fe7897926]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.965 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c58eaedf-202a-428a-acfb-f0b1291517f1
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c58eaedf-202a-428a-acfb-f0b1291517f1.pid.haproxy
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c58eaedf-202a-428a-acfb-f0b1291517f1
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:55:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:55:50.967 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'env', 'PROCESS_TAG=haproxy-c58eaedf-202a-428a-acfb-f0b1291517f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c58eaedf-202a-428a-acfb-f0b1291517f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:55:50 np0005603622 nova_compute[221301]: 2026-01-31 07:55:50.967 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:51.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:51 np0005603622 podman[234607]: 2026-01-31 07:55:51.284312253 +0000 UTC m=+0.022354080 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:55:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:55:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:52.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:55:52 np0005603622 podman[234607]: 2026-01-31 07:55:52.232995014 +0000 UTC m=+0.971036791 container create b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:55:52 np0005603622 nova_compute[221301]: 2026-01-31 07:55:52.428 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:52 np0005603622 nova_compute[221301]: 2026-01-31 07:55:52.446 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:52 np0005603622 nova_compute[221301]: 2026-01-31 07:55:52.454 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846148.2917757, ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:52 np0005603622 nova_compute[221301]: 2026-01-31 07:55:52.455 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:55:52 np0005603622 systemd[1]: Started libpod-conmon-b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e.scope.
Jan 31 02:55:52 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:55:52 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114b941803498188fc44ea3ea8c9fe24bd2faabda83d378c8de26e2bc5445907/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:55:52 np0005603622 podman[234607]: 2026-01-31 07:55:52.773914081 +0000 UTC m=+1.511955908 container init b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 02:55:52 np0005603622 podman[234607]: 2026-01-31 07:55:52.781523155 +0000 UTC m=+1.519564942 container start b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:55:52 np0005603622 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[234623]: [NOTICE]   (234627) : New worker (234629) forked
Jan 31 02:55:52 np0005603622 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[234623]: [NOTICE]   (234627) : Loading success.
Jan 31 02:55:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:53.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:54.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:54 np0005603622 nova_compute[221301]: 2026-01-31 07:55:54.673 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:54 np0005603622 nova_compute[221301]: 2026-01-31 07:55:54.677 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:55:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:55.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:55 np0005603622 nova_compute[221301]: 2026-01-31 07:55:55.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:56.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.269 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.683 221324 DEBUG nova.compute.manager [req-b75b4906-8c6a-49d7-bbc7-7de30b434931 req-2a7cab17-ec1f-4cf1-a3e8-9cf42b2907a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received event network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.684 221324 DEBUG oslo_concurrency.lockutils [req-b75b4906-8c6a-49d7-bbc7-7de30b434931 req-2a7cab17-ec1f-4cf1-a3e8-9cf42b2907a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.685 221324 DEBUG oslo_concurrency.lockutils [req-b75b4906-8c6a-49d7-bbc7-7de30b434931 req-2a7cab17-ec1f-4cf1-a3e8-9cf42b2907a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.685 221324 DEBUG oslo_concurrency.lockutils [req-b75b4906-8c6a-49d7-bbc7-7de30b434931 req-2a7cab17-ec1f-4cf1-a3e8-9cf42b2907a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.686 221324 DEBUG nova.compute.manager [req-b75b4906-8c6a-49d7-bbc7-7de30b434931 req-2a7cab17-ec1f-4cf1-a3e8-9cf42b2907a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Processing event network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.687 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.694 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846156.6938212, ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.695 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.699 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.709 221324 INFO nova.virt.libvirt.driver [-] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Instance spawned successfully.#033[00m
Jan 31 02:55:56 np0005603622 nova_compute[221301]: 2026-01-31 07:55:56.709 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:55:56 np0005603622 podman[234810]: 2026-01-31 07:55:56.904081993 +0000 UTC m=+0.831549742 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 31 02:55:56 np0005603622 podman[234810]: 2026-01-31 07:55:56.99794659 +0000 UTC m=+0.925414289 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 31 02:55:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:57.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:57 np0005603622 nova_compute[221301]: 2026-01-31 07:55:57.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:55:57 np0005603622 podman[234841]: 2026-01-31 07:55:57.711669841 +0000 UTC m=+0.630238003 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 02:55:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:55:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:55:58.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.061 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.066 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.067 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.067 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.068 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.068 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.068 221324 DEBUG nova.virt.libvirt.driver [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.077 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:55:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:55:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:55:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:55:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:55:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:55:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:55:59.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:55:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:55:59 np0005603622 nova_compute[221301]: 2026-01-31 07:55:59.857 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.063 221324 INFO nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Took 34.02 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.064 221324 DEBUG nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:00.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.220 221324 INFO nova.compute.manager [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Took 47.16 seconds to build instance.#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.305 221324 DEBUG oslo_concurrency.lockutils [None req-4fc2110c-8abe-4b2b-9a17-6a8d240f1742 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 47.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.386 221324 DEBUG nova.compute.manager [req-c1ade783-40fe-4d22-9885-c9484a5ce5d7 req-a7ec1893-c7ff-49e5-b247-b2b02325a5c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received event network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.387 221324 DEBUG oslo_concurrency.lockutils [req-c1ade783-40fe-4d22-9885-c9484a5ce5d7 req-a7ec1893-c7ff-49e5-b247-b2b02325a5c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.387 221324 DEBUG oslo_concurrency.lockutils [req-c1ade783-40fe-4d22-9885-c9484a5ce5d7 req-a7ec1893-c7ff-49e5-b247-b2b02325a5c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.387 221324 DEBUG oslo_concurrency.lockutils [req-c1ade783-40fe-4d22-9885-c9484a5ce5d7 req-a7ec1893-c7ff-49e5-b247-b2b02325a5c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.388 221324 DEBUG nova.compute.manager [req-c1ade783-40fe-4d22-9885-c9484a5ce5d7 req-a7ec1893-c7ff-49e5-b247-b2b02325a5c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] No waiting events found dispatching network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.388 221324 WARNING nova.compute.manager [req-c1ade783-40fe-4d22-9885-c9484a5ce5d7 req-a7ec1893-c7ff-49e5-b247-b2b02325a5c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received unexpected event network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b for instance with vm_state active and task_state None.#033[00m
Jan 31 02:56:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:56:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:56:00 np0005603622 nova_compute[221301]: 2026-01-31 07:56:00.488 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:01.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:02.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:02 np0005603622 nova_compute[221301]: 2026-01-31 07:56:02.431 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:03.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:04.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:05 np0005603622 podman[235087]: 2026-01-31 07:56:05.150548746 +0000 UTC m=+0.064985694 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 02:56:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:05.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:05 np0005603622 nova_compute[221301]: 2026-01-31 07:56:05.491 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:06.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:07.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:07 np0005603622 nova_compute[221301]: 2026-01-31 07:56:07.438 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:08.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:09.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:10.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:10 np0005603622 nova_compute[221301]: 2026-01-31 07:56:10.497 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:11.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:56:11.460 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:56:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:56:11.463 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.707 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.707 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.817 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.958 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.959 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.968 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:56:11 np0005603622 nova_compute[221301]: 2026-01-31 07:56:11.968 221324 INFO nova.compute.claims [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:56:12 np0005603622 nova_compute[221301]: 2026-01-31 07:56:12.204 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:56:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:56:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:12.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:12 np0005603622 nova_compute[221301]: 2026-01-31 07:56:12.482 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3295373078' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:12 np0005603622 nova_compute[221301]: 2026-01-31 07:56:12.989 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.784s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:12 np0005603622 nova_compute[221301]: 2026-01-31 07:56:12.995 221324 DEBUG nova.compute.provider_tree [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:56:13 np0005603622 nova_compute[221301]: 2026-01-31 07:56:13.019 221324 DEBUG nova.scheduler.client.report [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:56:13 np0005603622 nova_compute[221301]: 2026-01-31 07:56:13.130 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:13 np0005603622 nova_compute[221301]: 2026-01-31 07:56:13.167 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "84b46624-4c75-43ae-90e4-00a0a374729b" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:13 np0005603622 nova_compute[221301]: 2026-01-31 07:56:13.168 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "84b46624-4c75-43ae-90e4-00a0a374729b" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:13 np0005603622 nova_compute[221301]: 2026-01-31 07:56:13.252 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] No node specified, defaulting to compute-1.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Jan 31 02:56:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:13.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:56:13Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:ac:eb 10.100.0.7
Jan 31 02:56:13 np0005603622 ovn_controller[130109]: 2026-01-31T07:56:13Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:ac:eb 10.100.0.7
Jan 31 02:56:13 np0005603622 nova_compute[221301]: 2026-01-31 07:56:13.722 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "84b46624-4c75-43ae-90e4-00a0a374729b" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:13 np0005603622 nova_compute[221301]: 2026-01-31 07:56:13.723 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:56:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:14.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:14 np0005603622 nova_compute[221301]: 2026-01-31 07:56:14.678 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:56:14 np0005603622 nova_compute[221301]: 2026-01-31 07:56:14.679 221324 DEBUG nova.network.neutron [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:56:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:14 np0005603622 nova_compute[221301]: 2026-01-31 07:56:14.849 221324 INFO nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:56:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e159 e159: 3 total, 3 up, 3 in
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.003 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.004 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.029 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:56:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:15.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.499 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.685 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.687 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.687 221324 INFO nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Creating image(s)#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.724 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.762 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.793 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.799 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.852 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.853 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.854 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.854 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.884 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:15 np0005603622 nova_compute[221301]: 2026-01-31 07:56:15.888 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:16 np0005603622 nova_compute[221301]: 2026-01-31 07:56:16.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:16 np0005603622 nova_compute[221301]: 2026-01-31 07:56:16.608 221324 DEBUG nova.network.neutron [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 31 02:56:16 np0005603622 nova_compute[221301]: 2026-01-31 07:56:16.609 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:56:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:56:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:17.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:56:17 np0005603622 nova_compute[221301]: 2026-01-31 07:56:17.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:17 np0005603622 nova_compute[221301]: 2026-01-31 07:56:17.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:17 np0005603622 nova_compute[221301]: 2026-01-31 07:56:17.485 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:56:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:18.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.451 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.451 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.452 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.515 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.757 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.758 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.758 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:56:18 np0005603622 nova_compute[221301]: 2026-01-31 07:56:18.758 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 31c35869-1596-49c7-8380-fb1f3716c194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:19 np0005603622 nova_compute[221301]: 2026-01-31 07:56:19.086 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:56:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:19.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:20 np0005603622 nova_compute[221301]: 2026-01-31 07:56:20.150 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:20 np0005603622 nova_compute[221301]: 2026-01-31 07:56:20.209 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:20 np0005603622 nova_compute[221301]: 2026-01-31 07:56:20.210 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:56:20 np0005603622 nova_compute[221301]: 2026-01-31 07:56:20.211 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:20.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:20 np0005603622 nova_compute[221301]: 2026-01-31 07:56:20.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:56:20.468 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:56:20 np0005603622 nova_compute[221301]: 2026-01-31 07:56:20.502 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:21.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.439 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.440 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.440 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.440 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.441 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4008894624' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.905 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.988 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000001f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.989 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000001f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.995 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:21 np0005603622 nova_compute[221301]: 2026-01-31 07:56:21.996 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.202 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.203 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4356MB free_disk=20.671531677246094GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.204 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.205 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:22.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.315 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 31c35869-1596-49c7-8380-fb1f3716c194 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.316 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.316 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 2f07ca13-ade3-4c0b-af0b-2f1c713d2842 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.316 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.317 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.382 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/532425287' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.882 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.888 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.929 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.990 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:56:22 np0005603622 nova_compute[221301]: 2026-01-31 07:56:22.991 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:23 np0005603622 nova_compute[221301]: 2026-01-31 07:56:23.179 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 7.290s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:23.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:23 np0005603622 nova_compute[221301]: 2026-01-31 07:56:23.391 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] resizing rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:56:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:24.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.553 221324 DEBUG nova.objects.instance [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f07ca13-ade3-4c0b-af0b-2f1c713d2842 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.580 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.581 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Ensure instance console log exists: /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.582 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.583 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.583 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.587 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.594 221324 WARNING nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.600 221324 DEBUG nova.virt.libvirt.host [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.601 221324 DEBUG nova.virt.libvirt.host [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.604 221324 DEBUG nova.virt.libvirt.host [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.605 221324 DEBUG nova.virt.libvirt.host [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.606 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.607 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.608 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.608 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.608 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.609 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.609 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.609 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.610 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.610 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.610 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.610 221324 DEBUG nova.virt.hardware [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:56:24 np0005603622 nova_compute[221301]: 2026-01-31 07:56:24.614 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:56:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1963806761' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.099 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.131 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.136 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:25.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:56:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1190899947' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.551 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.554 221324 DEBUG nova.objects.instance [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f07ca13-ade3-4c0b-af0b-2f1c713d2842 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.596 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <uuid>2f07ca13-ade3-4c0b-af0b-2f1c713d2842</uuid>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <name>instance-00000026</name>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServersOnMultiNodesTest-server-2015351071-1</nova:name>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:56:24</nova:creationTime>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <nova:user uuid="d4307bc8a2224140b78ba248cecefe55">tempest-ServersOnMultiNodesTest-1827677275-project-member</nova:user>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <nova:project uuid="b6dca32431594e2682c5d2acb448bbf4">tempest-ServersOnMultiNodesTest-1827677275</nova:project>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <entry name="serial">2f07ca13-ade3-4c0b-af0b-2f1c713d2842</entry>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <entry name="uuid">2f07ca13-ade3-4c0b-af0b-2f1c713d2842</entry>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk.config">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/console.log" append="off"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:56:25 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:56:25 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:56:25 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:56:25 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.809 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.810 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.811 221324 INFO nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Using config drive#033[00m
Jan 31 02:56:25 np0005603622 nova_compute[221301]: 2026-01-31 07:56:25.843 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 e160: 3 total, 3 up, 3 in
Jan 31 02:56:26 np0005603622 nova_compute[221301]: 2026-01-31 07:56:26.143 221324 INFO nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Creating config drive at /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/disk.config#033[00m
Jan 31 02:56:26 np0005603622 nova_compute[221301]: 2026-01-31 07:56:26.149 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpd8ia1hiz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:26.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:26 np0005603622 nova_compute[221301]: 2026-01-31 07:56:26.279 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpd8ia1hiz" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:26 np0005603622 nova_compute[221301]: 2026-01-31 07:56:26.307 221324 DEBUG nova.storage.rbd_utils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] rbd image 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:26 np0005603622 nova_compute[221301]: 2026-01-31 07:56:26.312 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/disk.config 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:56:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:27.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:56:27 np0005603622 nova_compute[221301]: 2026-01-31 07:56:27.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:28 np0005603622 podman[235510]: 2026-01-31 07:56:28.212861146 +0000 UTC m=+0.135115054 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 31 02:56:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:28.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:29.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 02:56:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:30.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 02:56:30 np0005603622 nova_compute[221301]: 2026-01-31 07:56:30.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:30 np0005603622 nova_compute[221301]: 2026-01-31 07:56:30.629 221324 DEBUG oslo_concurrency.processutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/disk.config 2f07ca13-ade3-4c0b-af0b-2f1c713d2842_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.318s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:30 np0005603622 nova_compute[221301]: 2026-01-31 07:56:30.630 221324 INFO nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Deleting local config drive /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842/disk.config because it was imported into RBD.#033[00m
Jan 31 02:56:30 np0005603622 systemd-machined[190406]: New machine qemu-16-instance-00000026.
Jan 31 02:56:30 np0005603622 systemd[1]: Started Virtual Machine qemu-16-instance-00000026.
Jan 31 02:56:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:56:30.850 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:56:30.851 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:56:30.852 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:31.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:32.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.515 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.527 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846192.527168, 2f07ca13-ade3-4c0b-af0b-2f1c713d2842 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.528 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.531 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.533 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.538 221324 INFO nova.virt.libvirt.driver [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Instance spawned successfully.#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.538 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.668 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.673 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.673 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.674 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.674 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.674 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.675 221324 DEBUG nova.virt.libvirt.driver [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.679 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.810 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.811 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846192.5286484, 2f07ca13-ade3-4c0b-af0b-2f1c713d2842 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:32 np0005603622 nova_compute[221301]: 2026-01-31 07:56:32.811 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] VM Started (Lifecycle Event)#033[00m
Jan 31 02:56:33 np0005603622 nova_compute[221301]: 2026-01-31 07:56:33.090 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:33 np0005603622 nova_compute[221301]: 2026-01-31 07:56:33.095 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:56:33 np0005603622 nova_compute[221301]: 2026-01-31 07:56:33.267 221324 INFO nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Took 17.58 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:56:33 np0005603622 nova_compute[221301]: 2026-01-31 07:56:33.268 221324 DEBUG nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:33 np0005603622 nova_compute[221301]: 2026-01-31 07:56:33.284 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:56:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:33.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:34 np0005603622 nova_compute[221301]: 2026-01-31 07:56:34.153 221324 INFO nova.compute.manager [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Took 22.23 seconds to build instance.#033[00m
Jan 31 02:56:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:56:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:34.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:56:34 np0005603622 nova_compute[221301]: 2026-01-31 07:56:34.381 221324 DEBUG oslo_concurrency.lockutils [None req-9b87d736-98a6-4e09-9cde-6b1f1a69c0b2 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:35.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:35 np0005603622 nova_compute[221301]: 2026-01-31 07:56:35.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:36 np0005603622 podman[235595]: 2026-01-31 07:56:36.162884619 +0000 UTC m=+0.063356140 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:56:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:36.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:56:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:37.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:56:37 np0005603622 nova_compute[221301]: 2026-01-31 07:56:37.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.158 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.159 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.160 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.161 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.161 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.164 221324 INFO nova.compute.manager [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Terminating instance#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.166 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "refresh_cache-2f07ca13-ade3-4c0b-af0b-2f1c713d2842" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.167 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquired lock "refresh_cache-2f07ca13-ade3-4c0b-af0b-2f1c713d2842" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.167 221324 DEBUG nova.network.neutron [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:56:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:38.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:38 np0005603622 nova_compute[221301]: 2026-01-31 07:56:38.488 221324 DEBUG nova.network.neutron [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:56:39 np0005603622 nova_compute[221301]: 2026-01-31 07:56:39.121 221324 DEBUG nova.network.neutron [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:39 np0005603622 nova_compute[221301]: 2026-01-31 07:56:39.167 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Releasing lock "refresh_cache-2f07ca13-ade3-4c0b-af0b-2f1c713d2842" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:56:39 np0005603622 nova_compute[221301]: 2026-01-31 07:56:39.168 221324 DEBUG nova.compute.manager [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:56:39 np0005603622 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000026.scope: Deactivated successfully.
Jan 31 02:56:39 np0005603622 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000026.scope: Consumed 7.957s CPU time.
Jan 31 02:56:39 np0005603622 systemd-machined[190406]: Machine qemu-16-instance-00000026 terminated.
Jan 31 02:56:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:39.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:39 np0005603622 nova_compute[221301]: 2026-01-31 07:56:39.394 221324 INFO nova.virt.libvirt.driver [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Instance destroyed successfully.#033[00m
Jan 31 02:56:39 np0005603622 nova_compute[221301]: 2026-01-31 07:56:39.395 221324 DEBUG nova.objects.instance [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'resources' on Instance uuid 2f07ca13-ade3-4c0b-af0b-2f1c713d2842 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:56:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:40.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.442 221324 INFO nova.virt.libvirt.driver [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Deleting instance files /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842_del#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.443 221324 INFO nova.virt.libvirt.driver [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Deletion of /var/lib/nova/instances/2f07ca13-ade3-4c0b-af0b-2f1c713d2842_del complete#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.612 221324 INFO nova.compute.manager [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Took 1.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.613 221324 DEBUG oslo.service.loopingcall [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.613 221324 DEBUG nova.compute.manager [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.613 221324 DEBUG nova.network.neutron [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.785 221324 DEBUG nova.network.neutron [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.822 221324 DEBUG nova.network.neutron [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:56:40 np0005603622 nova_compute[221301]: 2026-01-31 07:56:40.911 221324 INFO nova.compute.manager [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Took 0.30 seconds to deallocate network for instance.#033[00m
Jan 31 02:56:41 np0005603622 nova_compute[221301]: 2026-01-31 07:56:41.108 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:41 np0005603622 nova_compute[221301]: 2026-01-31 07:56:41.109 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:41 np0005603622 nova_compute[221301]: 2026-01-31 07:56:41.193 221324 DEBUG oslo_concurrency.processutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:41.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1649090312' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:41 np0005603622 nova_compute[221301]: 2026-01-31 07:56:41.615 221324 DEBUG oslo_concurrency.processutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:41 np0005603622 nova_compute[221301]: 2026-01-31 07:56:41.622 221324 DEBUG nova.compute.provider_tree [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:56:41 np0005603622 nova_compute[221301]: 2026-01-31 07:56:41.730 221324 DEBUG nova.scheduler.client.report [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:56:41 np0005603622 nova_compute[221301]: 2026-01-31 07:56:41.897 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.788s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:42 np0005603622 nova_compute[221301]: 2026-01-31 07:56:42.066 221324 INFO nova.scheduler.client.report [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Deleted allocations for instance 2f07ca13-ade3-4c0b-af0b-2f1c713d2842#033[00m
Jan 31 02:56:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:42.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:42 np0005603622 nova_compute[221301]: 2026-01-31 07:56:42.324 221324 DEBUG oslo_concurrency.lockutils [None req-c2178e9b-d95f-4383-a66c-a733be3fe159 d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "2f07ca13-ade3-4c0b-af0b-2f1c713d2842" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:42 np0005603622 nova_compute[221301]: 2026-01-31 07:56:42.520 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:43.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:44.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:45.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:45 np0005603622 nova_compute[221301]: 2026-01-31 07:56:45.547 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.004000107s ======
Jan 31 02:56:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:46.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000107s
Jan 31 02:56:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:56:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:47.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:56:47 np0005603622 nova_compute[221301]: 2026-01-31 07:56:47.525 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:48.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:49.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:50.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:50 np0005603622 nova_compute[221301]: 2026-01-31 07:56:50.550 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:51.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:52.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:52 np0005603622 nova_compute[221301]: 2026-01-31 07:56:52.528 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:53.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:56:54 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1402694343' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:56:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:54.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:54 np0005603622 nova_compute[221301]: 2026-01-31 07:56:54.392 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846199.3913023, 2f07ca13-ade3-4c0b-af0b-2f1c713d2842 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:56:54 np0005603622 nova_compute[221301]: 2026-01-31 07:56:54.393 221324 INFO nova.compute.manager [-] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:56:54 np0005603622 nova_compute[221301]: 2026-01-31 07:56:54.809 221324 DEBUG nova.compute.manager [None req-caa36ac3-b0b2-4b9c-947e-adfd28aa3555 - - - - - -] [instance: 2f07ca13-ade3-4c0b-af0b-2f1c713d2842] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:56:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:55.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:55 np0005603622 nova_compute[221301]: 2026-01-31 07:56:55.555 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:55 np0005603622 nova_compute[221301]: 2026-01-31 07:56:55.556 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:55 np0005603622 nova_compute[221301]: 2026-01-31 07:56:55.557 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:55 np0005603622 nova_compute[221301]: 2026-01-31 07:56:55.783 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:56:56 np0005603622 nova_compute[221301]: 2026-01-31 07:56:56.223 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:56 np0005603622 nova_compute[221301]: 2026-01-31 07:56:56.223 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:56 np0005603622 nova_compute[221301]: 2026-01-31 07:56:56.231 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:56:56 np0005603622 nova_compute[221301]: 2026-01-31 07:56:56.232 221324 INFO nova.compute.claims [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:56:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:56.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:56 np0005603622 nova_compute[221301]: 2026-01-31 07:56:56.968 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:56:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:57.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:56:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:56:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1050248075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:56:57 np0005603622 nova_compute[221301]: 2026-01-31 07:56:57.483 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:57 np0005603622 nova_compute[221301]: 2026-01-31 07:56:57.489 221324 DEBUG nova.compute.provider_tree [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:56:57 np0005603622 nova_compute[221301]: 2026-01-31 07:56:57.532 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:56:57 np0005603622 nova_compute[221301]: 2026-01-31 07:56:57.751 221324 DEBUG nova.scheduler.client.report [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:56:58 np0005603622 nova_compute[221301]: 2026-01-31 07:56:58.271 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:58 np0005603622 nova_compute[221301]: 2026-01-31 07:56:58.272 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:56:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:56:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:56:58.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:56:58 np0005603622 nova_compute[221301]: 2026-01-31 07:56:58.886 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:56:58 np0005603622 nova_compute[221301]: 2026-01-31 07:56:58.886 221324 DEBUG nova.network.neutron [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:56:58 np0005603622 nova_compute[221301]: 2026-01-31 07:56:58.948 221324 INFO nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.100 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.135 221324 DEBUG nova.policy [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'af90832a37cd4b10b02397d75b4802c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '200fedf7dc594b4da20c0397b4581e89', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:56:59 np0005603622 podman[235682]: 2026-01-31 07:56:59.198037082 +0000 UTC m=+0.114060370 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 31 02:56:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:56:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:56:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:56:59.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.701 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.702 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.703 221324 INFO nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Creating image(s)#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.734 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.765 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.796 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.802 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:56:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.868 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.869 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.870 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.870 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.895 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:56:59 np0005603622 nova_compute[221301]: 2026-01-31 07:56:59.899 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:00 np0005603622 nova_compute[221301]: 2026-01-31 07:57:00.232 221324 DEBUG nova.network.neutron [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Successfully created port: f4ace9f4-2b8c-40bc-9098-97bc720826ac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:57:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:00.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:00 np0005603622 nova_compute[221301]: 2026-01-31 07:57:00.557 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:01.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:01 np0005603622 nova_compute[221301]: 2026-01-31 07:57:01.796 221324 DEBUG nova.network.neutron [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Successfully updated port: f4ace9f4-2b8c-40bc-9098-97bc720826ac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:57:01 np0005603622 nova_compute[221301]: 2026-01-31 07:57:01.969 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "refresh_cache-610866c1-bc50-4e42-a6bc-e4263d2b424e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:57:01 np0005603622 nova_compute[221301]: 2026-01-31 07:57:01.969 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquired lock "refresh_cache-610866c1-bc50-4e42-a6bc-e4263d2b424e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:57:01 np0005603622 nova_compute[221301]: 2026-01-31 07:57:01.969 221324 DEBUG nova.network.neutron [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:57:02 np0005603622 nova_compute[221301]: 2026-01-31 07:57:02.155 221324 DEBUG nova.network.neutron [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:57:02 np0005603622 nova_compute[221301]: 2026-01-31 07:57:02.191 221324 DEBUG nova.compute.manager [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received event network-changed-f4ace9f4-2b8c-40bc-9098-97bc720826ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:02 np0005603622 nova_compute[221301]: 2026-01-31 07:57:02.192 221324 DEBUG nova.compute.manager [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Refreshing instance network info cache due to event network-changed-f4ace9f4-2b8c-40bc-9098-97bc720826ac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:57:02 np0005603622 nova_compute[221301]: 2026-01-31 07:57:02.192 221324 DEBUG oslo_concurrency.lockutils [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-610866c1-bc50-4e42-a6bc-e4263d2b424e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:57:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:02.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:02 np0005603622 nova_compute[221301]: 2026-01-31 07:57:02.533 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:02 np0005603622 nova_compute[221301]: 2026-01-31 07:57:02.760 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.861s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:02 np0005603622 nova_compute[221301]: 2026-01-31 07:57:02.835 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] resizing rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:57:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:03.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:04 np0005603622 nova_compute[221301]: 2026-01-31 07:57:04.029 221324 DEBUG nova.objects.instance [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lazy-loading 'migration_context' on Instance uuid 610866c1-bc50-4e42-a6bc-e4263d2b424e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:57:04 np0005603622 nova_compute[221301]: 2026-01-31 07:57:04.046 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:57:04 np0005603622 nova_compute[221301]: 2026-01-31 07:57:04.046 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Ensure instance console log exists: /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:57:04 np0005603622 nova_compute[221301]: 2026-01-31 07:57:04.047 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:04 np0005603622 nova_compute[221301]: 2026-01-31 07:57:04.047 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:04 np0005603622 nova_compute[221301]: 2026-01-31 07:57:04.048 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 02:57:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:04.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 02:57:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.105 221324 DEBUG nova.network.neutron [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Updating instance_info_cache with network_info: [{"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.177 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Releasing lock "refresh_cache-610866c1-bc50-4e42-a6bc-e4263d2b424e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.178 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Instance network_info: |[{"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.179 221324 DEBUG oslo_concurrency.lockutils [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-610866c1-bc50-4e42-a6bc-e4263d2b424e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.179 221324 DEBUG nova.network.neutron [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Refreshing network info cache for port f4ace9f4-2b8c-40bc-9098-97bc720826ac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.183 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Start _get_guest_xml network_info=[{"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.190 221324 WARNING nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.196 221324 DEBUG nova.virt.libvirt.host [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.197 221324 DEBUG nova.virt.libvirt.host [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.202 221324 DEBUG nova.virt.libvirt.host [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.203 221324 DEBUG nova.virt.libvirt.host [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.204 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.205 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.205 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.205 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.206 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.206 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.206 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.206 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.207 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.207 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.207 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.208 221324 DEBUG nova.virt.hardware [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.211 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:05.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.559 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:57:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3521001212' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.749 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.810 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:57:05 np0005603622 nova_compute[221301]: 2026-01-31 07:57:05.820 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:57:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1607777860' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.246 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.249 221324 DEBUG nova.virt.libvirt.vif [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:56:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-998124970',display_name='tempest-ImagesNegativeTestJSON-server-998124970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-998124970',id=41,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='200fedf7dc594b4da20c0397b4581e89',ramdisk_id='',reservation_id='r-2t1btgek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1409118820',owner_user_name='tempest-ImagesNegativeTest
JSON-1409118820-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:56:59Z,user_data=None,user_id='af90832a37cd4b10b02397d75b4802c9',uuid=610866c1-bc50-4e42-a6bc-e4263d2b424e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.250 221324 DEBUG nova.network.os_vif_util [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Converting VIF {"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.252 221324 DEBUG nova.network.os_vif_util [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:fa:2e,bridge_name='br-int',has_traffic_filtering=True,id=f4ace9f4-2b8c-40bc-9098-97bc720826ac,network=Network(899802b0-9c47-42dc-b642-6fddcc730f66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4ace9f4-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.254 221324 DEBUG nova.objects.instance [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lazy-loading 'pci_devices' on Instance uuid 610866c1-bc50-4e42-a6bc-e4263d2b424e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.303 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <uuid>610866c1-bc50-4e42-a6bc-e4263d2b424e</uuid>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <name>instance-00000029</name>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <nova:name>tempest-ImagesNegativeTestJSON-server-998124970</nova:name>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:57:05</nova:creationTime>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:user uuid="af90832a37cd4b10b02397d75b4802c9">tempest-ImagesNegativeTestJSON-1409118820-project-member</nova:user>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:project uuid="200fedf7dc594b4da20c0397b4581e89">tempest-ImagesNegativeTestJSON-1409118820</nova:project>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <nova:port uuid="f4ace9f4-2b8c-40bc-9098-97bc720826ac">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <entry name="serial">610866c1-bc50-4e42-a6bc-e4263d2b424e</entry>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <entry name="uuid">610866c1-bc50-4e42-a6bc-e4263d2b424e</entry>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/610866c1-bc50-4e42-a6bc-e4263d2b424e_disk">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/610866c1-bc50-4e42-a6bc-e4263d2b424e_disk.config">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:04:fa:2e"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <target dev="tapf4ace9f4-2b"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/console.log" append="off"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:57:06 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:57:06 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:57:06 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:57:06 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.305 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Preparing to wait for external event network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.306 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.306 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.307 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.308 221324 DEBUG nova.virt.libvirt.vif [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:56:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-998124970',display_name='tempest-ImagesNegativeTestJSON-server-998124970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-998124970',id=41,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='200fedf7dc594b4da20c0397b4581e89',ramdisk_id='',reservation_id='r-2t1btgek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1409118820',owner_user_name='tempest-ImagesNegativeTestJSON-1409118820-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:56:59Z,user_data=None,user_id='af90832a37cd4b10b02397d75b4802c9',uuid=610866c1-bc50-4e42-a6bc-e4263d2b424e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.308 221324 DEBUG nova.network.os_vif_util [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Converting VIF {"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.309 221324 DEBUG nova.network.os_vif_util [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:fa:2e,bridge_name='br-int',has_traffic_filtering=True,id=f4ace9f4-2b8c-40bc-9098-97bc720826ac,network=Network(899802b0-9c47-42dc-b642-6fddcc730f66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4ace9f4-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.310 221324 DEBUG os_vif [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fa:2e,bridge_name='br-int',has_traffic_filtering=True,id=f4ace9f4-2b8c-40bc-9098-97bc720826ac,network=Network(899802b0-9c47-42dc-b642-6fddcc730f66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4ace9f4-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.311 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.312 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.313 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.321 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.322 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4ace9f4-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.322 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf4ace9f4-2b, col_values=(('external_ids', {'iface-id': 'f4ace9f4-2b8c-40bc-9098-97bc720826ac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:fa:2e', 'vm-uuid': '610866c1-bc50-4e42-a6bc-e4263d2b424e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.325 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:06 np0005603622 NetworkManager[49080]: <info>  [1769846226.3284] manager: (tapf4ace9f4-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.330 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.334 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:06.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.335 221324 INFO os_vif [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fa:2e,bridge_name='br-int',has_traffic_filtering=True,id=f4ace9f4-2b8c-40bc-9098-97bc720826ac,network=Network(899802b0-9c47-42dc-b642-6fddcc730f66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4ace9f4-2b')#033[00m
Jan 31 02:57:06 np0005603622 podman[235938]: 2026-01-31 07:57:06.412140898 +0000 UTC m=+0.050891546 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.525 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.525 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.527 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] No VIF found with MAC fa:16:3e:04:fa:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.528 221324 INFO nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Using config drive#033[00m
Jan 31 02:57:06 np0005603622 nova_compute[221301]: 2026-01-31 07:57:06.559 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.332 221324 INFO nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Creating config drive at /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/disk.config#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.336 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpd5_mbvw7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:07.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.458 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpd5_mbvw7" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.501 221324 DEBUG nova.storage.rbd_utils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] rbd image 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.507 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/disk.config 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.536 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.820 221324 DEBUG oslo_concurrency.processutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/disk.config 610866c1-bc50-4e42-a6bc-e4263d2b424e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.313s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.820 221324 INFO nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Deleting local config drive /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e/disk.config because it was imported into RBD.#033[00m
Jan 31 02:57:07 np0005603622 kernel: tapf4ace9f4-2b: entered promiscuous mode
Jan 31 02:57:07 np0005603622 NetworkManager[49080]: <info>  [1769846227.8839] manager: (tapf4ace9f4-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Jan 31 02:57:07 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:07Z|00113|binding|INFO|Claiming lport f4ace9f4-2b8c-40bc-9098-97bc720826ac for this chassis.
Jan 31 02:57:07 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:07Z|00114|binding|INFO|f4ace9f4-2b8c-40bc-9098-97bc720826ac: Claiming fa:16:3e:04:fa:2e 10.100.0.5
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.889 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.896 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:07 np0005603622 systemd-machined[190406]: New machine qemu-17-instance-00000029.
Jan 31 02:57:07 np0005603622 systemd-udevd[236028]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:57:07 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:07Z|00115|binding|INFO|Setting lport f4ace9f4-2b8c-40bc-9098-97bc720826ac ovn-installed in OVS
Jan 31 02:57:07 np0005603622 systemd[1]: Started Virtual Machine qemu-17-instance-00000029.
Jan 31 02:57:07 np0005603622 nova_compute[221301]: 2026-01-31 07:57:07.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:07 np0005603622 NetworkManager[49080]: <info>  [1769846227.9504] device (tapf4ace9f4-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:57:07 np0005603622 NetworkManager[49080]: <info>  [1769846227.9517] device (tapf4ace9f4-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.114 221324 DEBUG nova.network.neutron [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Updated VIF entry in instance network info cache for port f4ace9f4-2b8c-40bc-9098-97bc720826ac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.115 221324 DEBUG nova.network.neutron [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Updating instance_info_cache with network_info: [{"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:08 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:08Z|00116|binding|INFO|Setting lport f4ace9f4-2b8c-40bc-9098-97bc720826ac up in Southbound
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.175 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:fa:2e 10.100.0.5'], port_security=['fa:16:3e:04:fa:2e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '610866c1-bc50-4e42-a6bc-e4263d2b424e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-899802b0-9c47-42dc-b642-6fddcc730f66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '200fedf7dc594b4da20c0397b4581e89', 'neutron:revision_number': '2', 'neutron:security_group_ids': '538d5863-e20d-4bbe-b690-6e5c41471239', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad24e79e-ef80-49be-b35c-0a3db6dcdbfb, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f4ace9f4-2b8c-40bc-9098-97bc720826ac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.180 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f4ace9f4-2b8c-40bc-9098-97bc720826ac in datapath 899802b0-9c47-42dc-b642-6fddcc730f66 bound to our chassis#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.185 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 899802b0-9c47-42dc-b642-6fddcc730f66#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.200 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c59aece2-73e2-4038-b8df-5752dd614df1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.202 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap899802b0-91 in ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.206 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap899802b0-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.206 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d91ea894-ea8a-4d7b-9850-dad896484a6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.208 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6dea40fd-40d7-40b9-baa0-46c50fca748b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.228 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e60391-a6c9-4d51-8ad1-f8c00b5b36e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.242 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f098fcb1-79a6-41ce-b09e-056a2c14aeb6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.264 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0186f7-d911-4948-9c21-c85c5234d677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.273 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[73903970-ac08-4b12-b289-bc6e1f6e3f37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 NetworkManager[49080]: <info>  [1769846228.2741] manager: (tap899802b0-90): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.297 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b824ac-c83c-492f-8d51-14a45e328958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.301 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7fa718-b6ba-490b-b971-5908a40494cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 NetworkManager[49080]: <info>  [1769846228.3217] device (tap899802b0-90): carrier: link connected
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.326 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5024cc-6e1f-4854-9821-aa4efc4ee260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:08.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.345 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ae28142c-ebf0-412e-862b-79941c9ae6a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap899802b0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:cc:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547966, 'reachable_time': 43144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236079, 'error': None, 'target': 'ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.364 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[37ee8a61-9a37-4f83-96a8-0bdd6f885be3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:cc0b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547966, 'tstamp': 547966}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236095, 'error': None, 'target': 'ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.381 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5bda4a09-1b16-421b-884e-c34e16b3467b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap899802b0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:cc:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547966, 'reachable_time': 43144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236097, 'error': None, 'target': 'ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.405 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[02594cb7-c98f-4842-80c5-2e3c52e26570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.468 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5976798b-f7a8-4f0d-892e-b1d647fc9908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.470 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap899802b0-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.470 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.471 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap899802b0-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:08 np0005603622 NetworkManager[49080]: <info>  [1769846228.4738] manager: (tap899802b0-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 31 02:57:08 np0005603622 kernel: tap899802b0-90: entered promiscuous mode
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.476 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap899802b0-90, col_values=(('external_ids', {'iface-id': '37beb825-0b1f-469a-9eb0-0cf33343d282'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:08 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:08Z|00117|binding|INFO|Releasing lport 37beb825-0b1f-469a-9eb0-0cf33343d282 from this chassis (sb_readonly=0)
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.478 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.483 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.484 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/899802b0-9c47-42dc-b642-6fddcc730f66.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/899802b0-9c47-42dc-b642-6fddcc730f66.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.485 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed6e9d9-a9d1-4201-acda-e090391311a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.486 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-899802b0-9c47-42dc-b642-6fddcc730f66
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/899802b0-9c47-42dc-b642-6fddcc730f66.pid.haproxy
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 899802b0-9c47-42dc-b642-6fddcc730f66
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:57:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:08.487 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66', 'env', 'PROCESS_TAG=haproxy-899802b0-9c47-42dc-b642-6fddcc730f66', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/899802b0-9c47-42dc-b642-6fddcc730f66.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.543 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846228.5426123, 610866c1-bc50-4e42-a6bc-e4263d2b424e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.543 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] VM Started (Lifecycle Event)#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.624 221324 DEBUG oslo_concurrency.lockutils [req-35ec1444-9133-4779-91cf-326070e4b1b8 req-9f0c826f-4416-49eb-847c-cb56c5a46a69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-610866c1-bc50-4e42-a6bc-e4263d2b424e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.845 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.849 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846228.5429037, 610866c1-bc50-4e42-a6bc-e4263d2b424e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:08 np0005603622 nova_compute[221301]: 2026-01-31 07:57:08.850 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:57:08 np0005603622 podman[236137]: 2026-01-31 07:57:08.92806378 +0000 UTC m=+0.069609088 container create e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:57:08 np0005603622 systemd[1]: Started libpod-conmon-e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453.scope.
Jan 31 02:57:08 np0005603622 podman[236137]: 2026-01-31 07:57:08.893186184 +0000 UTC m=+0.034731472 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:57:08 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:57:09 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/847a533526b58d9163326b66bb6fb5e4c9af146c5dbfdd959715228a22882f07/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:57:09 np0005603622 podman[236137]: 2026-01-31 07:57:09.018239898 +0000 UTC m=+0.159785246 container init e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:57:09 np0005603622 nova_compute[221301]: 2026-01-31 07:57:09.020 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:09 np0005603622 podman[236137]: 2026-01-31 07:57:09.023728795 +0000 UTC m=+0.165274103 container start e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 02:57:09 np0005603622 nova_compute[221301]: 2026-01-31 07:57:09.026 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:57:09 np0005603622 neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66[236152]: [NOTICE]   (236156) : New worker (236158) forked
Jan 31 02:57:09 np0005603622 neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66[236152]: [NOTICE]   (236156) : Loading success.
Jan 31 02:57:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:09.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:09.437 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:57:09 np0005603622 nova_compute[221301]: 2026-01-31 07:57:09.438 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:09.439 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:57:09 np0005603622 nova_compute[221301]: 2026-01-31 07:57:09.559 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:57:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:10.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.489 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "31c35869-1596-49c7-8380-fb1f3716c194" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.490 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "31c35869-1596-49c7-8380-fb1f3716c194" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.490 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "31c35869-1596-49c7-8380-fb1f3716c194-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.490 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "31c35869-1596-49c7-8380-fb1f3716c194-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.490 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "31c35869-1596-49c7-8380-fb1f3716c194-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.492 221324 INFO nova.compute.manager [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Terminating instance#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.492 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.492 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquired lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:57:10 np0005603622 nova_compute[221301]: 2026-01-31 07:57:10.493 221324 DEBUG nova.network.neutron [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.106 221324 DEBUG nova.network.neutron [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.325 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.373 221324 DEBUG nova.network.neutron [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:11.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.570 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Releasing lock "refresh_cache-31c35869-1596-49c7-8380-fb1f3716c194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.570 221324 DEBUG nova.compute.manager [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.632 221324 DEBUG nova.compute.manager [req-f402c1f0-bc9d-43a4-a8f0-3067679f0db4 req-2c45eca0-2e06-4525-bac4-9aeec70ca0af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received event network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.634 221324 DEBUG oslo_concurrency.lockutils [req-f402c1f0-bc9d-43a4-a8f0-3067679f0db4 req-2c45eca0-2e06-4525-bac4-9aeec70ca0af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.635 221324 DEBUG oslo_concurrency.lockutils [req-f402c1f0-bc9d-43a4-a8f0-3067679f0db4 req-2c45eca0-2e06-4525-bac4-9aeec70ca0af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.635 221324 DEBUG oslo_concurrency.lockutils [req-f402c1f0-bc9d-43a4-a8f0-3067679f0db4 req-2c45eca0-2e06-4525-bac4-9aeec70ca0af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.636 221324 DEBUG nova.compute.manager [req-f402c1f0-bc9d-43a4-a8f0-3067679f0db4 req-2c45eca0-2e06-4525-bac4-9aeec70ca0af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Processing event network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.638 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.644 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846231.6443248, 610866c1-bc50-4e42-a6bc-e4263d2b424e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.645 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.649 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.655 221324 INFO nova.virt.libvirt.driver [-] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Instance spawned successfully.#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.656 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:57:11 np0005603622 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Jan 31 02:57:11 np0005603622 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001f.scope: Consumed 17.327s CPU time.
Jan 31 02:57:11 np0005603622 systemd-machined[190406]: Machine qemu-14-instance-0000001f terminated.
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.978 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.990 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.991 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.992 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.993 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.996 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:57:11 np0005603622 nova_compute[221301]: 2026-01-31 07:57:11.997 221324 DEBUG nova.virt.libvirt.driver [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.002 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.002 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.006 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.014 221324 INFO nova.virt.libvirt.driver [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance destroyed successfully.#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.015 221324 DEBUG nova.objects.instance [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lazy-loading 'resources' on Instance uuid 31c35869-1596-49c7-8380-fb1f3716c194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.104 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.218 221324 INFO nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Took 12.52 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.219 221324 DEBUG nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:12.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.530 221324 INFO nova.compute.manager [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Took 16.35 seconds to build instance.#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.573 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:12 np0005603622 nova_compute[221301]: 2026-01-31 07:57:12.578 221324 DEBUG oslo_concurrency.lockutils [None req-751c1481-5cb6-49cf-9f53-e39070005a05 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:13.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:13 np0005603622 nova_compute[221301]: 2026-01-31 07:57:13.861 221324 DEBUG nova.compute.manager [req-9512388d-e259-4140-ab1d-ca7aa8935d47 req-e0273bc0-268c-4c51-a6da-bc39beb952e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received event network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:13 np0005603622 nova_compute[221301]: 2026-01-31 07:57:13.862 221324 DEBUG oslo_concurrency.lockutils [req-9512388d-e259-4140-ab1d-ca7aa8935d47 req-e0273bc0-268c-4c51-a6da-bc39beb952e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:13 np0005603622 nova_compute[221301]: 2026-01-31 07:57:13.863 221324 DEBUG oslo_concurrency.lockutils [req-9512388d-e259-4140-ab1d-ca7aa8935d47 req-e0273bc0-268c-4c51-a6da-bc39beb952e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:13 np0005603622 nova_compute[221301]: 2026-01-31 07:57:13.863 221324 DEBUG oslo_concurrency.lockutils [req-9512388d-e259-4140-ab1d-ca7aa8935d47 req-e0273bc0-268c-4c51-a6da-bc39beb952e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:13 np0005603622 nova_compute[221301]: 2026-01-31 07:57:13.863 221324 DEBUG nova.compute.manager [req-9512388d-e259-4140-ab1d-ca7aa8935d47 req-e0273bc0-268c-4c51-a6da-bc39beb952e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] No waiting events found dispatching network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:57:13 np0005603622 nova_compute[221301]: 2026-01-31 07:57:13.864 221324 WARNING nova.compute.manager [req-9512388d-e259-4140-ab1d-ca7aa8935d47 req-e0273bc0-268c-4c51-a6da-bc39beb952e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received unexpected event network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac for instance with vm_state active and task_state None.#033[00m
Jan 31 02:57:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:14.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:15.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.598 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.599 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.599 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.600 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.600 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.601 221324 INFO nova.compute.manager [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Terminating instance#033[00m
Jan 31 02:57:15 np0005603622 nova_compute[221301]: 2026-01-31 07:57:15.603 221324 DEBUG nova.compute.manager [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:57:16 np0005603622 kernel: tapf4ace9f4-2b (unregistering): left promiscuous mode
Jan 31 02:57:16 np0005603622 NetworkManager[49080]: <info>  [1769846236.0600] device (tapf4ace9f4-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.073 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:16 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:16Z|00118|binding|INFO|Releasing lport f4ace9f4-2b8c-40bc-9098-97bc720826ac from this chassis (sb_readonly=0)
Jan 31 02:57:16 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:16Z|00119|binding|INFO|Setting lport f4ace9f4-2b8c-40bc-9098-97bc720826ac down in Southbound
Jan 31 02:57:16 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:16Z|00120|binding|INFO|Removing iface tapf4ace9f4-2b ovn-installed in OVS
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.077 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.083 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.104 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:fa:2e 10.100.0.5'], port_security=['fa:16:3e:04:fa:2e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '610866c1-bc50-4e42-a6bc-e4263d2b424e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-899802b0-9c47-42dc-b642-6fddcc730f66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '200fedf7dc594b4da20c0397b4581e89', 'neutron:revision_number': '4', 'neutron:security_group_ids': '538d5863-e20d-4bbe-b690-6e5c41471239', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad24e79e-ef80-49be-b35c-0a3db6dcdbfb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f4ace9f4-2b8c-40bc-9098-97bc720826ac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.106 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f4ace9f4-2b8c-40bc-9098-97bc720826ac in datapath 899802b0-9c47-42dc-b642-6fddcc730f66 unbound from our chassis#033[00m
Jan 31 02:57:16 np0005603622 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000029.scope: Deactivated successfully.
Jan 31 02:57:16 np0005603622 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000029.scope: Consumed 4.598s CPU time.
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.108 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 899802b0-9c47-42dc-b642-6fddcc730f66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.109 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[84296965-ded4-499d-bc62-33e249a3284c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.110 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66 namespace which is not needed anymore#033[00m
Jan 31 02:57:16 np0005603622 systemd-machined[190406]: Machine qemu-17-instance-00000029 terminated.
Jan 31 02:57:16 np0005603622 neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66[236152]: [NOTICE]   (236156) : haproxy version is 2.8.14-c23fe91
Jan 31 02:57:16 np0005603622 neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66[236152]: [NOTICE]   (236156) : path to executable is /usr/sbin/haproxy
Jan 31 02:57:16 np0005603622 neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66[236152]: [WARNING]  (236156) : Exiting Master process...
Jan 31 02:57:16 np0005603622 neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66[236152]: [ALERT]    (236156) : Current worker (236158) exited with code 143 (Terminated)
Jan 31 02:57:16 np0005603622 neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66[236152]: [WARNING]  (236156) : All workers exited. Exiting... (0)
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.239 221324 INFO nova.virt.libvirt.driver [-] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Instance destroyed successfully.#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.240 221324 DEBUG nova.objects.instance [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lazy-loading 'resources' on Instance uuid 610866c1-bc50-4e42-a6bc-e4263d2b424e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:57:16 np0005603622 systemd[1]: libpod-e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453.scope: Deactivated successfully.
Jan 31 02:57:16 np0005603622 podman[236344]: 2026-01-31 07:57:16.250681698 +0000 UTC m=+0.050533016 container died e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 02:57:16 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453-userdata-shm.mount: Deactivated successfully.
Jan 31 02:57:16 np0005603622 systemd[1]: var-lib-containers-storage-overlay-847a533526b58d9163326b66bb6fb5e4c9af146c5dbfdd959715228a22882f07-merged.mount: Deactivated successfully.
Jan 31 02:57:16 np0005603622 podman[236344]: 2026-01-31 07:57:16.293726129 +0000 UTC m=+0.093577447 container cleanup e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 02:57:16 np0005603622 systemd[1]: libpod-conmon-e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453.scope: Deactivated successfully.
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.316 221324 DEBUG nova.virt.libvirt.vif [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:56:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-998124970',display_name='tempest-ImagesNegativeTestJSON-server-998124970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-998124970',id=41,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:57:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='200fedf7dc594b4da20c0397b4581e89',ramdisk_id='',reservation_id='r-2t1btgek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-1409118820',owner_user_name='tempest-ImagesNegativeTestJSON-1409118820-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:57:12Z,user_data=None,user_id='af90832a37cd4b10b02397d75b4802c9',uuid=610866c1-bc50-4e42-a6bc-e4263d2b424e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.316 221324 DEBUG nova.network.os_vif_util [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Converting VIF {"id": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "address": "fa:16:3e:04:fa:2e", "network": {"id": "899802b0-9c47-42dc-b642-6fddcc730f66", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-844112012-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "200fedf7dc594b4da20c0397b4581e89", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4ace9f4-2b", "ovs_interfaceid": "f4ace9f4-2b8c-40bc-9098-97bc720826ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.317 221324 DEBUG nova.network.os_vif_util [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:fa:2e,bridge_name='br-int',has_traffic_filtering=True,id=f4ace9f4-2b8c-40bc-9098-97bc720826ac,network=Network(899802b0-9c47-42dc-b642-6fddcc730f66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4ace9f4-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.318 221324 DEBUG os_vif [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fa:2e,bridge_name='br-int',has_traffic_filtering=True,id=f4ace9f4-2b8c-40bc-9098-97bc720826ac,network=Network(899802b0-9c47-42dc-b642-6fddcc730f66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4ace9f4-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.321 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.321 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4ace9f4-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.330 221324 INFO os_vif [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:fa:2e,bridge_name='br-int',has_traffic_filtering=True,id=f4ace9f4-2b8c-40bc-9098-97bc720826ac,network=Network(899802b0-9c47-42dc-b642-6fddcc730f66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4ace9f4-2b')#033[00m
Jan 31 02:57:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:16.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:16 np0005603622 podman[236383]: 2026-01-31 07:57:16.359861878 +0000 UTC m=+0.048432488 container remove e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.365 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[962c00e9-fb39-4b91-9175-c75996bdadf6]: (4, ('Sat Jan 31 07:57:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66 (e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453)\ne7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453\nSat Jan 31 07:57:16 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66 (e7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453)\ne7599b611b2850d6737dd2496a25676f1b8d0d7e6ba4556a20134fe6f9966453\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.368 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8453f6aa-ad62-40ce-a136-771c38a585a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.368 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap899802b0-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:16 np0005603622 kernel: tap899802b0-90: left promiscuous mode
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.371 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.378 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.382 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d310b92d-7b11-4766-8af3-8813db4831f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.406 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f54ce0-e9cb-41bf-8d61-298ece034f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.408 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[98507441-d657-4b77-a3b4-ce96f11b1c63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.420 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5155fefb-180c-49b7-8e91-d0f0fc6e405c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547960, 'reachable_time': 21418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236416, 'error': None, 'target': 'ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.422 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-899802b0-9c47-42dc-b642-6fddcc730f66 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:57:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:16.423 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[454f59cc-4ac6-4b56-94c1-95e14e57fe5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:57:16 np0005603622 systemd[1]: run-netns-ovnmeta\x2d899802b0\x2d9c47\x2d42dc\x2db642\x2d6fddcc730f66.mount: Deactivated successfully.
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.479 221324 DEBUG nova.compute.manager [req-f3df8501-c550-42c0-b003-fb22412603e6 req-5ce11069-1e23-4c8f-a77b-f1e0759911f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received event network-vif-unplugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.479 221324 DEBUG oslo_concurrency.lockutils [req-f3df8501-c550-42c0-b003-fb22412603e6 req-5ce11069-1e23-4c8f-a77b-f1e0759911f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.481 221324 DEBUG oslo_concurrency.lockutils [req-f3df8501-c550-42c0-b003-fb22412603e6 req-5ce11069-1e23-4c8f-a77b-f1e0759911f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.482 221324 DEBUG oslo_concurrency.lockutils [req-f3df8501-c550-42c0-b003-fb22412603e6 req-5ce11069-1e23-4c8f-a77b-f1e0759911f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.482 221324 DEBUG nova.compute.manager [req-f3df8501-c550-42c0-b003-fb22412603e6 req-5ce11069-1e23-4c8f-a77b-f1e0759911f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] No waiting events found dispatching network-vif-unplugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:57:16 np0005603622 nova_compute[221301]: 2026-01-31 07:57:16.482 221324 DEBUG nova.compute.manager [req-f3df8501-c550-42c0-b003-fb22412603e6 req-5ce11069-1e23-4c8f-a77b-f1e0759911f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received event network-vif-unplugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:57:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:17.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:17 np0005603622 nova_compute[221301]: 2026-01-31 07:57:17.576 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:18.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:18 np0005603622 nova_compute[221301]: 2026-01-31 07:57:18.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:18.441 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:57:18 np0005603622 nova_compute[221301]: 2026-01-31 07:57:18.644 221324 DEBUG nova.compute.manager [req-3123d3a9-48a8-4fbe-b3fc-f685f4c64a4e req-ab9fcd05-ea36-473c-9e82-100d4b4c3b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received event network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:18 np0005603622 nova_compute[221301]: 2026-01-31 07:57:18.645 221324 DEBUG oslo_concurrency.lockutils [req-3123d3a9-48a8-4fbe-b3fc-f685f4c64a4e req-ab9fcd05-ea36-473c-9e82-100d4b4c3b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:18 np0005603622 nova_compute[221301]: 2026-01-31 07:57:18.645 221324 DEBUG oslo_concurrency.lockutils [req-3123d3a9-48a8-4fbe-b3fc-f685f4c64a4e req-ab9fcd05-ea36-473c-9e82-100d4b4c3b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:18 np0005603622 nova_compute[221301]: 2026-01-31 07:57:18.645 221324 DEBUG oslo_concurrency.lockutils [req-3123d3a9-48a8-4fbe-b3fc-f685f4c64a4e req-ab9fcd05-ea36-473c-9e82-100d4b4c3b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:18 np0005603622 nova_compute[221301]: 2026-01-31 07:57:18.645 221324 DEBUG nova.compute.manager [req-3123d3a9-48a8-4fbe-b3fc-f685f4c64a4e req-ab9fcd05-ea36-473c-9e82-100d4b4c3b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] No waiting events found dispatching network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:57:18 np0005603622 nova_compute[221301]: 2026-01-31 07:57:18.645 221324 WARNING nova.compute.manager [req-3123d3a9-48a8-4fbe-b3fc-f685f4c64a4e req-ab9fcd05-ea36-473c-9e82-100d4b4c3b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received unexpected event network-vif-plugged-f4ace9f4-2b8c-40bc-9098-97bc720826ac for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:57:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:57:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:57:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:19.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:19 np0005603622 nova_compute[221301]: 2026-01-31 07:57:19.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:19 np0005603622 nova_compute[221301]: 2026-01-31 07:57:19.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:57:19 np0005603622 nova_compute[221301]: 2026-01-31 07:57:19.648 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:57:19 np0005603622 nova_compute[221301]: 2026-01-31 07:57:19.649 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:57:19 np0005603622 nova_compute[221301]: 2026-01-31 07:57:19.649 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:57:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:20.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:20 np0005603622 nova_compute[221301]: 2026-01-31 07:57:20.684 221324 INFO nova.virt.libvirt.driver [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Deleting instance files /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194_del#033[00m
Jan 31 02:57:20 np0005603622 nova_compute[221301]: 2026-01-31 07:57:20.686 221324 INFO nova.virt.libvirt.driver [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Deletion of /var/lib/nova/instances/31c35869-1596-49c7-8380-fb1f3716c194_del complete#033[00m
Jan 31 02:57:20 np0005603622 nova_compute[221301]: 2026-01-31 07:57:20.907 221324 INFO nova.compute.manager [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Took 9.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:57:20 np0005603622 nova_compute[221301]: 2026-01-31 07:57:20.909 221324 DEBUG oslo.service.loopingcall [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:57:20 np0005603622 nova_compute[221301]: 2026-01-31 07:57:20.910 221324 DEBUG nova.compute.manager [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:57:20 np0005603622 nova_compute[221301]: 2026-01-31 07:57:20.910 221324 DEBUG nova.network.neutron [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.097 221324 DEBUG nova.network.neutron [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.136 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Updating instance_info_cache with network_info: [{"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.274 221324 INFO nova.virt.libvirt.driver [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Deleting instance files /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e_del#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.275 221324 INFO nova.virt.libvirt.driver [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Deletion of /var/lib/nova/instances/610866c1-bc50-4e42-a6bc-e4263d2b424e_del complete#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.345 221324 DEBUG nova.network.neutron [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:21.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.419 221324 INFO nova.compute.manager [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Took 0.51 seconds to deallocate network for instance.#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.481 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.481 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.483 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.484 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.484 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.485 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.638 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.638 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.639 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.639 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.639 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.669 221324 INFO nova.compute.manager [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Took 6.07 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.671 221324 DEBUG oslo.service.loopingcall [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.671 221324 DEBUG nova.compute.manager [-] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.672 221324 DEBUG nova.network.neutron [-] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.686 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.687 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:21 np0005603622 nova_compute[221301]: 2026-01-31 07:57:21.784 221324 DEBUG oslo_concurrency.processutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/559306812' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.083 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3979330570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.256 221324 DEBUG oslo_concurrency.processutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.263 221324 DEBUG nova.compute.provider_tree [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:57:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:22.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.446 221324 DEBUG nova.scheduler.client.report [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.456 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.456 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.614 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.648 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.649 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4555MB free_disk=20.81366729736328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.650 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.988 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:22 np0005603622 nova_compute[221301]: 2026-01-31 07:57:22.994 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.129 221324 INFO nova.scheduler.client.report [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Deleted allocations for instance 31c35869-1596-49c7-8380-fb1f3716c194#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.249 221324 DEBUG nova.compute.manager [req-c44eaeb6-8241-435f-b017-82aff3738b2c req-5ea2e5c9-cca4-4f0e-a5f6-169fa7a7f5b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Received event network-vif-deleted-f4ace9f4-2b8c-40bc-9098-97bc720826ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.249 221324 INFO nova.compute.manager [req-c44eaeb6-8241-435f-b017-82aff3738b2c req-5ea2e5c9-cca4-4f0e-a5f6-169fa7a7f5b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Neutron deleted interface f4ace9f4-2b8c-40bc-9098-97bc720826ac; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.250 221324 DEBUG nova.network.neutron [req-c44eaeb6-8241-435f-b017-82aff3738b2c req-5ea2e5c9-cca4-4f0e-a5f6-169fa7a7f5b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.313 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.314 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 610866c1-bc50-4e42-a6bc-e4263d2b424e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.314 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.314 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.333 221324 DEBUG nova.network.neutron [-] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.394 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:23.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.452 221324 DEBUG nova.compute.manager [req-c44eaeb6-8241-435f-b017-82aff3738b2c req-5ea2e5c9-cca4-4f0e-a5f6-169fa7a7f5b6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Detach interface failed, port_id=f4ace9f4-2b8c-40bc-9098-97bc720826ac, reason: Instance 610866c1-bc50-4e42-a6bc-e4263d2b424e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.456 221324 INFO nova.compute.manager [-] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Took 1.78 seconds to deallocate network for instance.#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.548 221324 DEBUG oslo_concurrency.lockutils [None req-9b6fe193-6a77-4273-bdbc-759ccb96cc7b d4307bc8a2224140b78ba248cecefe55 b6dca32431594e2682c5d2acb448bbf4 - - default default] Lock "31c35869-1596-49c7-8380-fb1f3716c194" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 13.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.595 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2831150332' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.886 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.891 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:57:23 np0005603622 nova_compute[221301]: 2026-01-31 07:57:23.932 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.048 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.049 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.049 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.317 221324 DEBUG oslo_concurrency.processutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:57:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:24.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:57:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/708728044' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:57:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.821 221324 DEBUG oslo_concurrency.processutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.829 221324 DEBUG nova.compute.provider_tree [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.894 221324 DEBUG nova.scheduler.client.report [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:57:24 np0005603622 nova_compute[221301]: 2026-01-31 07:57:24.977 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:25 np0005603622 nova_compute[221301]: 2026-01-31 07:57:25.031 221324 INFO nova.scheduler.client.report [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Deleted allocations for instance 610866c1-bc50-4e42-a6bc-e4263d2b424e#033[00m
Jan 31 02:57:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:25.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:25 np0005603622 nova_compute[221301]: 2026-01-31 07:57:25.413 221324 DEBUG oslo_concurrency.lockutils [None req-cbdea8ca-ee5d-4960-b351-4d7090f8b130 af90832a37cd4b10b02397d75b4802c9 200fedf7dc594b4da20c0397b4581e89 - - default default] Lock "610866c1-bc50-4e42-a6bc-e4263d2b424e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:26 np0005603622 nova_compute[221301]: 2026-01-31 07:57:26.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:26.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:27 np0005603622 nova_compute[221301]: 2026-01-31 07:57:27.107 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846232.003956, 31c35869-1596-49c7-8380-fb1f3716c194 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:27 np0005603622 nova_compute[221301]: 2026-01-31 07:57:27.107 221324 INFO nova.compute.manager [-] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:57:27 np0005603622 nova_compute[221301]: 2026-01-31 07:57:27.190 221324 DEBUG nova.compute.manager [None req-01c1e052-bbf0-4fa3-8bfb-f208a6e606e5 - - - - - -] [instance: 31c35869-1596-49c7-8380-fb1f3716c194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:27.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:27 np0005603622 nova_compute[221301]: 2026-01-31 07:57:27.617 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:28.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:29.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:30 np0005603622 podman[236508]: 2026-01-31 07:57:30.23542412 +0000 UTC m=+0.144231085 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:57:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:30.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:30.851 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:57:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:30.852 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:57:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:57:30.854 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:57:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:57:31 np0005603622 nova_compute[221301]: 2026-01-31 07:57:31.238 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846236.2363834, 610866c1-bc50-4e42-a6bc-e4263d2b424e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:57:31 np0005603622 nova_compute[221301]: 2026-01-31 07:57:31.240 221324 INFO nova.compute.manager [-] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:57:31 np0005603622 nova_compute[221301]: 2026-01-31 07:57:31.329 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:31 np0005603622 nova_compute[221301]: 2026-01-31 07:57:31.380 221324 DEBUG nova.compute.manager [None req-e81dec91-75b0-496b-aa4f-5b3dc337beaa - - - - - -] [instance: 610866c1-bc50-4e42-a6bc-e4263d2b424e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:57:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:31.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:32.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:32 np0005603622 nova_compute[221301]: 2026-01-31 07:57:32.618 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:33.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:34.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:57:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:35.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:57:36 np0005603622 ovn_controller[130109]: 2026-01-31T07:57:36Z|00121|binding|INFO|Releasing lport 8c531a0f-deeb-4de0-880b-b07ec1cf9103 from this chassis (sb_readonly=0)
Jan 31 02:57:36 np0005603622 nova_compute[221301]: 2026-01-31 07:57:36.274 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:36 np0005603622 nova_compute[221301]: 2026-01-31 07:57:36.331 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:36.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:37 np0005603622 podman[236584]: 2026-01-31 07:57:37.132000989 +0000 UTC m=+0.050289609 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 02:57:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:37.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:37 np0005603622 nova_compute[221301]: 2026-01-31 07:57:37.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:38.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 02:57:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/504981422' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 02:57:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 02:57:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/504981422' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 02:57:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:57:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:39.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:57:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:40.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:41 np0005603622 nova_compute[221301]: 2026-01-31 07:57:41.334 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:41.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:42.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:42 np0005603622 nova_compute[221301]: 2026-01-31 07:57:42.622 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:43.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:44.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:45.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:46 np0005603622 nova_compute[221301]: 2026-01-31 07:57:46.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:57:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:46.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:57:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:47.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:47 np0005603622 nova_compute[221301]: 2026-01-31 07:57:47.625 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:48.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:49.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:50.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:51 np0005603622 nova_compute[221301]: 2026-01-31 07:57:51.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:51.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:52.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:52 np0005603622 nova_compute[221301]: 2026-01-31 07:57:52.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:53.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:54.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:57:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:57:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:55.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:57:56 np0005603622 nova_compute[221301]: 2026-01-31 07:57:56.393 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:56.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:57.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:57 np0005603622 nova_compute[221301]: 2026-01-31 07:57:57.631 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:57:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:57:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:57:58.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:57:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:57:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:57:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:57:59.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:57:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:00.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:01 np0005603622 podman[236604]: 2026-01-31 07:58:01.206958587 +0000 UTC m=+0.130150443 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Jan 31 02:58:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 02:58:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:01.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 02:58:02 np0005603622 nova_compute[221301]: 2026-01-31 07:58:02.034 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:02.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:02 np0005603622 nova_compute[221301]: 2026-01-31 07:58:02.634 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:03.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:04.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:58:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:05.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:58:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:06.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:07 np0005603622 nova_compute[221301]: 2026-01-31 07:58:07.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:07.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:07 np0005603622 nova_compute[221301]: 2026-01-31 07:58:07.635 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:08 np0005603622 podman[236631]: 2026-01-31 07:58:08.130636375 +0000 UTC m=+0.056603071 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 31 02:58:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:08.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:09.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:10.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:11.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:12 np0005603622 nova_compute[221301]: 2026-01-31 07:58:12.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:12 np0005603622 nova_compute[221301]: 2026-01-31 07:58:12.286 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:12.286 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:58:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:12.288 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:58:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:12.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:12 np0005603622 nova_compute[221301]: 2026-01-31 07:58:12.638 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:13.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:14.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:14 np0005603622 nova_compute[221301]: 2026-01-31 07:58:14.960 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:14 np0005603622 nova_compute[221301]: 2026-01-31 07:58:14.961 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:58:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:15.289 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:15.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:16.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:17 np0005603622 nova_compute[221301]: 2026-01-31 07:58:17.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:17 np0005603622 nova_compute[221301]: 2026-01-31 07:58:17.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:17.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:17 np0005603622 nova_compute[221301]: 2026-01-31 07:58:17.680 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.177 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.178 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.178 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.179 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.179 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.181 221324 INFO nova.compute.manager [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Terminating instance#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.183 221324 DEBUG nova.compute.manager [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:58:18 np0005603622 kernel: tap91caa716-dd (unregistering): left promiscuous mode
Jan 31 02:58:18 np0005603622 NetworkManager[49080]: <info>  [1769846298.2405] device (tap91caa716-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:18Z|00122|binding|INFO|Releasing lport 91caa716-ddb3-41b4-a4d3-b18e5c81423b from this chassis (sb_readonly=0)
Jan 31 02:58:18 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:18Z|00123|binding|INFO|Setting lport 91caa716-ddb3-41b4-a4d3-b18e5c81423b down in Southbound
Jan 31 02:58:18 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:18Z|00124|binding|INFO|Removing iface tap91caa716-dd ovn-installed in OVS
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.252 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.260 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:ac:eb 10.100.0.7'], port_security=['fa:16:3e:7c:ac:eb 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c58eaedf-202a-428a-acfb-f0b1291517f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8033316fc42c4926bfd1f8a34b02fa97', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4b3d9baf-bd3e-457e-a5c2-9addbc71d588', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189b55ef-8e14-4c6c-870a-5dba85715c4a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=91caa716-ddb3-41b4-a4d3-b18e5c81423b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.261 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.262 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 91caa716-ddb3-41b4-a4d3-b18e5c81423b in datapath c58eaedf-202a-428a-acfb-f0b1291517f1 unbound from our chassis#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.264 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c58eaedf-202a-428a-acfb-f0b1291517f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.266 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[75e93d94-da79-4afe-a583-cbb99ec31bb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.267 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 namespace which is not needed anymore#033[00m
Jan 31 02:58:18 np0005603622 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 31 02:58:18 np0005603622 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000021.scope: Consumed 19.013s CPU time.
Jan 31 02:58:18 np0005603622 systemd-machined[190406]: Machine qemu-15-instance-00000021 terminated.
Jan 31 02:58:18 np0005603622 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[234623]: [NOTICE]   (234627) : haproxy version is 2.8.14-c23fe91
Jan 31 02:58:18 np0005603622 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[234623]: [NOTICE]   (234627) : path to executable is /usr/sbin/haproxy
Jan 31 02:58:18 np0005603622 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[234623]: [WARNING]  (234627) : Exiting Master process...
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:18 np0005603622 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[234623]: [ALERT]    (234627) : Current worker (234629) exited with code 143 (Terminated)
Jan 31 02:58:18 np0005603622 neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1[234623]: [WARNING]  (234627) : All workers exited. Exiting... (0)
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:18 np0005603622 systemd[1]: libpod-b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e.scope: Deactivated successfully.
Jan 31 02:58:18 np0005603622 podman[236676]: 2026-01-31 07:58:18.403839061 +0000 UTC m=+0.057898806 container died b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.407 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.410 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.419 221324 INFO nova.virt.libvirt.driver [-] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Instance destroyed successfully.#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.420 221324 DEBUG nova.objects.instance [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lazy-loading 'resources' on Instance uuid ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:18 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e-userdata-shm.mount: Deactivated successfully.
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.438 221324 DEBUG nova.virt.libvirt.vif [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:55:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1694929068',display_name='tempest-ServersAdminTestJSON-server-1694929068',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1694929068',id=33,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:56:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8033316fc42c4926bfd1f8a34b02fa97',ramdisk_id='',reservation_id='r-vbdmhf1d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-784933461',owner_user_name='tempest-ServersAdminTestJSON-784933461-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:56:00Z,user_data=None,user_id='93973daeb08c453e90372a79b54b9ede',uuid=ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.438 221324 DEBUG nova.network.os_vif_util [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converting VIF {"id": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "address": "fa:16:3e:7c:ac:eb", "network": {"id": "c58eaedf-202a-428a-acfb-f0b1291517f1", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1332449122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8033316fc42c4926bfd1f8a34b02fa97", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91caa716-dd", "ovs_interfaceid": "91caa716-ddb3-41b4-a4d3-b18e5c81423b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.440 221324 DEBUG nova.network.os_vif_util [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:ac:eb,bridge_name='br-int',has_traffic_filtering=True,id=91caa716-ddb3-41b4-a4d3-b18e5c81423b,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91caa716-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.441 221324 DEBUG os_vif [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:ac:eb,bridge_name='br-int',has_traffic_filtering=True,id=91caa716-ddb3-41b4-a4d3-b18e5c81423b,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91caa716-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:58:18 np0005603622 systemd[1]: var-lib-containers-storage-overlay-114b941803498188fc44ea3ea8c9fe24bd2faabda83d378c8de26e2bc5445907-merged.mount: Deactivated successfully.
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.444 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91caa716-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.448 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.452 221324 INFO os_vif [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:ac:eb,bridge_name='br-int',has_traffic_filtering=True,id=91caa716-ddb3-41b4-a4d3-b18e5c81423b,network=Network(c58eaedf-202a-428a-acfb-f0b1291517f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91caa716-dd')#033[00m
Jan 31 02:58:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:18.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:18 np0005603622 podman[236676]: 2026-01-31 07:58:18.453834561 +0000 UTC m=+0.107894296 container cleanup b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 02:58:18 np0005603622 systemd[1]: libpod-conmon-b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e.scope: Deactivated successfully.
Jan 31 02:58:18 np0005603622 podman[236715]: 2026-01-31 07:58:18.51554406 +0000 UTC m=+0.041969363 container remove b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.519 221324 DEBUG nova.compute.manager [req-b9ad7004-806a-4da1-b6be-2339e4111ae6 req-b4bf7339-e91d-494b-b0b4-6d39b148da87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received event network-vif-unplugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.519 221324 DEBUG oslo_concurrency.lockutils [req-b9ad7004-806a-4da1-b6be-2339e4111ae6 req-b4bf7339-e91d-494b-b0b4-6d39b148da87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.519 221324 DEBUG oslo_concurrency.lockutils [req-b9ad7004-806a-4da1-b6be-2339e4111ae6 req-b4bf7339-e91d-494b-b0b4-6d39b148da87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.520 221324 DEBUG oslo_concurrency.lockutils [req-b9ad7004-806a-4da1-b6be-2339e4111ae6 req-b4bf7339-e91d-494b-b0b4-6d39b148da87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.520 221324 DEBUG nova.compute.manager [req-b9ad7004-806a-4da1-b6be-2339e4111ae6 req-b4bf7339-e91d-494b-b0b4-6d39b148da87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] No waiting events found dispatching network-vif-unplugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.520 221324 DEBUG nova.compute.manager [req-b9ad7004-806a-4da1-b6be-2339e4111ae6 req-b4bf7339-e91d-494b-b0b4-6d39b148da87 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received event network-vif-unplugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.520 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[99b8bad1-22c6-499f-9fee-82033878bf1a]: (4, ('Sat Jan 31 07:58:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 (b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e)\nb636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e\nSat Jan 31 07:58:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 (b636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e)\nb636672d410f82e7a35d1a6f76d23128340e036d5d52a25324d9fc628ff8ee8e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.521 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15f86d81-4e33-4391-a373-c7d917131c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.522 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc58eaedf-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.524 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 kernel: tapc58eaedf-20: left promiscuous mode
Jan 31 02:58:18 np0005603622 nova_compute[221301]: 2026-01-31 07:58:18.529 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.533 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9b527a1e-ee09-449d-ba90-c499af0916a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.554 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[188c4161-0e3f-4861-9bd1-d0f0415965ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.556 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f859c3b1-36b1-4f40-b6d9-52bfbffa0e48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.567 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8969cc-d3f3-4a4f-9720-81f22df32ac9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540205, 'reachable_time': 17064, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236747, 'error': None, 'target': 'ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:18 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc58eaedf\x2d202a\x2d428a\x2dacfb\x2df0b1291517f1.mount: Deactivated successfully.
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.571 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c58eaedf-202a-428a-acfb-f0b1291517f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:58:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:18.572 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[77e9356f-db21-458b-8b16-4f14b43a9c4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:19 np0005603622 nova_compute[221301]: 2026-01-31 07:58:19.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:19.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:19 np0005603622 nova_compute[221301]: 2026-01-31 07:58:19.646 221324 INFO nova.virt.libvirt.driver [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Deleting instance files /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_del#033[00m
Jan 31 02:58:19 np0005603622 nova_compute[221301]: 2026-01-31 07:58:19.647 221324 INFO nova.virt.libvirt.driver [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Deletion of /var/lib/nova/instances/ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0_del complete#033[00m
Jan 31 02:58:19 np0005603622 nova_compute[221301]: 2026-01-31 07:58:19.701 221324 INFO nova.compute.manager [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Took 1.52 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:58:19 np0005603622 nova_compute[221301]: 2026-01-31 07:58:19.702 221324 DEBUG oslo.service.loopingcall [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:58:19 np0005603622 nova_compute[221301]: 2026-01-31 07:58:19.703 221324 DEBUG nova.compute.manager [-] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:58:19 np0005603622 nova_compute[221301]: 2026-01-31 07:58:19.703 221324 DEBUG nova.network.neutron [-] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:58:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:58:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:20.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.521 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.522 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.522 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.523 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.598 221324 DEBUG nova.compute.manager [req-5ead59bb-0b7c-4956-b175-0ed06984881f req-126b4db6-254b-4126-b720-a6f24bdf6270 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received event network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.599 221324 DEBUG oslo_concurrency.lockutils [req-5ead59bb-0b7c-4956-b175-0ed06984881f req-126b4db6-254b-4126-b720-a6f24bdf6270 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.599 221324 DEBUG oslo_concurrency.lockutils [req-5ead59bb-0b7c-4956-b175-0ed06984881f req-126b4db6-254b-4126-b720-a6f24bdf6270 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.599 221324 DEBUG oslo_concurrency.lockutils [req-5ead59bb-0b7c-4956-b175-0ed06984881f req-126b4db6-254b-4126-b720-a6f24bdf6270 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.599 221324 DEBUG nova.compute.manager [req-5ead59bb-0b7c-4956-b175-0ed06984881f req-126b4db6-254b-4126-b720-a6f24bdf6270 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] No waiting events found dispatching network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.599 221324 WARNING nova.compute.manager [req-5ead59bb-0b7c-4956-b175-0ed06984881f req-126b4db6-254b-4126-b720-a6f24bdf6270 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received unexpected event network-vif-plugged-91caa716-ddb3-41b4-a4d3-b18e5c81423b for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:58:20 np0005603622 nova_compute[221301]: 2026-01-31 07:58:20.976 221324 DEBUG nova.network.neutron [-] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.021 221324 INFO nova.compute.manager [-] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Took 1.32 seconds to deallocate network for instance.#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.087 221324 DEBUG nova.compute.manager [req-9adba2fe-1c5d-4fb4-afc4-bf75d95e2ded req-17795275-8f98-48d9-b286-dc522685003c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Received event network-vif-deleted-91caa716-ddb3-41b4-a4d3-b18e5c81423b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.107 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.107 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.155 221324 DEBUG oslo_concurrency.processutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:21.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/642721031' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.589 221324 DEBUG oslo_concurrency.processutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.596 221324 DEBUG nova.compute.provider_tree [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.630 221324 DEBUG nova.scheduler.client.report [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.659 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.695 221324 INFO nova.scheduler.client.report [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Deleted allocations for instance ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0#033[00m
Jan 31 02:58:21 np0005603622 nova_compute[221301]: 2026-01-31 07:58:21.810 221324 DEBUG oslo_concurrency.lockutils [None req-af0bc352-38d8-4802-9f3a-fadfb8b1e062 93973daeb08c453e90372a79b54b9ede 8033316fc42c4926bfd1f8a34b02fa97 - - default default] Lock "ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.430 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:58:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:58:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:22.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.463 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.463 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.463 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.463 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.464 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.684 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3475962907' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:22 np0005603622 nova_compute[221301]: 2026-01-31 07:58:22.910 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.086 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.089 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4820MB free_disk=20.909778594970703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.090 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.090 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.159 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.160 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.191 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:23.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3683975873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.698 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.703 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.722 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.745 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:58:23 np0005603622 nova_compute[221301]: 2026-01-31 07:58:23.745 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:24.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:25.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:26.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:58:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:27.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:58:27 np0005603622 nova_compute[221301]: 2026-01-31 07:58:27.687 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:28 np0005603622 nova_compute[221301]: 2026-01-31 07:58:28.449 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:28.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:29.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:30.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:30.852 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:30.854 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:30.854 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:31.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:32 np0005603622 podman[236947]: 2026-01-31 07:58:32.155549783 +0000 UTC m=+0.074234601 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 02:58:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:58:32 np0005603622 nova_compute[221301]: 2026-01-31 07:58:32.402 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:32.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:32 np0005603622 nova_compute[221301]: 2026-01-31 07:58:32.688 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:33 np0005603622 nova_compute[221301]: 2026-01-31 07:58:33.418 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846298.4173548, ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:33 np0005603622 nova_compute[221301]: 2026-01-31 07:58:33.419 221324 INFO nova.compute.manager [-] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:58:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:58:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:58:33 np0005603622 nova_compute[221301]: 2026-01-31 07:58:33.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:33 np0005603622 nova_compute[221301]: 2026-01-31 07:58:33.452 221324 DEBUG nova.compute.manager [None req-d48d5630-01db-4b5c-b4c7-0a57b78366d7 - - - - - -] [instance: ff3de2be-ce2a-4fa5-9c79-c2a8170d31a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:33.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:34.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:35.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:36.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:37.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:37 np0005603622 nova_compute[221301]: 2026-01-31 07:58:37.690 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.334 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.335 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.369 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.453 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:38.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.576 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.577 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.583 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.583 221324 INFO nova.compute.claims [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:58:38 np0005603622 nova_compute[221301]: 2026-01-31 07:58:38.986 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:39 np0005603622 podman[236976]: 2026-01-31 07:58:39.123866894 +0000 UTC m=+0.042995851 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:58:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3904426248' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.394 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.400 221324 DEBUG nova.compute.provider_tree [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.453 221324 DEBUG nova.scheduler.client.report [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.490 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.913s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.491 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:58:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:39.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.544 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.545 221324 DEBUG nova.network.neutron [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.564 221324 INFO nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.584 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.677 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.678 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.679 221324 INFO nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Creating image(s)#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.702 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image 398d0932-8bb1-453a-b131-005935297a74_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.726 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image 398d0932-8bb1-453a-b131-005935297a74_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.751 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image 398d0932-8bb1-453a-b131-005935297a74_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.754 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.774 221324 DEBUG nova.policy [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0e402088c09448e1a6f0cd61b11e0816', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f31b0319126848a5b8fd9521dc509172', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.804 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.805 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.805 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.806 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.829 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image 398d0932-8bb1-453a-b131-005935297a74_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:39 np0005603622 nova_compute[221301]: 2026-01-31 07:58:39.832 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 398d0932-8bb1-453a-b131-005935297a74_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.240 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 398d0932-8bb1-453a-b131-005935297a74_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:58:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.313 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] resizing rbd image 398d0932-8bb1-453a-b131-005935297a74_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.440 221324 DEBUG nova.objects.instance [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'migration_context' on Instance uuid 398d0932-8bb1-453a-b131-005935297a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.471 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.472 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Ensure instance console log exists: /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.472 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.473 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.473 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:40.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:40 np0005603622 nova_compute[221301]: 2026-01-31 07:58:40.781 221324 DEBUG nova.network.neutron [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Successfully created port: 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:58:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:41.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.680 221324 DEBUG nova.network.neutron [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Successfully updated port: 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.701 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.701 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquired lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.702 221324 DEBUG nova.network.neutron [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.892 221324 DEBUG nova.compute.manager [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-changed-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.893 221324 DEBUG nova.compute.manager [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Refreshing instance network info cache due to event network-changed-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.894 221324 DEBUG oslo_concurrency.lockutils [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:41 np0005603622 nova_compute[221301]: 2026-01-31 07:58:41.963 221324 DEBUG nova.network.neutron [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:58:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:42.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:42 np0005603622 nova_compute[221301]: 2026-01-31 07:58:42.693 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.182 221324 DEBUG nova.network.neutron [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updating instance_info_cache with network_info: [{"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.213 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Releasing lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.214 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Instance network_info: |[{"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.214 221324 DEBUG oslo_concurrency.lockutils [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.214 221324 DEBUG nova.network.neutron [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Refreshing network info cache for port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.217 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Start _get_guest_xml network_info=[{"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.222 221324 WARNING nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.227 221324 DEBUG nova.virt.libvirt.host [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.228 221324 DEBUG nova.virt.libvirt.host [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.233 221324 DEBUG nova.virt.libvirt.host [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.234 221324 DEBUG nova.virt.libvirt.host [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.235 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.235 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.236 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.236 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.236 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.236 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.237 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.237 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.237 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.237 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.238 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.238 221324 DEBUG nova.virt.hardware [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.240 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:43.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:58:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1789651703' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.686 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.723 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image 398d0932-8bb1-453a-b131-005935297a74_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:43 np0005603622 nova_compute[221301]: 2026-01-31 07:58:43.730 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:58:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/370515657' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.195 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.198 221324 DEBUG nova.virt.libvirt.vif [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2040319596',display_name='tempest-SecurityGroupsTestJSON-server-2040319596',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2040319596',id=44,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-grx3dghc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:58:39Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=398d0932-8bb1-453a-b131-005935297a74,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.198 221324 DEBUG nova.network.os_vif_util [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.200 221324 DEBUG nova.network.os_vif_util [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:e4:7e,bridge_name='br-int',has_traffic_filtering=True,id=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9485bf-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.202 221324 DEBUG nova.objects.instance [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'pci_devices' on Instance uuid 398d0932-8bb1-453a-b131-005935297a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.223 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <uuid>398d0932-8bb1-453a-b131-005935297a74</uuid>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <name>instance-0000002c</name>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <nova:name>tempest-SecurityGroupsTestJSON-server-2040319596</nova:name>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:58:43</nova:creationTime>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:user uuid="0e402088c09448e1a6f0cd61b11e0816">tempest-SecurityGroupsTestJSON-648078268-project-member</nova:user>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:project uuid="f31b0319126848a5b8fd9521dc509172">tempest-SecurityGroupsTestJSON-648078268</nova:project>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <nova:port uuid="3c9485bf-c01f-49c4-89bc-8ed14efcd7dd">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <entry name="serial">398d0932-8bb1-453a-b131-005935297a74</entry>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <entry name="uuid">398d0932-8bb1-453a-b131-005935297a74</entry>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/398d0932-8bb1-453a-b131-005935297a74_disk">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/398d0932-8bb1-453a-b131-005935297a74_disk.config">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:9e:e4:7e"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <target dev="tap3c9485bf-c0"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/console.log" append="off"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:58:44 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:58:44 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:58:44 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:58:44 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.224 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Preparing to wait for external event network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.225 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.225 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.226 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.227 221324 DEBUG nova.virt.libvirt.vif [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2040319596',display_name='tempest-SecurityGroupsTestJSON-server-2040319596',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2040319596',id=44,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-grx3dghc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:58:39Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=398d0932-8bb1-453a-b131-005935297a74,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.227 221324 DEBUG nova.network.os_vif_util [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.228 221324 DEBUG nova.network.os_vif_util [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:e4:7e,bridge_name='br-int',has_traffic_filtering=True,id=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9485bf-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.228 221324 DEBUG os_vif [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:e4:7e,bridge_name='br-int',has_traffic_filtering=True,id=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9485bf-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.229 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.230 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.230 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.237 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.238 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c9485bf-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.238 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c9485bf-c0, col_values=(('external_ids', {'iface-id': '3c9485bf-c01f-49c4-89bc-8ed14efcd7dd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:e4:7e', 'vm-uuid': '398d0932-8bb1-453a-b131-005935297a74'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.240 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:44 np0005603622 NetworkManager[49080]: <info>  [1769846324.2414] manager: (tap3c9485bf-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.247 221324 INFO os_vif [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:e4:7e,bridge_name='br-int',has_traffic_filtering=True,id=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9485bf-c0')#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.301 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.301 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.302 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] No VIF found with MAC fa:16:3e:9e:e4:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.302 221324 INFO nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Using config drive#033[00m
Jan 31 02:58:44 np0005603622 nova_compute[221301]: 2026-01-31 07:58:44.340 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image 398d0932-8bb1-453a-b131-005935297a74_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:44.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.032 221324 INFO nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Creating config drive at /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/disk.config#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.037 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpj7rvf0u4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.172 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpj7rvf0u4" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.203 221324 DEBUG nova.storage.rbd_utils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] rbd image 398d0932-8bb1-453a-b131-005935297a74_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.209 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/disk.config 398d0932-8bb1-453a-b131-005935297a74_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.285 221324 DEBUG nova.network.neutron [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updated VIF entry in instance network info cache for port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.286 221324 DEBUG nova.network.neutron [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updating instance_info_cache with network_info: [{"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.302 221324 DEBUG oslo_concurrency.lockutils [req-daaff5bd-09ff-4730-ab95-b8f180afc813 req-be788985-3f6a-44fd-b81a-e749d082981e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.528 221324 DEBUG oslo_concurrency.processutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/disk.config 398d0932-8bb1-453a-b131-005935297a74_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.319s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.529 221324 INFO nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Deleting local config drive /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74/disk.config because it was imported into RBD.#033[00m
Jan 31 02:58:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:45.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:45 np0005603622 kernel: tap3c9485bf-c0: entered promiscuous mode
Jan 31 02:58:45 np0005603622 NetworkManager[49080]: <info>  [1769846325.5925] manager: (tap3c9485bf-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 31 02:58:45 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:45Z|00125|binding|INFO|Claiming lport 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd for this chassis.
Jan 31 02:58:45 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:45Z|00126|binding|INFO|3c9485bf-c01f-49c4-89bc-8ed14efcd7dd: Claiming fa:16:3e:9e:e4:7e 10.100.0.5
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.592 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.595 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.597 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.601 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.608 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:e4:7e 10.100.0.5'], port_security=['fa:16:3e:9e:e4:7e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '398d0932-8bb1-453a-b131-005935297a74', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f31b0319126848a5b8fd9521dc509172', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2814a167-00ed-4304-830b-a99e04552970', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a935fc-1163-43c6-97c6-acf0f9c4194f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.610 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd in datapath 92b7a3d2-99de-4036-b28b-98f77dab6a25 bound to our chassis#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.613 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 92b7a3d2-99de-4036-b28b-98f77dab6a25#033[00m
Jan 31 02:58:45 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:45Z|00127|binding|INFO|Setting lport 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd ovn-installed in OVS
Jan 31 02:58:45 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:45Z|00128|binding|INFO|Setting lport 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd up in Southbound
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.625 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 systemd-machined[190406]: New machine qemu-18-instance-0000002c.
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.631 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[50dbb167-8be2-4ec0-94e1-0746173e75cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.633 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap92b7a3d2-91 in ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.636 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap92b7a3d2-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.636 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ad5cfa-fae4-4238-82ee-29f0e85e3f2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 systemd-udevd[237366]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.637 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[14ccf886-ea6b-4372-a234-3f288d2c8cf8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 systemd[1]: Started Virtual Machine qemu-18-instance-0000002c.
Jan 31 02:58:45 np0005603622 NetworkManager[49080]: <info>  [1769846325.6554] device (tap3c9485bf-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:58:45 np0005603622 NetworkManager[49080]: <info>  [1769846325.6560] device (tap3c9485bf-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.655 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[203870cd-9b73-4482-8845-f02c6b4bb8b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.668 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5511c8-61ef-4f4e-84c3-93e89fa69971]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.695 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1bdc35af-b989-4a5f-a8de-dbe7d228432e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 systemd-udevd[237369]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.699 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74d70cd6-ff98-468c-a19d-216f144f3ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 NetworkManager[49080]: <info>  [1769846325.7005] manager: (tap92b7a3d2-90): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.724746) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325724844, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2684, "num_deletes": 502, "total_data_size": 5806932, "memory_usage": 5886800, "flush_reason": "Manual Compaction"}
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.729 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7f29c776-6a90-4a40-bfc8-8e592d777006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.732 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4e73e483-fb1d-470a-990c-0a854cec1762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 NetworkManager[49080]: <info>  [1769846325.7542] device (tap92b7a3d2-90): carrier: link connected
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.756 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[27d4ffed-f008-4948-947f-d82c3d928f6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325762070, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3656389, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28303, "largest_seqno": 30981, "table_properties": {"data_size": 3646130, "index_size": 5985, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 25689, "raw_average_key_size": 20, "raw_value_size": 3623274, "raw_average_value_size": 2875, "num_data_blocks": 260, "num_entries": 1260, "num_filter_entries": 1260, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846110, "oldest_key_time": 1769846110, "file_creation_time": 1769846325, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 37369 microseconds, and 7062 cpu microseconds.
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.762126) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3656389 bytes OK
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.762150) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.765064) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.765082) EVENT_LOG_v1 {"time_micros": 1769846325765076, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.765108) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5794218, prev total WAL file size 5794218, number of live WAL files 2.
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.766529) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3570KB)], [57(10MB)]
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325766611, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14981873, "oldest_snapshot_seqno": -1}
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.774 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4a9525f0-2caf-40e4-b106-00ffc0d3bfb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92b7a3d2-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:d6:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557709, 'reachable_time': 35488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237398, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.792 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5143ff61-874a-48c4-9b1f-5901298fc72e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:d681'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557709, 'tstamp': 557709}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237399, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.810 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[47fdeb0e-fb6e-4fde-873d-cd5d52ad055d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92b7a3d2-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:d6:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557709, 'reachable_time': 35488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237400, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.841 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1dec68fd-b20d-4309-9fe0-2bd576a1f22a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5501 keys, 9230421 bytes, temperature: kUnknown
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325844825, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 9230421, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9193650, "index_size": 21941, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 140369, "raw_average_key_size": 25, "raw_value_size": 9094809, "raw_average_value_size": 1653, "num_data_blocks": 886, "num_entries": 5501, "num_filter_entries": 5501, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846325, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.845525) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 9230421 bytes
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.847369) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.4 rd, 117.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.8 +0.0 blob) out(8.8 +0.0 blob), read-write-amplify(6.6) write-amplify(2.5) OK, records in: 6516, records dropped: 1015 output_compression: NoCompression
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.847394) EVENT_LOG_v1 {"time_micros": 1769846325847378, "job": 34, "event": "compaction_finished", "compaction_time_micros": 78706, "compaction_time_cpu_micros": 18788, "output_level": 6, "num_output_files": 1, "total_output_size": 9230421, "num_input_records": 6516, "num_output_records": 5501, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325847961, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846325849108, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.766416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.849247) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.849254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.849256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.849257) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-07:58:45.849258) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.902 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc71c33-476c-4d4e-b927-bd1ebf001b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.904 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92b7a3d2-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.904 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.905 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92b7a3d2-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:45 np0005603622 kernel: tap92b7a3d2-90: entered promiscuous mode
Jan 31 02:58:45 np0005603622 NetworkManager[49080]: <info>  [1769846325.9084] manager: (tap92b7a3d2-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.907 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.919 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap92b7a3d2-90, col_values=(('external_ids', {'iface-id': 'b33af60b-01fb-4204-b1d7-f9b1d79e127d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:45 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:45Z|00129|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.920 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.926 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.927 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[25c548e5-cbb7-48bc-aaea-ac9676e1dd9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.929 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-92b7a3d2-99de-4036-b28b-98f77dab6a25
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/92b7a3d2-99de-4036-b28b-98f77dab6a25.pid.haproxy
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 92b7a3d2-99de-4036-b28b-98f77dab6a25
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:58:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:45.929 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'env', 'PROCESS_TAG=haproxy-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/92b7a3d2-99de-4036-b28b-98f77dab6a25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.978 221324 DEBUG nova.compute.manager [req-17d7b69d-cf05-4aa8-a906-12891f5a0e76 req-4e2b77b8-3579-4f10-90f1-02489e90021f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.979 221324 DEBUG oslo_concurrency.lockutils [req-17d7b69d-cf05-4aa8-a906-12891f5a0e76 req-4e2b77b8-3579-4f10-90f1-02489e90021f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.979 221324 DEBUG oslo_concurrency.lockutils [req-17d7b69d-cf05-4aa8-a906-12891f5a0e76 req-4e2b77b8-3579-4f10-90f1-02489e90021f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.980 221324 DEBUG oslo_concurrency.lockutils [req-17d7b69d-cf05-4aa8-a906-12891f5a0e76 req-4e2b77b8-3579-4f10-90f1-02489e90021f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:45 np0005603622 nova_compute[221301]: 2026-01-31 07:58:45.980 221324 DEBUG nova.compute.manager [req-17d7b69d-cf05-4aa8-a906-12891f5a0e76 req-4e2b77b8-3579-4f10-90f1-02489e90021f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Processing event network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.289 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.291 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846326.287692, 398d0932-8bb1-453a-b131-005935297a74 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.292 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] VM Started (Lifecycle Event)#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.296 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.300 221324 INFO nova.virt.libvirt.driver [-] [instance: 398d0932-8bb1-453a-b131-005935297a74] Instance spawned successfully.#033[00m
Jan 31 02:58:46 np0005603622 podman[237473]: 2026-01-31 07:58:46.300505703 +0000 UTC m=+0.055994205 container create 2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.300 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.319 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.325 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.330 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.331 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.331 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.332 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.332 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.332 221324 DEBUG nova.virt.libvirt.driver [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:46 np0005603622 systemd[1]: Started libpod-conmon-2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053.scope.
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.355 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.356 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846326.2897153, 398d0932-8bb1-453a-b131-005935297a74 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.357 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:58:46 np0005603622 podman[237473]: 2026-01-31 07:58:46.269455668 +0000 UTC m=+0.024944190 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:58:46 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:58:46 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1874557af069dd6da6384382d1a0ccf99f45d3a3a1ad8d6dcceddcd2b4f69bcd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:58:46 np0005603622 podman[237473]: 2026-01-31 07:58:46.397131452 +0000 UTC m=+0.152620304 container init 2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.397 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.400 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846326.2959077, 398d0932-8bb1-453a-b131-005935297a74 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.401 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:58:46 np0005603622 podman[237473]: 2026-01-31 07:58:46.403267999 +0000 UTC m=+0.158756501 container start 2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 02:58:46 np0005603622 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[237489]: [NOTICE]   (237493) : New worker (237495) forked
Jan 31 02:58:46 np0005603622 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[237489]: [NOTICE]   (237493) : Loading success.
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.423 221324 INFO nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Took 6.75 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.424 221324 DEBUG nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.426 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.433 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.472 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:58:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:46.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.534 221324 INFO nova.compute.manager [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Took 8.08 seconds to build instance.#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.556 221324 DEBUG oslo_concurrency.lockutils [None req-15c12442-a434-4613-bfde-081f2d1dbaeb 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.595 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.596 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.612 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.680 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.680 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.687 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.688 221324 INFO nova.compute.claims [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 02:58:46 np0005603622 nova_compute[221301]: 2026-01-31 07:58:46.798 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:58:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2488816792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.238 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.242 221324 DEBUG nova.compute.provider_tree [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.268 221324 DEBUG nova.scheduler.client.report [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.312 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.313 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.380 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.381 221324 DEBUG nova.network.neutron [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.402 221324 INFO nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.420 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.521 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.524 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.524 221324 INFO nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Creating image(s)#033[00m
Jan 31 02:58:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:47.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.554 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.585 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.612 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.616 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.638 221324 DEBUG nova.policy [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e5b93162787e405080a5a790c1847434', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dbf6b6306ca449dfb064371ec88681f5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.677 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.678 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.678 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.678 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.704 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.708 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:47 np0005603622 nova_compute[221301]: 2026-01-31 07:58:47.725 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.018 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.310s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.081 221324 DEBUG nova.compute.manager [req-8509205c-d2b8-4758-ae0a-374924f72e82 req-a034af88-1f86-4938-95b2-3a9a6147c80c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.082 221324 DEBUG oslo_concurrency.lockutils [req-8509205c-d2b8-4758-ae0a-374924f72e82 req-a034af88-1f86-4938-95b2-3a9a6147c80c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.082 221324 DEBUG oslo_concurrency.lockutils [req-8509205c-d2b8-4758-ae0a-374924f72e82 req-a034af88-1f86-4938-95b2-3a9a6147c80c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.082 221324 DEBUG oslo_concurrency.lockutils [req-8509205c-d2b8-4758-ae0a-374924f72e82 req-a034af88-1f86-4938-95b2-3a9a6147c80c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.082 221324 DEBUG nova.compute.manager [req-8509205c-d2b8-4758-ae0a-374924f72e82 req-a034af88-1f86-4938-95b2-3a9a6147c80c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] No waiting events found dispatching network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.082 221324 WARNING nova.compute.manager [req-8509205c-d2b8-4758-ae0a-374924f72e82 req-a034af88-1f86-4938-95b2-3a9a6147c80c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received unexpected event network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd for instance with vm_state active and task_state None.#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.087 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] resizing rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.197 221324 DEBUG nova.objects.instance [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lazy-loading 'migration_context' on Instance uuid 3fa092d5-409e-4eee-a7dd-993869d5e7d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.215 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.215 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Ensure instance console log exists: /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.216 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.218 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:48 np0005603622 nova_compute[221301]: 2026-01-31 07:58:48.219 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:48.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:49 np0005603622 nova_compute[221301]: 2026-01-31 07:58:49.241 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:49 np0005603622 nova_compute[221301]: 2026-01-31 07:58:49.351 221324 DEBUG nova.network.neutron [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Successfully created port: 01f0ddb0-1553-44cd-9a70-560dd3fbb08d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 02:58:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:49.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.150 221324 DEBUG nova.compute.manager [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-changed-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.151 221324 DEBUG nova.compute.manager [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Refreshing instance network info cache due to event network-changed-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.151 221324 DEBUG oslo_concurrency.lockutils [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.151 221324 DEBUG oslo_concurrency.lockutils [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.151 221324 DEBUG nova.network.neutron [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Refreshing network info cache for port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:58:50 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.440 221324 DEBUG nova.network.neutron [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Successfully updated port: 01f0ddb0-1553-44cd-9a70-560dd3fbb08d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.483 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.484 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquired lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.484 221324 DEBUG nova.network.neutron [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 02:58:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:50.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.561 221324 DEBUG nova.compute.manager [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-changed-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.562 221324 DEBUG nova.compute.manager [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Refreshing instance network info cache due to event network-changed-01f0ddb0-1553-44cd-9a70-560dd3fbb08d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.562 221324 DEBUG oslo_concurrency.lockutils [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:50 np0005603622 nova_compute[221301]: 2026-01-31 07:58:50.687 221324 DEBUG nova.network.neutron [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 02:58:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:58:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:51.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:58:51 np0005603622 nova_compute[221301]: 2026-01-31 07:58:51.559 221324 DEBUG nova.network.neutron [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updated VIF entry in instance network info cache for port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:58:51 np0005603622 nova_compute[221301]: 2026-01-31 07:58:51.560 221324 DEBUG nova.network.neutron [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updating instance_info_cache with network_info: [{"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:51 np0005603622 nova_compute[221301]: 2026-01-31 07:58:51.626 221324 DEBUG oslo_concurrency.lockutils [req-4a93d530-1e9b-4977-a38a-755113585b3f req-2e2cb954-09a4-4af0-9b2a-2fc8c8c8a257 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.084 221324 DEBUG nova.network.neutron [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updating instance_info_cache with network_info: [{"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.119 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Releasing lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.119 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Instance network_info: |[{"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.120 221324 DEBUG oslo_concurrency.lockutils [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.120 221324 DEBUG nova.network.neutron [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Refreshing network info cache for port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.123 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Start _get_guest_xml network_info=[{"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.128 221324 WARNING nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.135 221324 DEBUG nova.virt.libvirt.host [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.136 221324 DEBUG nova.virt.libvirt.host [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.140 221324 DEBUG nova.virt.libvirt.host [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.141 221324 DEBUG nova.virt.libvirt.host [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.142 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.143 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.143 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.143 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.144 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.144 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.144 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.145 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.145 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.145 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.145 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.145 221324 DEBUG nova.virt.hardware [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.148 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:52.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:58:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3804718627' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.601 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.634 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.644 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.674 221324 DEBUG nova.compute.manager [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-changed-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.674 221324 DEBUG nova.compute.manager [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Refreshing instance network info cache due to event network-changed-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.674 221324 DEBUG oslo_concurrency.lockutils [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.675 221324 DEBUG oslo_concurrency.lockutils [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.675 221324 DEBUG nova.network.neutron [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Refreshing network info cache for port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:58:52 np0005603622 nova_compute[221301]: 2026-01-31 07:58:52.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 02:58:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/812192320' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.069 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.071 221324 DEBUG nova.virt.libvirt.vif [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1051226512',display_name='tempest-FloatingIPsAssociationTestJSON-server-1051226512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1051226512',id=46,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dbf6b6306ca449dfb064371ec88681f5',ramdisk_id='',reservation_id='r-js219900',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-338180924',owner_user_name='tempest-FloatingIPsAssociationTestJSON-338180924-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:58:47Z,user_data=None,user_id='e5b93162787e405080a5a790c1847434',uuid=3fa092d5-409e-4eee-a7dd-993869d5e7d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.071 221324 DEBUG nova.network.os_vif_util [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Converting VIF {"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.073 221324 DEBUG nova.network.os_vif_util [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:94:47,bridge_name='br-int',has_traffic_filtering=True,id=01f0ddb0-1553-44cd-9a70-560dd3fbb08d,network=Network(c28cb14d-420d-4734-af33-c452602f84f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f0ddb0-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.075 221324 DEBUG nova.objects.instance [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3fa092d5-409e-4eee-a7dd-993869d5e7d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.102 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] End _get_guest_xml xml=<domain type="kvm">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <uuid>3fa092d5-409e-4eee-a7dd-993869d5e7d2</uuid>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <name>instance-0000002e</name>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1051226512</nova:name>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 07:58:52</nova:creationTime>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:user uuid="e5b93162787e405080a5a790c1847434">tempest-FloatingIPsAssociationTestJSON-338180924-project-member</nova:user>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:project uuid="dbf6b6306ca449dfb064371ec88681f5">tempest-FloatingIPsAssociationTestJSON-338180924</nova:project>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <nova:port uuid="01f0ddb0-1553-44cd-9a70-560dd3fbb08d">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <system>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <entry name="serial">3fa092d5-409e-4eee-a7dd-993869d5e7d2</entry>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <entry name="uuid">3fa092d5-409e-4eee-a7dd-993869d5e7d2</entry>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </system>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <os>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  </os>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <features>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  </features>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  </clock>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  <devices>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk.config">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      </source>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      </auth>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </disk>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:82:94:47"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <target dev="tap01f0ddb0-15"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </interface>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/console.log" append="off"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </serial>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <video>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </video>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </rng>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 02:58:53 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 02:58:53 np0005603622 nova_compute[221301]:  </devices>
Jan 31 02:58:53 np0005603622 nova_compute[221301]: </domain>
Jan 31 02:58:53 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.103 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Preparing to wait for external event network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.104 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.104 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.104 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.105 221324 DEBUG nova.virt.libvirt.vif [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T07:58:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1051226512',display_name='tempest-FloatingIPsAssociationTestJSON-server-1051226512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1051226512',id=46,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dbf6b6306ca449dfb064371ec88681f5',ramdisk_id='',reservation_id='r-js219900',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-338180924'
,owner_user_name='tempest-FloatingIPsAssociationTestJSON-338180924-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T07:58:47Z,user_data=None,user_id='e5b93162787e405080a5a790c1847434',uuid=3fa092d5-409e-4eee-a7dd-993869d5e7d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.105 221324 DEBUG nova.network.os_vif_util [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Converting VIF {"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.106 221324 DEBUG nova.network.os_vif_util [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:94:47,bridge_name='br-int',has_traffic_filtering=True,id=01f0ddb0-1553-44cd-9a70-560dd3fbb08d,network=Network(c28cb14d-420d-4734-af33-c452602f84f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f0ddb0-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.106 221324 DEBUG os_vif [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:94:47,bridge_name='br-int',has_traffic_filtering=True,id=01f0ddb0-1553-44cd-9a70-560dd3fbb08d,network=Network(c28cb14d-420d-4734-af33-c452602f84f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f0ddb0-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.107 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.107 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.108 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.114 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01f0ddb0-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.114 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01f0ddb0-15, col_values=(('external_ids', {'iface-id': '01f0ddb0-1553-44cd-9a70-560dd3fbb08d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:94:47', 'vm-uuid': '3fa092d5-409e-4eee-a7dd-993869d5e7d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:53 np0005603622 NetworkManager[49080]: <info>  [1769846333.1186] manager: (tap01f0ddb0-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.119 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.124 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.125 221324 INFO os_vif [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:94:47,bridge_name='br-int',has_traffic_filtering=True,id=01f0ddb0-1553-44cd-9a70-560dd3fbb08d,network=Network(c28cb14d-420d-4734-af33-c452602f84f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f0ddb0-15')#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.193 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.193 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.194 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] No VIF found with MAC fa:16:3e:82:94:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.195 221324 INFO nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Using config drive#033[00m
Jan 31 02:58:53 np0005603622 nova_compute[221301]: 2026-01-31 07:58:53.235 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:53.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.225 221324 INFO nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Creating config drive at /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/disk.config#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.228 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpi9wy2w_z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.352 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpi9wy2w_z" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.402 221324 DEBUG nova.storage.rbd_utils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] rbd image 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.409 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/disk.config 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:58:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:58:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:54.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.571 221324 DEBUG oslo_concurrency.processutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/disk.config 3fa092d5-409e-4eee-a7dd-993869d5e7d2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.572 221324 INFO nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Deleting local config drive /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2/disk.config because it was imported into RBD.#033[00m
Jan 31 02:58:54 np0005603622 kernel: tap01f0ddb0-15: entered promiscuous mode
Jan 31 02:58:54 np0005603622 NetworkManager[49080]: <info>  [1769846334.6284] manager: (tap01f0ddb0-15): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.626 221324 DEBUG nova.network.neutron [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updated VIF entry in instance network info cache for port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.628 221324 DEBUG nova.network.neutron [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updating instance_info_cache with network_info: [{"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.630 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:54Z|00130|binding|INFO|Claiming lport 01f0ddb0-1553-44cd-9a70-560dd3fbb08d for this chassis.
Jan 31 02:58:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:54Z|00131|binding|INFO|01f0ddb0-1553-44cd-9a70-560dd3fbb08d: Claiming fa:16:3e:82:94:47 10.100.0.12
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.648 221324 DEBUG oslo_concurrency.lockutils [req-db80b9d3-116c-40f4-aa1d-9dec23840b83 req-910ffc0e-2ddf-4523-b379-ef0ab75493e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.650 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:94:47 10.100.0.12'], port_security=['fa:16:3e:82:94:47 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3fa092d5-409e-4eee-a7dd-993869d5e7d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c28cb14d-420d-4734-af33-c452602f84f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbf6b6306ca449dfb064371ec88681f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8f2274c0-e32f-4e92-b2d3-ffc323775fce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01325513-d5d2-4c4f-b0c6-fdb3a77ce88f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=01f0ddb0-1553-44cd-9a70-560dd3fbb08d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.653 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d in datapath c28cb14d-420d-4734-af33-c452602f84f5 bound to our chassis#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.655 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c28cb14d-420d-4734-af33-c452602f84f5#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.663 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:54Z|00132|binding|INFO|Setting lport 01f0ddb0-1553-44cd-9a70-560dd3fbb08d ovn-installed in OVS
Jan 31 02:58:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:54Z|00133|binding|INFO|Setting lport 01f0ddb0-1553-44cd-9a70-560dd3fbb08d up in Southbound
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.668 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e42d3cf8-9083-4789-a9fd-525d7d907952]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.671 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc28cb14d-41 in ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 02:58:54 np0005603622 systemd-udevd[237828]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.676 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc28cb14d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.676 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bc45e5cd-334b-4b08-b815-fe44859e104f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.677 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3cf708-7a84-41a0-924f-03af263a382e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 systemd-machined[190406]: New machine qemu-19-instance-0000002e.
Jan 31 02:58:54 np0005603622 NetworkManager[49080]: <info>  [1769846334.6906] device (tap01f0ddb0-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 02:58:54 np0005603622 NetworkManager[49080]: <info>  [1769846334.6919] device (tap01f0ddb0-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.689 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5b667f-1035-46d1-903f-7512f6f8c7c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 systemd[1]: Started Virtual Machine qemu-19-instance-0000002e.
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.701 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd1752e-8cc3-42b2-be7d-ccf25f020033]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.730 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d08c9715-7d41-45ee-be46-50eb56700d15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.735 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b71dd286-0187-4f4e-9b06-b27d14e70140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 NetworkManager[49080]: <info>  [1769846334.7364] manager: (tapc28cb14d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/67)
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.739 221324 DEBUG nova.network.neutron [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updated VIF entry in instance network info cache for port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.739 221324 DEBUG nova.network.neutron [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updating instance_info_cache with network_info: [{"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.759 221324 DEBUG oslo_concurrency.lockutils [req-67cfbbd6-d58a-4c03-b085-f7b5eb29fd87 req-4aaece09-a7a5-49ee-aaff-78d67262acc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.765 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[874bf511-b4da-4702-8dd8-1072c82209da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.770 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f7af0a64-d5b5-4279-92d4-1d49079b8db5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 NetworkManager[49080]: <info>  [1769846334.7923] device (tapc28cb14d-40): carrier: link connected
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.796 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[172a100b-de44-4248-9446-8d3ee20efb81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.810 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[afa1a60d-0bf2-4ad6-8957-25c2651f75c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc28cb14d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:59:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558613, 'reachable_time': 19852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237861, 'error': None, 'target': 'ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.824 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1494bbb8-e7cb-4c41-8d57-da0966a52c96]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:594f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 558613, 'tstamp': 558613}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237862, 'error': None, 'target': 'ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.837 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8a977c20-230c-45eb-ab96-5e195ab51d34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc28cb14d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:59:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558613, 'reachable_time': 19852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237863, 'error': None, 'target': 'ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.866 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4a47c1-6c5d-4aa4-b4ac-a33441ddea87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.913 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bea058cb-f3b7-4848-9e01-ba7e98f8f6bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.915 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc28cb14d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.916 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.916 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc28cb14d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.959 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:54 np0005603622 NetworkManager[49080]: <info>  [1769846334.9607] manager: (tapc28cb14d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 31 02:58:54 np0005603622 kernel: tapc28cb14d-40: entered promiscuous mode
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.963 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.965 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc28cb14d-40, col_values=(('external_ids', {'iface-id': 'fca56836-0662-40f0-bef3-510bc5f598ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.966 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.967 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c28cb14d-420d-4734-af33-c452602f84f5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c28cb14d-420d-4734-af33-c452602f84f5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 02:58:54 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:54Z|00134|binding|INFO|Releasing lport fca56836-0662-40f0-bef3-510bc5f598ce from this chassis (sb_readonly=0)
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.968 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f7612559-7106-48ec-bfa7-56d991930d2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.969 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c28cb14d-420d-4734-af33-c452602f84f5
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c28cb14d-420d-4734-af33-c452602f84f5.pid.haproxy
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c28cb14d-420d-4734-af33-c452602f84f5
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 02:58:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:58:54.970 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5', 'env', 'PROCESS_TAG=haproxy-c28cb14d-420d-4734-af33-c452602f84f5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c28cb14d-420d-4734-af33-c452602f84f5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 02:58:54 np0005603622 nova_compute[221301]: 2026-01-31 07:58:54.972 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.135 221324 DEBUG nova.compute.manager [req-ebd7022d-fc8c-4e82-801e-03d3054d6c23 req-8484cdec-4498-4634-a932-a384c7e2779a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.136 221324 DEBUG oslo_concurrency.lockutils [req-ebd7022d-fc8c-4e82-801e-03d3054d6c23 req-8484cdec-4498-4634-a932-a384c7e2779a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.136 221324 DEBUG oslo_concurrency.lockutils [req-ebd7022d-fc8c-4e82-801e-03d3054d6c23 req-8484cdec-4498-4634-a932-a384c7e2779a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.137 221324 DEBUG oslo_concurrency.lockutils [req-ebd7022d-fc8c-4e82-801e-03d3054d6c23 req-8484cdec-4498-4634-a932-a384c7e2779a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.137 221324 DEBUG nova.compute.manager [req-ebd7022d-fc8c-4e82-801e-03d3054d6c23 req-8484cdec-4498-4634-a932-a384c7e2779a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Processing event network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 02:58:55 np0005603622 podman[237931]: 2026-01-31 07:58:55.371711951 +0000 UTC m=+0.078952519 container create 6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.394 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846335.3939254, 3fa092d5-409e-4eee-a7dd-993869d5e7d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.394 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] VM Started (Lifecycle Event)#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.396 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.401 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.406 221324 INFO nova.virt.libvirt.driver [-] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Instance spawned successfully.#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.407 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 02:58:55 np0005603622 systemd[1]: Started libpod-conmon-6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a.scope.
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.416 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:55 np0005603622 podman[237931]: 2026-01-31 07:58:55.321707681 +0000 UTC m=+0.028948249 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.426 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:58:55 np0005603622 systemd[1]: Started libcrun container.
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.439 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.440 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.441 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.442 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.442 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:55 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/439119f8191bc70992b108e1d87e79f04e16f6bf7afe8409a3b24f539b7c5e6e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.443 221324 DEBUG nova.virt.libvirt.driver [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.448 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.449 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846335.3940258, 3fa092d5-409e-4eee-a7dd-993869d5e7d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.449 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] VM Paused (Lifecycle Event)#033[00m
Jan 31 02:58:55 np0005603622 podman[237931]: 2026-01-31 07:58:55.457606459 +0000 UTC m=+0.164847047 container init 6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:58:55 np0005603622 podman[237931]: 2026-01-31 07:58:55.462228354 +0000 UTC m=+0.169468922 container start 6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 02:58:55 np0005603622 neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5[237950]: [NOTICE]   (237956) : New worker (237959) forked
Jan 31 02:58:55 np0005603622 neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5[237950]: [NOTICE]   (237956) : Loading success.
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.493 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.498 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846335.4003916, 3fa092d5-409e-4eee-a7dd-993869d5e7d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.498 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] VM Resumed (Lifecycle Event)#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.511 221324 INFO nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Took 7.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.511 221324 DEBUG nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.521 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.526 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 02:58:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:55.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.560 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.583 221324 INFO nova.compute.manager [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Took 8.93 seconds to build instance.#033[00m
Jan 31 02:58:55 np0005603622 nova_compute[221301]: 2026-01-31 07:58:55.605 221324 DEBUG oslo_concurrency.lockutils [None req-01ed6581-600a-40a3-a39f-b92d0adcfa51 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:56.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:57 np0005603622 nova_compute[221301]: 2026-01-31 07:58:57.256 221324 DEBUG nova.compute.manager [req-6d787436-c375-419f-af56-51db46fa4e64 req-d0be1e35-1b49-4f2a-a45f-e278ee90020a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:58:57 np0005603622 nova_compute[221301]: 2026-01-31 07:58:57.257 221324 DEBUG oslo_concurrency.lockutils [req-6d787436-c375-419f-af56-51db46fa4e64 req-d0be1e35-1b49-4f2a-a45f-e278ee90020a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:58:57 np0005603622 nova_compute[221301]: 2026-01-31 07:58:57.257 221324 DEBUG oslo_concurrency.lockutils [req-6d787436-c375-419f-af56-51db46fa4e64 req-d0be1e35-1b49-4f2a-a45f-e278ee90020a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:58:57 np0005603622 nova_compute[221301]: 2026-01-31 07:58:57.258 221324 DEBUG oslo_concurrency.lockutils [req-6d787436-c375-419f-af56-51db46fa4e64 req-d0be1e35-1b49-4f2a-a45f-e278ee90020a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:58:57 np0005603622 nova_compute[221301]: 2026-01-31 07:58:57.258 221324 DEBUG nova.compute.manager [req-6d787436-c375-419f-af56-51db46fa4e64 req-d0be1e35-1b49-4f2a-a45f-e278ee90020a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] No waiting events found dispatching network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:58:57 np0005603622 nova_compute[221301]: 2026-01-31 07:58:57.259 221324 WARNING nova.compute.manager [req-6d787436-c375-419f-af56-51db46fa4e64 req-d0be1e35-1b49-4f2a-a45f-e278ee90020a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received unexpected event network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d for instance with vm_state active and task_state None.#033[00m
Jan 31 02:58:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:58:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:57.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:58:57 np0005603622 nova_compute[221301]: 2026-01-31 07:58:57.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:58 np0005603622 nova_compute[221301]: 2026-01-31 07:58:58.116 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:58 np0005603622 NetworkManager[49080]: <info>  [1769846338.2873] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 31 02:58:58 np0005603622 NetworkManager[49080]: <info>  [1769846338.2890] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 31 02:58:58 np0005603622 nova_compute[221301]: 2026-01-31 07:58:58.293 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:58 np0005603622 nova_compute[221301]: 2026-01-31 07:58:58.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:58 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:58Z|00135|binding|INFO|Releasing lport fca56836-0662-40f0-bef3-510bc5f598ce from this chassis (sb_readonly=0)
Jan 31 02:58:58 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:58Z|00136|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:58:58 np0005603622 nova_compute[221301]: 2026-01-31 07:58:58.345 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:58:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 02:58:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:58:58.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 02:58:59 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:59Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:e4:7e 10.100.0.5
Jan 31 02:58:59 np0005603622 ovn_controller[130109]: 2026-01-31T07:58:59Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:e4:7e 10.100.0.5
Jan 31 02:58:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:58:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:58:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:58:59.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:58:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:00.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:01.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:02.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:02 np0005603622 nova_compute[221301]: 2026-01-31 07:59:02.734 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:03 np0005603622 nova_compute[221301]: 2026-01-31 07:59:03.117 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:03 np0005603622 podman[237969]: 2026-01-31 07:59:03.16758566 +0000 UTC m=+0.089491926 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 31 02:59:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:03.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:59:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:04.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:59:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:05.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 02:59:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:06.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 02:59:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:07.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:07 np0005603622 nova_compute[221301]: 2026-01-31 07:59:07.765 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:08 np0005603622 nova_compute[221301]: 2026-01-31 07:59:08.119 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:08.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:08 np0005603622 ovn_controller[130109]: 2026-01-31T07:59:08Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:94:47 10.100.0.12
Jan 31 02:59:08 np0005603622 ovn_controller[130109]: 2026-01-31T07:59:08Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:94:47 10.100.0.12
Jan 31 02:59:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:09.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e161 e161: 3 total, 3 up, 3 in
Jan 31 02:59:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:10 np0005603622 podman[237998]: 2026-01-31 07:59:10.160558981 +0000 UTC m=+0.066740206 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 02:59:10 np0005603622 nova_compute[221301]: 2026-01-31 07:59:10.480 221324 DEBUG nova.compute.manager [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-changed-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:10 np0005603622 nova_compute[221301]: 2026-01-31 07:59:10.481 221324 DEBUG nova.compute.manager [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Refreshing instance network info cache due to event network-changed-01f0ddb0-1553-44cd-9a70-560dd3fbb08d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:59:10 np0005603622 nova_compute[221301]: 2026-01-31 07:59:10.482 221324 DEBUG oslo_concurrency.lockutils [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:10 np0005603622 nova_compute[221301]: 2026-01-31 07:59:10.482 221324 DEBUG oslo_concurrency.lockutils [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:10 np0005603622 nova_compute[221301]: 2026-01-31 07:59:10.482 221324 DEBUG nova.network.neutron [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Refreshing network info cache for port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:59:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:10.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e162 e162: 3 total, 3 up, 3 in
Jan 31 02:59:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:11.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e163 e163: 3 total, 3 up, 3 in
Jan 31 02:59:12 np0005603622 nova_compute[221301]: 2026-01-31 07:59:12.467 221324 DEBUG nova.network.neutron [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updated VIF entry in instance network info cache for port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:59:12 np0005603622 nova_compute[221301]: 2026-01-31 07:59:12.468 221324 DEBUG nova.network.neutron [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updating instance_info_cache with network_info: [{"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:12.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:12 np0005603622 nova_compute[221301]: 2026-01-31 07:59:12.768 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:12 np0005603622 nova_compute[221301]: 2026-01-31 07:59:12.779 221324 DEBUG oslo_concurrency.lockutils [req-e35119a1-11f0-4f9e-8e00-7561b01742b1 req-f8eb6a31-d1bd-4893-803b-118c41623f78 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:13 np0005603622 nova_compute[221301]: 2026-01-31 07:59:13.120 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:13.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:14.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:14 np0005603622 nova_compute[221301]: 2026-01-31 07:59:14.713 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:14 np0005603622 nova_compute[221301]: 2026-01-31 07:59:14.715 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 02:59:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:15.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e164 e164: 3 total, 3 up, 3 in
Jan 31 02:59:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:16.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:17.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:17 np0005603622 nova_compute[221301]: 2026-01-31 07:59:17.769 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:18 np0005603622 nova_compute[221301]: 2026-01-31 07:59:18.122 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:18 np0005603622 nova_compute[221301]: 2026-01-31 07:59:18.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:18 np0005603622 nova_compute[221301]: 2026-01-31 07:59:18.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:18.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:19 np0005603622 nova_compute[221301]: 2026-01-31 07:59:19.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:59:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:19.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:59:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:20.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.322 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:21.322 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:21.325 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 02:59:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:21.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.657 221324 DEBUG nova.compute.manager [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-changed-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.657 221324 DEBUG nova.compute.manager [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Refreshing instance network info cache due to event network-changed-01f0ddb0-1553-44cd-9a70-560dd3fbb08d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.657 221324 DEBUG oslo_concurrency.lockutils [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.658 221324 DEBUG oslo_concurrency.lockutils [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.658 221324 DEBUG nova.network.neutron [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Refreshing network info cache for port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.719 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.720 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.720 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 02:59:21 np0005603622 nova_compute[221301]: 2026-01-31 07:59:21.720 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 398d0932-8bb1-453a-b131-005935297a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:22.328 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:22.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:22 np0005603622 nova_compute[221301]: 2026-01-31 07:59:22.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.123 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.265 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.265 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.266 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.266 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.266 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.268 221324 INFO nova.compute.manager [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Terminating instance#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.269 221324 DEBUG nova.compute.manager [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 02:59:23 np0005603622 kernel: tap01f0ddb0-15 (unregistering): left promiscuous mode
Jan 31 02:59:23 np0005603622 NetworkManager[49080]: <info>  [1769846363.3216] device (tap01f0ddb0-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 02:59:23 np0005603622 systemd[1]: Starting dnf makecache...
Jan 31 02:59:23 np0005603622 ovn_controller[130109]: 2026-01-31T07:59:23Z|00137|binding|INFO|Releasing lport 01f0ddb0-1553-44cd-9a70-560dd3fbb08d from this chassis (sb_readonly=0)
Jan 31 02:59:23 np0005603622 ovn_controller[130109]: 2026-01-31T07:59:23Z|00138|binding|INFO|Setting lport 01f0ddb0-1553-44cd-9a70-560dd3fbb08d down in Southbound
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.341 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 ovn_controller[130109]: 2026-01-31T07:59:23Z|00139|binding|INFO|Removing iface tap01f0ddb0-15 ovn-installed in OVS
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.343 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.356 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.354 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:94:47 10.100.0.12'], port_security=['fa:16:3e:82:94:47 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3fa092d5-409e-4eee-a7dd-993869d5e7d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c28cb14d-420d-4734-af33-c452602f84f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbf6b6306ca449dfb064371ec88681f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8f2274c0-e32f-4e92-b2d3-ffc323775fce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01325513-d5d2-4c4f-b0c6-fdb3a77ce88f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=01f0ddb0-1553-44cd-9a70-560dd3fbb08d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.357 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d in datapath c28cb14d-420d-4734-af33-c452602f84f5 unbound from our chassis#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.358 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c28cb14d-420d-4734-af33-c452602f84f5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.360 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[297b2c61-2fdb-4fc0-8765-c911c431bf93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.362 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5 namespace which is not needed anymore#033[00m
Jan 31 02:59:23 np0005603622 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 31 02:59:23 np0005603622 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002e.scope: Consumed 14.100s CPU time.
Jan 31 02:59:23 np0005603622 systemd-machined[190406]: Machine qemu-19-instance-0000002e terminated.
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.497 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.508 221324 INFO nova.virt.libvirt.driver [-] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Instance destroyed successfully.#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.509 221324 DEBUG nova.objects.instance [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lazy-loading 'resources' on Instance uuid 3fa092d5-409e-4eee-a7dd-993869d5e7d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 02:59:23 np0005603622 neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5[237950]: [NOTICE]   (237956) : haproxy version is 2.8.14-c23fe91
Jan 31 02:59:23 np0005603622 neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5[237950]: [NOTICE]   (237956) : path to executable is /usr/sbin/haproxy
Jan 31 02:59:23 np0005603622 neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5[237950]: [WARNING]  (237956) : Exiting Master process...
Jan 31 02:59:23 np0005603622 neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5[237950]: [ALERT]    (237956) : Current worker (237959) exited with code 143 (Terminated)
Jan 31 02:59:23 np0005603622 neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5[237950]: [WARNING]  (237956) : All workers exited. Exiting... (0)
Jan 31 02:59:23 np0005603622 systemd[1]: libpod-6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a.scope: Deactivated successfully.
Jan 31 02:59:23 np0005603622 podman[238043]: 2026-01-31 07:59:23.521726079 +0000 UTC m=+0.054965116 container died 6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 02:59:23 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a-userdata-shm.mount: Deactivated successfully.
Jan 31 02:59:23 np0005603622 systemd[1]: var-lib-containers-storage-overlay-439119f8191bc70992b108e1d87e79f04e16f6bf7afe8409a3b24f539b7c5e6e-merged.mount: Deactivated successfully.
Jan 31 02:59:23 np0005603622 podman[238043]: 2026-01-31 07:59:23.566763965 +0000 UTC m=+0.100003012 container cleanup 6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 02:59:23 np0005603622 systemd[1]: libpod-conmon-6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a.scope: Deactivated successfully.
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.588 221324 DEBUG nova.virt.libvirt.vif [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:58:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1051226512',display_name='tempest-FloatingIPsAssociationTestJSON-server-1051226512',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1051226512',id=46,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:58:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dbf6b6306ca449dfb064371ec88681f5',ramdisk_id='',reservation_id='r-js219900',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-338180924',owner_user_name='tempest-FloatingIPsAssociationTestJSON-338180924-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:58:55Z,user_data=None,user_id='e5b93162787e405080a5a790c1847434',uuid=3fa092d5-409e-4eee-a7dd-993869d5e7d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.588 221324 DEBUG nova.network.os_vif_util [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Converting VIF {"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.589 221324 DEBUG nova.network.os_vif_util [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:94:47,bridge_name='br-int',has_traffic_filtering=True,id=01f0ddb0-1553-44cd-9a70-560dd3fbb08d,network=Network(c28cb14d-420d-4734-af33-c452602f84f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f0ddb0-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.590 221324 DEBUG os_vif [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:94:47,bridge_name='br-int',has_traffic_filtering=True,id=01f0ddb0-1553-44cd-9a70-560dd3fbb08d,network=Network(c28cb14d-420d-4734-af33-c452602f84f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f0ddb0-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 02:59:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:23.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.594 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.594 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01f0ddb0-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.598 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.599 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.603 221324 INFO os_vif [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:94:47,bridge_name='br-int',has_traffic_filtering=True,id=01f0ddb0-1553-44cd-9a70-560dd3fbb08d,network=Network(c28cb14d-420d-4734-af33-c452602f84f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01f0ddb0-15')#033[00m
Jan 31 02:59:23 np0005603622 dnf[238019]: Metadata cache refreshed recently.
Jan 31 02:59:23 np0005603622 podman[238081]: 2026-01-31 07:59:23.640706127 +0000 UTC m=+0.053243090 container remove 6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.647 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7b2cdc81-7cbb-48ef-9810-dddfa41aaf6f]: (4, ('Sat Jan 31 07:59:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5 (6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a)\n6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a\nSat Jan 31 07:59:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5 (6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a)\n6073446357cdd7ef87e633223257129878d085f37d45398b3c0c3bb0dc73c19a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.649 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ef3630-7790-4c2f-953e-aea81a73f095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.650 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc28cb14d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 02:59:23 np0005603622 kernel: tapc28cb14d-40: left promiscuous mode
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.659 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.661 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39781152-2e03-40cb-b364-8829b2b67f1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.678 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3b90e264-c472-429f-86f2-8de6d29a7911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.680 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1c60eaa6-bc95-497c-9769-c57cb9af896a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.693 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c4450f82-2c6a-4e13-a9ee-a06d87317ef4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558606, 'reachable_time': 20599, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238114, 'error': None, 'target': 'ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc28cb14d\x2d420d\x2d4734\x2daf33\x2dc452602f84f5.mount: Deactivated successfully.
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.699 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c28cb14d-420d-4734-af33-c452602f84f5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.699 221324 DEBUG nova.compute.manager [req-e1a112f4-2ec7-42cd-9367-9a6e677e3446 req-b70e61fd-ca43-48da-91cf-f60f0830d94e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-vif-unplugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.700 221324 DEBUG oslo_concurrency.lockutils [req-e1a112f4-2ec7-42cd-9367-9a6e677e3446 req-b70e61fd-ca43-48da-91cf-f60f0830d94e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.700 221324 DEBUG oslo_concurrency.lockutils [req-e1a112f4-2ec7-42cd-9367-9a6e677e3446 req-b70e61fd-ca43-48da-91cf-f60f0830d94e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.700 221324 DEBUG oslo_concurrency.lockutils [req-e1a112f4-2ec7-42cd-9367-9a6e677e3446 req-b70e61fd-ca43-48da-91cf-f60f0830d94e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:23.699 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[85743307-4e59-40c2-a7af-9d33f0079bf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.700 221324 DEBUG nova.compute.manager [req-e1a112f4-2ec7-42cd-9367-9a6e677e3446 req-b70e61fd-ca43-48da-91cf-f60f0830d94e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] No waiting events found dispatching network-vif-unplugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.700 221324 DEBUG nova.compute.manager [req-e1a112f4-2ec7-42cd-9367-9a6e677e3446 req-b70e61fd-ca43-48da-91cf-f60f0830d94e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-vif-unplugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 02:59:23 np0005603622 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 31 02:59:23 np0005603622 systemd[1]: Finished dnf makecache.
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.756 221324 DEBUG nova.network.neutron [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updated VIF entry in instance network info cache for port 01f0ddb0-1553-44cd-9a70-560dd3fbb08d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.757 221324 DEBUG nova.network.neutron [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updating instance_info_cache with network_info: [{"id": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "address": "fa:16:3e:82:94:47", "network": {"id": "c28cb14d-420d-4734-af33-c452602f84f5", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-462404100-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbf6b6306ca449dfb064371ec88681f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01f0ddb0-15", "ovs_interfaceid": "01f0ddb0-1553-44cd-9a70-560dd3fbb08d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:23 np0005603622 nova_compute[221301]: 2026-01-31 07:59:23.819 221324 DEBUG oslo_concurrency.lockutils [req-27ae1103-44a8-45cb-be02-b3b5306d08d5 req-a7f5c002-f1bf-49ee-8e75-bbc9eacc9243 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-3fa092d5-409e-4eee-a7dd-993869d5e7d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.252 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updating instance_info_cache with network_info: [{"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.502 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-398d0932-8bb1-453a-b131-005935297a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.502 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.503 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.503 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.503 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.504 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 02:59:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:24.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.590 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.590 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.590 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.590 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 02:59:24 np0005603622 nova_compute[221301]: 2026-01-31 07:59:24.591 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1543704575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.224 221324 INFO nova.virt.libvirt.driver [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Deleting instance files /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2_del#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.225 221324 INFO nova.virt.libvirt.driver [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Deletion of /var/lib/nova/instances/3fa092d5-409e-4eee-a7dd-993869d5e7d2_del complete#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.229 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.639s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.357 221324 INFO nova.compute.manager [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Took 2.09 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.358 221324 DEBUG oslo.service.loopingcall [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.358 221324 DEBUG nova.compute.manager [-] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.358 221324 DEBUG nova.network.neutron [-] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.516 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000002c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.517 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000002c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.518 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Error from libvirt while getting description of instance-0000002e: [Error Code 42] Domain not found: no domain with matching uuid '3fa092d5-409e-4eee-a7dd-993869d5e7d2' (instance-0000002e): libvirt.libvirtError: Domain not found: no domain with matching uuid '3fa092d5-409e-4eee-a7dd-993869d5e7d2' (instance-0000002e)#033[00m
Jan 31 02:59:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:25.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.709 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.711 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4596MB free_disk=20.764564514160156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.711 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.711 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.902 221324 DEBUG nova.compute.manager [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.903 221324 DEBUG oslo_concurrency.lockutils [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.903 221324 DEBUG oslo_concurrency.lockutils [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.903 221324 DEBUG oslo_concurrency.lockutils [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.903 221324 DEBUG nova.compute.manager [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] No waiting events found dispatching network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.903 221324 WARNING nova.compute.manager [req-a4526f78-d539-445c-8962-ea94915f0dc7 req-0d8617b0-c2ca-4a40-9923-51166dd7bfab fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received unexpected event network-vif-plugged-01f0ddb0-1553-44cd-9a70-560dd3fbb08d for instance with vm_state active and task_state deleting.#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.924 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 398d0932-8bb1-453a-b131-005935297a74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.924 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 3fa092d5-409e-4eee-a7dd-993869d5e7d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.924 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 02:59:25 np0005603622 nova_compute[221301]: 2026-01-31 07:59:25.924 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.009 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/974636285' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.445 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.451 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.699 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.774 221324 DEBUG nova.network.neutron [-] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.835 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.836 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:26 np0005603622 nova_compute[221301]: 2026-01-31 07:59:26.933 221324 INFO nova.compute.manager [-] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Took 1.57 seconds to deallocate network for instance.#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.051 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.051 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.119 221324 DEBUG oslo_concurrency.processutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 02:59:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 02:59:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1496275008' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.580 221324 DEBUG oslo_concurrency.processutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.586 221324 DEBUG nova.compute.provider_tree [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 02:59:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:59:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:27.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.623 221324 DEBUG nova.scheduler.client.report [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.743 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.788 221324 DEBUG nova.compute.manager [req-4de373bf-d2d8-4ac7-9948-18ce0d1c6a35 req-f93ae8c0-4a6e-4e82-a088-c58cf8efaad3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Received event network-vif-deleted-01f0ddb0-1553-44cd-9a70-560dd3fbb08d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 02:59:27 np0005603622 nova_compute[221301]: 2026-01-31 07:59:27.816 221324 INFO nova.scheduler.client.report [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Deleted allocations for instance 3fa092d5-409e-4eee-a7dd-993869d5e7d2#033[00m
Jan 31 02:59:28 np0005603622 nova_compute[221301]: 2026-01-31 07:59:28.079 221324 DEBUG oslo_concurrency.lockutils [None req-71ecc519-651a-438e-a180-bc1e00ce23a0 e5b93162787e405080a5a790c1847434 dbf6b6306ca449dfb064371ec88681f5 - - default default] Lock "3fa092d5-409e-4eee-a7dd-993869d5e7d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:28.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:28 np0005603622 nova_compute[221301]: 2026-01-31 07:59:28.651 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:29.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e165 e165: 3 total, 3 up, 3 in
Jan 31 02:59:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:30.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:30.854 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 02:59:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:30.855 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 02:59:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 07:59:30.855 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 02:59:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:31.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:32.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:32 np0005603622 nova_compute[221301]: 2026-01-31 07:59:32.781 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:33.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:33 np0005603622 nova_compute[221301]: 2026-01-31 07:59:33.653 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:34 np0005603622 podman[238183]: 2026-01-31 07:59:34.187435545 +0000 UTC m=+0.085390013 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 02:59:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:34.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:35.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:36.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e166 e166: 3 total, 3 up, 3 in
Jan 31 02:59:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:37.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:37 np0005603622 nova_compute[221301]: 2026-01-31 07:59:37.783 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:38 np0005603622 nova_compute[221301]: 2026-01-31 07:59:38.508 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846363.5067449, 3fa092d5-409e-4eee-a7dd-993869d5e7d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 02:59:38 np0005603622 nova_compute[221301]: 2026-01-31 07:59:38.508 221324 INFO nova.compute.manager [-] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] VM Stopped (Lifecycle Event)#033[00m
Jan 31 02:59:38 np0005603622 nova_compute[221301]: 2026-01-31 07:59:38.579 221324 DEBUG nova.compute.manager [None req-2162ddae-ce95-46e5-a213-c7381575d351 - - - - - -] [instance: 3fa092d5-409e-4eee-a7dd-993869d5e7d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 02:59:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:38.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:38 np0005603622 nova_compute[221301]: 2026-01-31 07:59:38.655 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:59:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:39.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:59:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:40.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:41 np0005603622 podman[238340]: 2026-01-31 07:59:41.152846069 +0000 UTC m=+0.077573652 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 02:59:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:41.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:59:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:59:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:42.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:42 np0005603622 nova_compute[221301]: 2026-01-31 07:59:42.784 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:43.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:43 np0005603622 nova_compute[221301]: 2026-01-31 07:59:43.658 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 02:59:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 02:59:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 02:59:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:44.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:45.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:59:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:46.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:59:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:47.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:47 np0005603622 nova_compute[221301]: 2026-01-31 07:59:47.787 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:59:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:48.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:59:48 np0005603622 nova_compute[221301]: 2026-01-31 07:59:48.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:49.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:49 np0005603622 ovn_controller[130109]: 2026-01-31T07:59:49Z|00140|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:59:49 np0005603622 nova_compute[221301]: 2026-01-31 07:59:49.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:49 np0005603622 ovn_controller[130109]: 2026-01-31T07:59:49Z|00141|binding|INFO|Releasing lport b33af60b-01fb-4204-b1d7-f9b1d79e127d from this chassis (sb_readonly=0)
Jan 31 02:59:49 np0005603622 nova_compute[221301]: 2026-01-31 07:59:49.713 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:50.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:51.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:52.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:52 np0005603622 nova_compute[221301]: 2026-01-31 07:59:52.791 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e167 e167: 3 total, 3 up, 3 in
Jan 31 02:59:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:53.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:53 np0005603622 nova_compute[221301]: 2026-01-31 07:59:53.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:54.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 02:59:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 02:59:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:55.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 02:59:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:56.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:57.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 02:59:57 np0005603622 nova_compute[221301]: 2026-01-31 07:59:57.794 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 02:59:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:07:59:58.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 02:59:58 np0005603622 nova_compute[221301]: 2026-01-31 07:59:58.663 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 02:59:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 02:59:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 02:59:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:07:59:59.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:00.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:00:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 03:00:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:01.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:01 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:00:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e168 e168: 3 total, 3 up, 3 in
Jan 31 03:00:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:02.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:02 np0005603622 nova_compute[221301]: 2026-01-31 08:00:02.795 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e169 e169: 3 total, 3 up, 3 in
Jan 31 03:00:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:03.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:03 np0005603622 nova_compute[221301]: 2026-01-31 08:00:03.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:04.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:05 np0005603622 podman[238411]: 2026-01-31 08:00:05.177326525 +0000 UTC m=+0.096227520 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:00:05 np0005603622 nova_compute[221301]: 2026-01-31 08:00:05.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:05.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e170 e170: 3 total, 3 up, 3 in
Jan 31 03:00:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:06.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:07.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:07 np0005603622 nova_compute[221301]: 2026-01-31 08:00:07.798 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:08.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:08 np0005603622 nova_compute[221301]: 2026-01-31 08:00:08.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e171 e171: 3 total, 3 up, 3 in
Jan 31 03:00:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:09.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:10.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.245 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.246 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.246 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.247 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.247 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.249 221324 INFO nova.compute.manager [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Terminating instance#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.251 221324 DEBUG nova.compute.manager [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:00:11 np0005603622 kernel: tap3c9485bf-c0 (unregistering): left promiscuous mode
Jan 31 03:00:11 np0005603622 NetworkManager[49080]: <info>  [1769846411.3071] device (tap3c9485bf-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.307 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.318 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:00:11Z|00142|binding|INFO|Releasing lport 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd from this chassis (sb_readonly=0)
Jan 31 03:00:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:00:11Z|00143|binding|INFO|Setting lport 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd down in Southbound
Jan 31 03:00:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:00:11Z|00144|binding|INFO|Removing iface tap3c9485bf-c0 ovn-installed in OVS
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.321 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.327 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:e4:7e 10.100.0.5'], port_security=['fa:16:3e:9e:e4:7e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '398d0932-8bb1-453a-b131-005935297a74', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f31b0319126848a5b8fd9521dc509172', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2814a167-00ed-4304-830b-a99e04552970 676d4b42-e676-4b95-993b-5744e33649a9 7886d77a-2fd6-41af-97fc-abc8ba4ee615', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a935fc-1163-43c6-97c6-acf0f9c4194f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.329 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.330 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9485bf-c01f-49c4-89bc-8ed14efcd7dd in datapath 92b7a3d2-99de-4036-b28b-98f77dab6a25 unbound from our chassis#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.332 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92b7a3d2-99de-4036-b28b-98f77dab6a25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.335 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[46950dff-e0bf-4384-87c0-693caacb9898]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.336 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 namespace which is not needed anymore#033[00m
Jan 31 03:00:11 np0005603622 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Jan 31 03:00:11 np0005603622 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002c.scope: Consumed 16.120s CPU time.
Jan 31 03:00:11 np0005603622 systemd-machined[190406]: Machine qemu-18-instance-0000002c terminated.
Jan 31 03:00:11 np0005603622 podman[238438]: 2026-01-31 08:00:11.407861601 +0000 UTC m=+0.065198865 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:00:11 np0005603622 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[237489]: [NOTICE]   (237493) : haproxy version is 2.8.14-c23fe91
Jan 31 03:00:11 np0005603622 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[237489]: [NOTICE]   (237493) : path to executable is /usr/sbin/haproxy
Jan 31 03:00:11 np0005603622 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[237489]: [ALERT]    (237493) : Current worker (237495) exited with code 143 (Terminated)
Jan 31 03:00:11 np0005603622 neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25[237489]: [WARNING]  (237493) : All workers exited. Exiting... (0)
Jan 31 03:00:11 np0005603622 systemd[1]: libpod-2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053.scope: Deactivated successfully.
Jan 31 03:00:11 np0005603622 podman[238479]: 2026-01-31 08:00:11.474082633 +0000 UTC m=+0.045227361 container died 2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.480 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.485 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.495 221324 INFO nova.virt.libvirt.driver [-] [instance: 398d0932-8bb1-453a-b131-005935297a74] Instance destroyed successfully.#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.496 221324 DEBUG nova.objects.instance [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lazy-loading 'resources' on Instance uuid 398d0932-8bb1-453a-b131-005935297a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:00:11 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053-userdata-shm.mount: Deactivated successfully.
Jan 31 03:00:11 np0005603622 systemd[1]: var-lib-containers-storage-overlay-1874557af069dd6da6384382d1a0ccf99f45d3a3a1ad8d6dcceddcd2b4f69bcd-merged.mount: Deactivated successfully.
Jan 31 03:00:11 np0005603622 podman[238479]: 2026-01-31 08:00:11.512667243 +0000 UTC m=+0.083811981 container cleanup 2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.520 221324 DEBUG nova.virt.libvirt.vif [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T07:58:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2040319596',display_name='tempest-SecurityGroupsTestJSON-server-2040319596',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2040319596',id=44,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T07:58:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f31b0319126848a5b8fd9521dc509172',ramdisk_id='',reservation_id='r-grx3dghc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-648078268',owner_user_name='tempest-SecurityGroupsTestJSON-648078268-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T07:58:46Z,user_data=None,user_id='0e402088c09448e1a6f0cd61b11e0816',uuid=398d0932-8bb1-453a-b131-005935297a74,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.521 221324 DEBUG nova.network.os_vif_util [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converting VIF {"id": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "address": "fa:16:3e:9e:e4:7e", "network": {"id": "92b7a3d2-99de-4036-b28b-98f77dab6a25", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1264866811-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f31b0319126848a5b8fd9521dc509172", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9485bf-c0", "ovs_interfaceid": "3c9485bf-c01f-49c4-89bc-8ed14efcd7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.522 221324 DEBUG nova.network.os_vif_util [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9e:e4:7e,bridge_name='br-int',has_traffic_filtering=True,id=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9485bf-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:00:11 np0005603622 systemd[1]: libpod-conmon-2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053.scope: Deactivated successfully.
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.522 221324 DEBUG os_vif [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:e4:7e,bridge_name='br-int',has_traffic_filtering=True,id=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9485bf-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.524 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.525 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c9485bf-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.528 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.532 221324 INFO os_vif [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:e4:7e,bridge_name='br-int',has_traffic_filtering=True,id=3c9485bf-c01f-49c4-89bc-8ed14efcd7dd,network=Network(92b7a3d2-99de-4036-b28b-98f77dab6a25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9485bf-c0')#033[00m
Jan 31 03:00:11 np0005603622 podman[238518]: 2026-01-31 08:00:11.578783652 +0000 UTC m=+0.045737715 container remove 2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.584 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff6601f-42ec-42dd-917c-2f046bd18109]: (4, ('Sat Jan 31 08:00:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 (2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053)\n2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053\nSat Jan 31 08:00:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 (2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053)\n2db7cbaa29bd9856f6e4134e3ac7d4f43e3514412765bec9091fb4a41b30d053\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.586 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[383083f1-33a2-4567-b8bb-bb2958f62d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.587 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92b7a3d2-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.589 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 kernel: tap92b7a3d2-90: left promiscuous mode
Jan 31 03:00:11 np0005603622 nova_compute[221301]: 2026-01-31 08:00:11.602 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.606 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8359dd6f-ba13-4646-b670-b0c1b3990096]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.622 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[db852d84-912a-4795-9307-a6f7269f326d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.623 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ad843f7f-4bd1-44f9-b892-3615dceacad0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.637 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0fece21-2e69-419b-a1d3-913b6e16a82b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557703, 'reachable_time': 22133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238551, 'error': None, 'target': 'ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.640 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-92b7a3d2-99de-4036-b28b-98f77dab6a25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:00:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:11.640 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd918d4-26ca-40ea-8874-73e8711e73d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:00:11 np0005603622 systemd[1]: run-netns-ovnmeta\x2d92b7a3d2\x2d99de\x2d4036\x2db28b\x2d98f77dab6a25.mount: Deactivated successfully.
Jan 31 03:00:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:11.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e172 e172: 3 total, 3 up, 3 in
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.338 221324 DEBUG nova.compute.manager [req-b9d2efc8-f2af-421c-a9c9-4c32068a741c req-769046d5-6b5b-4a93-92cf-28e11250591a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-vif-unplugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.339 221324 DEBUG oslo_concurrency.lockutils [req-b9d2efc8-f2af-421c-a9c9-4c32068a741c req-769046d5-6b5b-4a93-92cf-28e11250591a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.339 221324 DEBUG oslo_concurrency.lockutils [req-b9d2efc8-f2af-421c-a9c9-4c32068a741c req-769046d5-6b5b-4a93-92cf-28e11250591a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.339 221324 DEBUG oslo_concurrency.lockutils [req-b9d2efc8-f2af-421c-a9c9-4c32068a741c req-769046d5-6b5b-4a93-92cf-28e11250591a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.339 221324 DEBUG nova.compute.manager [req-b9d2efc8-f2af-421c-a9c9-4c32068a741c req-769046d5-6b5b-4a93-92cf-28e11250591a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] No waiting events found dispatching network-vif-unplugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.339 221324 DEBUG nova.compute.manager [req-b9d2efc8-f2af-421c-a9c9-4c32068a741c req-769046d5-6b5b-4a93-92cf-28e11250591a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-vif-unplugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:00:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000055s ======
Jan 31 03:00:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:12.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.800 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.981 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Acquiring lock "a24ac16c-df64-4cef-a252-1f1c38920602" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.982 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "a24ac16c-df64-4cef-a252-1f1c38920602" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.982 221324 INFO nova.compute.manager [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Unshelving#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.988 221324 INFO nova.virt.libvirt.driver [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Deleting instance files /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74_del#033[00m
Jan 31 03:00:12 np0005603622 nova_compute[221301]: 2026-01-31 08:00:12.989 221324 INFO nova.virt.libvirt.driver [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Deletion of /var/lib/nova/instances/398d0932-8bb1-453a-b131-005935297a74_del complete#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.072 221324 INFO nova.compute.manager [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Took 1.82 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.072 221324 DEBUG oslo.service.loopingcall [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.073 221324 DEBUG nova.compute.manager [-] [instance: 398d0932-8bb1-453a-b131-005935297a74] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.073 221324 DEBUG nova.network.neutron [-] [instance: 398d0932-8bb1-453a-b131-005935297a74] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.123 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.123 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.130 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'pci_requests' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.150 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'numa_topology' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.165 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.165 221324 INFO nova.compute.claims [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.364 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.471 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.471 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:00:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:13.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:00:13 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/485500457' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.812 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.816 221324 DEBUG nova.compute.provider_tree [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.850 221324 DEBUG nova.scheduler.client.report [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:00:13 np0005603622 nova_compute[221301]: 2026-01-31 08:00:13.875 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.137 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Acquiring lock "refresh_cache-a24ac16c-df64-4cef-a252-1f1c38920602" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.138 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Acquired lock "refresh_cache-a24ac16c-df64-4cef-a252-1f1c38920602" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.138 221324 DEBUG nova.network.neutron [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.336 221324 DEBUG nova.network.neutron [-] [instance: 398d0932-8bb1-453a-b131-005935297a74] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.363 221324 INFO nova.compute.manager [-] [instance: 398d0932-8bb1-453a-b131-005935297a74] Took 1.29 seconds to deallocate network for instance.#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.438 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.439 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.456 221324 DEBUG nova.compute.manager [req-aa782403-c9a0-4090-9025-8b2260b442e5 req-6f119eb2-9146-40e6-92cf-05da849a9734 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.457 221324 DEBUG oslo_concurrency.lockutils [req-aa782403-c9a0-4090-9025-8b2260b442e5 req-6f119eb2-9146-40e6-92cf-05da849a9734 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "398d0932-8bb1-453a-b131-005935297a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.458 221324 DEBUG oslo_concurrency.lockutils [req-aa782403-c9a0-4090-9025-8b2260b442e5 req-6f119eb2-9146-40e6-92cf-05da849a9734 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.458 221324 DEBUG oslo_concurrency.lockutils [req-aa782403-c9a0-4090-9025-8b2260b442e5 req-6f119eb2-9146-40e6-92cf-05da849a9734 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.459 221324 DEBUG nova.compute.manager [req-aa782403-c9a0-4090-9025-8b2260b442e5 req-6f119eb2-9146-40e6-92cf-05da849a9734 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] No waiting events found dispatching network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.459 221324 WARNING nova.compute.manager [req-aa782403-c9a0-4090-9025-8b2260b442e5 req-6f119eb2-9146-40e6-92cf-05da849a9734 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received unexpected event network-vif-plugged-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.460 221324 DEBUG nova.compute.manager [req-aa782403-c9a0-4090-9025-8b2260b442e5 req-6f119eb2-9146-40e6-92cf-05da849a9734 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 398d0932-8bb1-453a-b131-005935297a74] Received event network-vif-deleted-3c9485bf-c01f-49c4-89bc-8ed14efcd7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.470 221324 DEBUG nova.network.neutron [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.534 221324 DEBUG oslo_concurrency.processutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:00:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:14.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:00:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/641425555' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.983 221324 DEBUG oslo_concurrency.processutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:00:14 np0005603622 nova_compute[221301]: 2026-01-31 08:00:14.988 221324 DEBUG nova.compute.provider_tree [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.014 221324 DEBUG nova.scheduler.client.report [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.036 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.061 221324 INFO nova.scheduler.client.report [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Deleted allocations for instance 398d0932-8bb1-453a-b131-005935297a74#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.160 221324 DEBUG oslo_concurrency.lockutils [None req-23d73b7d-f1d2-499e-b8a3-3ee4ce37ec20 0e402088c09448e1a6f0cd61b11e0816 f31b0319126848a5b8fd9521dc509172 - - default default] Lock "398d0932-8bb1-453a-b131-005935297a74" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.914s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.339 221324 DEBUG nova.network.neutron [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.514 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Releasing lock "refresh_cache-a24ac16c-df64-4cef-a252-1f1c38920602" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.516 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.517 221324 INFO nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Creating image(s)#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.550 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] rbd image a24ac16c-df64-4cef-a252-1f1c38920602_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.554 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:00:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:15.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.771 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] rbd image a24ac16c-df64-4cef-a252-1f1c38920602_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.804 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] rbd image a24ac16c-df64-4cef-a252-1f1c38920602_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.808 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Acquiring lock "80508e5378c83b1dad0f33f893e753cd20914039" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:15 np0005603622 nova_compute[221301]: 2026-01-31 08:00:15.808 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "80508e5378c83b1dad0f33f893e753cd20914039" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:16 np0005603622 nova_compute[221301]: 2026-01-31 08:00:16.174 221324 DEBUG nova.virt.libvirt.imagebackend [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/919869d2-31bd-400c-b71c-e086ee5512c0/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/919869d2-31bd-400c-b71c-e086ee5512c0/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:00:16 np0005603622 nova_compute[221301]: 2026-01-31 08:00:16.229 221324 DEBUG nova.virt.libvirt.imagebackend [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/919869d2-31bd-400c-b71c-e086ee5512c0/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:00:16 np0005603622 nova_compute[221301]: 2026-01-31 08:00:16.230 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] cloning images/919869d2-31bd-400c-b71c-e086ee5512c0@snap to None/a24ac16c-df64-4cef-a252-1f1c38920602_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:00:16 np0005603622 nova_compute[221301]: 2026-01-31 08:00:16.528 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:16.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e173 e173: 3 total, 3 up, 3 in
Jan 31 03:00:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:17.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:17 np0005603622 nova_compute[221301]: 2026-01-31 08:00:17.801 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:18.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:19.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:20 np0005603622 nova_compute[221301]: 2026-01-31 08:00:20.427 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:20 np0005603622 nova_compute[221301]: 2026-01-31 08:00:20.428 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:20.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:21 np0005603622 nova_compute[221301]: 2026-01-31 08:00:21.301 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "80508e5378c83b1dad0f33f893e753cd20914039" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 5.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:21 np0005603622 nova_compute[221301]: 2026-01-31 08:00:21.411 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:21 np0005603622 nova_compute[221301]: 2026-01-31 08:00:21.412 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:21 np0005603622 nova_compute[221301]: 2026-01-31 08:00:21.413 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:21 np0005603622 nova_compute[221301]: 2026-01-31 08:00:21.500 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'migration_context' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:00:21 np0005603622 nova_compute[221301]: 2026-01-31 08:00:21.572 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:21.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:21 np0005603622 nova_compute[221301]: 2026-01-31 08:00:21.698 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] flattening vms/a24ac16c-df64-4cef-a252-1f1c38920602_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.431 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.431 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.432 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.432 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:00:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:22.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.841 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:00:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3484770409' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:00:22 np0005603622 nova_compute[221301]: 2026-01-31 08:00:22.868 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.064 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.065 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4809MB free_disk=20.97982406616211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.065 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.066 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.187 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance a24ac16c-df64-4cef-a252-1f1c38920602 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.187 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.188 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.222 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.239 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.239 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.255 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.278 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.335 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.574 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Image rbd:vms/a24ac16c-df64-4cef-a252-1f1c38920602_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.575 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.576 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Ensure instance console log exists: /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.577 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.577 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.577 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.580 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T07:59:36Z,direct_url=<?>,disk_format='raw',id=919869d2-31bd-400c-b71c-e086ee5512c0,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1524536714-shelved',owner='0f3a75a965fc495bbe02cb5bfad2053b',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:00:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.585 221324 WARNING nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.589 221324 DEBUG nova.virt.libvirt.host [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.590 221324 DEBUG nova.virt.libvirt.host [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.605 221324 DEBUG nova.virt.libvirt.host [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.607 221324 DEBUG nova.virt.libvirt.host [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.610 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.610 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T07:59:36Z,direct_url=<?>,disk_format='raw',id=919869d2-31bd-400c-b71c-e086ee5512c0,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1524536714-shelved',owner='0f3a75a965fc495bbe02cb5bfad2053b',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:00:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.611 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.611 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.611 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.612 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.612 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.613 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.613 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.613 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.614 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.614 221324 DEBUG nova.virt.hardware [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.614 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:00:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:23.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.697 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:00:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:00:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3049522806' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.756 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.761 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:00:23 np0005603622 nova_compute[221301]: 2026-01-31 08:00:23.817 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.033 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.033 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:00:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:00:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/208235013' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.207 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.236 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] rbd image a24ac16c-df64-4cef-a252-1f1c38920602_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.240 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:00:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:24.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:00:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2802673278' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.722 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.724 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'pci_devices' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:00:24 np0005603622 nova_compute[221301]: 2026-01-31 08:00:24.883 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <uuid>a24ac16c-df64-4cef-a252-1f1c38920602</uuid>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <name>instance-0000002d</name>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1524536714</nova:name>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:00:23</nova:creationTime>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <nova:user uuid="2d20e7b6189c4916947ddae2155da8cf">tempest-UnshelveToHostMultiNodesTest-849130801-project-member</nova:user>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <nova:project uuid="0f3a75a965fc495bbe02cb5bfad2053b">tempest-UnshelveToHostMultiNodesTest-849130801</nova:project>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="919869d2-31bd-400c-b71c-e086ee5512c0"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <nova:ports/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <entry name="serial">a24ac16c-df64-4cef-a252-1f1c38920602</entry>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <entry name="uuid">a24ac16c-df64-4cef-a252-1f1c38920602</entry>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/a24ac16c-df64-4cef-a252-1f1c38920602_disk">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/a24ac16c-df64-4cef-a252-1f1c38920602_disk.config">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/console.log" append="off"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:00:24 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:00:24 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:00:24 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:00:24 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.032 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.033 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.142 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.143 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.147 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.147 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.148 221324 INFO nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Using config drive
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.174 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] rbd image a24ac16c-df64-4cef-a252-1f1c38920602_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:00:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.217 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'ec2_ids' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.578 221324 DEBUG nova.objects.instance [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lazy-loading 'keypairs' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:00:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:25.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.883 221324 INFO nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Creating config drive at /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/disk.config
Jan 31 03:00:25 np0005603622 nova_compute[221301]: 2026-01-31 08:00:25.887 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpuc6pfykk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.014 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpuc6pfykk" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:00:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:26.033 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:00:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:26.034 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 03:00:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:26.035 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.055 221324 DEBUG nova.storage.rbd_utils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] rbd image a24ac16c-df64-4cef-a252-1f1c38920602_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.061 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/disk.config a24ac16c-df64-4cef-a252-1f1c38920602_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.081 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.426 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.426 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.444 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.493 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846411.4924738, 398d0932-8bb1-453a-b131-005935297a74 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.494 221324 INFO nova.compute.manager [-] [instance: 398d0932-8bb1-453a-b131-005935297a74] VM Stopped (Lifecycle Event)
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.521 221324 DEBUG nova.compute.manager [None req-6da3d969-e937-4595-b581-faf4ae495a86 - - - - - -] [instance: 398d0932-8bb1-453a-b131-005935297a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.619 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.648 221324 DEBUG oslo_concurrency.processutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/disk.config a24ac16c-df64-4cef-a252-1f1c38920602_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:00:26 np0005603622 nova_compute[221301]: 2026-01-31 08:00:26.649 221324 INFO nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Deleting local config drive /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602/disk.config because it was imported into RBD.#033[00m
Jan 31 03:00:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:26.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:26 np0005603622 systemd-machined[190406]: New machine qemu-20-instance-0000002d.
Jan 31 03:00:26 np0005603622 systemd[1]: Started Virtual Machine qemu-20-instance-0000002d.
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.448 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846427.4481578, a24ac16c-df64-4cef-a252-1f1c38920602 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.451 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.453 221324 DEBUG nova.compute.manager [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.454 221324 DEBUG nova.virt.libvirt.driver [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.458 221324 INFO nova.virt.libvirt.driver [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Instance spawned successfully.#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.486 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.493 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.529 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.530 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846427.4493334, a24ac16c-df64-4cef-a252-1f1c38920602 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.530 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] VM Started (Lifecycle Event)#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.561 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.565 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.585 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:00:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:27.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:27 np0005603622 nova_compute[221301]: 2026-01-31 08:00:27.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:28.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e174 e174: 3 total, 3 up, 3 in
Jan 31 03:00:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:29.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:30.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:30.855 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:30.856 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:00:30.856 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:31 np0005603622 nova_compute[221301]: 2026-01-31 08:00:31.622 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:31.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:32.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:32 np0005603622 nova_compute[221301]: 2026-01-31 08:00:32.883 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:33.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:34.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e175 e175: 3 total, 3 up, 3 in
Jan 31 03:00:35 np0005603622 nova_compute[221301]: 2026-01-31 08:00:35.601 221324 DEBUG nova.compute.manager [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:00:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:35.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:36 np0005603622 nova_compute[221301]: 2026-01-31 08:00:36.067 221324 DEBUG oslo_concurrency.lockutils [None req-fd7feb9c-7b47-42b6-bb16-32d85f310530 aca4a3ac27c249718268b71ac43ef4f9 546c0d49cea64695b5f4ee15ea24aa42 - - default default] Lock "a24ac16c-df64-4cef-a252-1f1c38920602" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 23.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:36 np0005603622 podman[239032]: 2026-01-31 08:00:36.170137449 +0000 UTC m=+0.087055820 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 03:00:36 np0005603622 nova_compute[221301]: 2026-01-31 08:00:36.624 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:36.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e176 e176: 3 total, 3 up, 3 in
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.550 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Acquiring lock "a24ac16c-df64-4cef-a252-1f1c38920602" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.550 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Lock "a24ac16c-df64-4cef-a252-1f1c38920602" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.550 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Acquiring lock "a24ac16c-df64-4cef-a252-1f1c38920602-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.551 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Lock "a24ac16c-df64-4cef-a252-1f1c38920602-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.551 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Lock "a24ac16c-df64-4cef-a252-1f1c38920602-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.552 221324 INFO nova.compute.manager [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Terminating instance#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.553 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Acquiring lock "refresh_cache-a24ac16c-df64-4cef-a252-1f1c38920602" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.553 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Acquired lock "refresh_cache-a24ac16c-df64-4cef-a252-1f1c38920602" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.553 221324 DEBUG nova.network.neutron [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:00:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:37.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.817 221324 DEBUG nova.network.neutron [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:00:37 np0005603622 nova_compute[221301]: 2026-01-31 08:00:37.884 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:38 np0005603622 nova_compute[221301]: 2026-01-31 08:00:38.641 221324 DEBUG nova.network.neutron [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:00:38 np0005603622 nova_compute[221301]: 2026-01-31 08:00:38.673 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Releasing lock "refresh_cache-a24ac16c-df64-4cef-a252-1f1c38920602" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:00:38 np0005603622 nova_compute[221301]: 2026-01-31 08:00:38.674 221324 DEBUG nova.compute.manager [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:00:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:00:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:38.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:00:38 np0005603622 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 31 03:00:38 np0005603622 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002d.scope: Consumed 12.242s CPU time.
Jan 31 03:00:38 np0005603622 systemd-machined[190406]: Machine qemu-20-instance-0000002d terminated.
Jan 31 03:00:38 np0005603622 nova_compute[221301]: 2026-01-31 08:00:38.900 221324 INFO nova.virt.libvirt.driver [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Instance destroyed successfully.#033[00m
Jan 31 03:00:38 np0005603622 nova_compute[221301]: 2026-01-31 08:00:38.900 221324 DEBUG nova.objects.instance [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Lazy-loading 'resources' on Instance uuid a24ac16c-df64-4cef-a252-1f1c38920602 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:00:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:39.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e177 e177: 3 total, 3 up, 3 in
Jan 31 03:00:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:40.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:41 np0005603622 nova_compute[221301]: 2026-01-31 08:00:41.625 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:41.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:42 np0005603622 podman[239077]: 2026-01-31 08:00:42.147108986 +0000 UTC m=+0.072584535 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 31 03:00:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:42.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e178 e178: 3 total, 3 up, 3 in
Jan 31 03:00:42 np0005603622 nova_compute[221301]: 2026-01-31 08:00:42.918 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:42 np0005603622 nova_compute[221301]: 2026-01-31 08:00:42.977 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:00:43 np0005603622 nova_compute[221301]: 2026-01-31 08:00:43.280 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid a24ac16c-df64-4cef-a252-1f1c38920602 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:00:43 np0005603622 nova_compute[221301]: 2026-01-31 08:00:43.281 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "a24ac16c-df64-4cef-a252-1f1c38920602" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:43.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:44.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:45.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:46 np0005603622 nova_compute[221301]: 2026-01-31 08:00:46.472 221324 INFO nova.virt.libvirt.driver [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Deleting instance files /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602_del#033[00m
Jan 31 03:00:46 np0005603622 nova_compute[221301]: 2026-01-31 08:00:46.473 221324 INFO nova.virt.libvirt.driver [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Deletion of /var/lib/nova/instances/a24ac16c-df64-4cef-a252-1f1c38920602_del complete#033[00m
Jan 31 03:00:46 np0005603622 nova_compute[221301]: 2026-01-31 08:00:46.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:46 np0005603622 nova_compute[221301]: 2026-01-31 08:00:46.668 221324 INFO nova.compute.manager [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Took 7.99 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:00:46 np0005603622 nova_compute[221301]: 2026-01-31 08:00:46.669 221324 DEBUG oslo.service.loopingcall [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:00:46 np0005603622 nova_compute[221301]: 2026-01-31 08:00:46.669 221324 DEBUG nova.compute.manager [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:00:46 np0005603622 nova_compute[221301]: 2026-01-31 08:00:46.669 221324 DEBUG nova.network.neutron [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:00:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:46.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:47 np0005603622 nova_compute[221301]: 2026-01-31 08:00:47.267 221324 DEBUG nova.network.neutron [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:00:47 np0005603622 nova_compute[221301]: 2026-01-31 08:00:47.380 221324 DEBUG nova.network.neutron [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:00:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e179 e179: 3 total, 3 up, 3 in
Jan 31 03:00:47 np0005603622 nova_compute[221301]: 2026-01-31 08:00:47.691 221324 INFO nova.compute.manager [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Took 1.02 seconds to deallocate network for instance.#033[00m
Jan 31 03:00:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:47.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:47 np0005603622 nova_compute[221301]: 2026-01-31 08:00:47.920 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:48 np0005603622 nova_compute[221301]: 2026-01-31 08:00:48.282 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:00:48 np0005603622 nova_compute[221301]: 2026-01-31 08:00:48.282 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:48.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:48 np0005603622 nova_compute[221301]: 2026-01-31 08:00:48.790 221324 DEBUG oslo_concurrency.processutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:00:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:00:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/496846209' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:00:49 np0005603622 nova_compute[221301]: 2026-01-31 08:00:49.222 221324 DEBUG oslo_concurrency.processutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:00:49 np0005603622 nova_compute[221301]: 2026-01-31 08:00:49.231 221324 DEBUG nova.compute.provider_tree [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:00:49 np0005603622 nova_compute[221301]: 2026-01-31 08:00:49.435 221324 DEBUG nova.scheduler.client.report [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:00:49 np0005603622 nova_compute[221301]: 2026-01-31 08:00:49.559 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:49.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:49 np0005603622 nova_compute[221301]: 2026-01-31 08:00:49.809 221324 INFO nova.scheduler.client.report [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Deleted allocations for instance a24ac16c-df64-4cef-a252-1f1c38920602#033[00m
Jan 31 03:00:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:50 np0005603622 nova_compute[221301]: 2026-01-31 08:00:50.345 221324 DEBUG oslo_concurrency.lockutils [None req-56b3b9fe-c6a5-4ced-ad1b-afc9dfef1f79 2d20e7b6189c4916947ddae2155da8cf 0f3a75a965fc495bbe02cb5bfad2053b - - default default] Lock "a24ac16c-df64-4cef-a252-1f1c38920602" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:50 np0005603622 nova_compute[221301]: 2026-01-31 08:00:50.347 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "a24ac16c-df64-4cef-a252-1f1c38920602" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 7.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:00:50 np0005603622 nova_compute[221301]: 2026-01-31 08:00:50.348 221324 INFO nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 03:00:50 np0005603622 nova_compute[221301]: 2026-01-31 08:00:50.348 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "a24ac16c-df64-4cef-a252-1f1c38920602" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:00:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:50.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e180 e180: 3 total, 3 up, 3 in
Jan 31 03:00:51 np0005603622 nova_compute[221301]: 2026-01-31 08:00:51.630 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:51.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:52.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:00:52 np0005603622 nova_compute[221301]: 2026-01-31 08:00:52.923 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:53.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:53 np0005603622 nova_compute[221301]: 2026-01-31 08:00:53.898 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846438.8975675, a24ac16c-df64-4cef-a252-1f1c38920602 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:00:53 np0005603622 nova_compute[221301]: 2026-01-31 08:00:53.898 221324 INFO nova.compute.manager [-] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:00:53 np0005603622 nova_compute[221301]: 2026-01-31 08:00:53.959 221324 DEBUG nova.compute.manager [None req-e7446a80-5a91-4169-b329-26faa3b21ba0 - - - - - -] [instance: a24ac16c-df64-4cef-a252-1f1c38920602] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:00:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:54.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:00:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:55.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:56 np0005603622 nova_compute[221301]: 2026-01-31 08:00:56.633 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:56.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 e181: 3 total, 3 up, 3 in
Jan 31 03:00:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:57.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:57 np0005603622 nova_compute[221301]: 2026-01-31 08:00:57.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:00:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:00:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:00:58.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:00:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:00:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:00:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:00:59.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:00.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:01 np0005603622 nova_compute[221301]: 2026-01-31 08:01:01.635 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:01.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:02.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:02 np0005603622 nova_compute[221301]: 2026-01-31 08:01:02.926 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:01:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:01:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:01:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:03.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:04Z|00145|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:01:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:04.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:05.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:06 np0005603622 nova_compute[221301]: 2026-01-31 08:01:06.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:06.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:07 np0005603622 podman[239263]: 2026-01-31 08:01:07.171306521 +0000 UTC m=+0.100784903 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 03:01:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:07.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:07 np0005603622 nova_compute[221301]: 2026-01-31 08:01:07.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:08.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:09.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:01:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:10.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:01:11 np0005603622 nova_compute[221301]: 2026-01-31 08:01:11.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:01:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:11.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:01:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:12.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:12 np0005603622 nova_compute[221301]: 2026-01-31 08:01:12.931 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:13 np0005603622 podman[239290]: 2026-01-31 08:01:13.1434884 +0000 UTC m=+0.063463297 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:01:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:13.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:13 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:01:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:14.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:15 np0005603622 nova_compute[221301]: 2026-01-31 08:01:15.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:15 np0005603622 nova_compute[221301]: 2026-01-31 08:01:15.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:01:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:15.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:01:16 np0005603622 nova_compute[221301]: 2026-01-31 08:01:16.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:16.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:01:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:17.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:17 np0005603622 nova_compute[221301]: 2026-01-31 08:01:17.933 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:18.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:01:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:19.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:01:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:20 np0005603622 nova_compute[221301]: 2026-01-31 08:01:20.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:20.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:21 np0005603622 nova_compute[221301]: 2026-01-31 08:01:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:21 np0005603622 nova_compute[221301]: 2026-01-31 08:01:21.669 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:21.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:22 np0005603622 nova_compute[221301]: 2026-01-31 08:01:22.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:22 np0005603622 nova_compute[221301]: 2026-01-31 08:01:22.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:22.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:22 np0005603622 nova_compute[221301]: 2026-01-31 08:01:22.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:23 np0005603622 nova_compute[221301]: 2026-01-31 08:01:23.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:23.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:24 np0005603622 nova_compute[221301]: 2026-01-31 08:01:24.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:24 np0005603622 nova_compute[221301]: 2026-01-31 08:01:24.520 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:24 np0005603622 nova_compute[221301]: 2026-01-31 08:01:24.520 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:24 np0005603622 nova_compute[221301]: 2026-01-31 08:01:24.520 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:24 np0005603622 nova_compute[221301]: 2026-01-31 08:01:24.521 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:01:24 np0005603622 nova_compute[221301]: 2026-01-31 08:01:24.521 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:24.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:01:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2552327880' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:01:24 np0005603622 nova_compute[221301]: 2026-01-31 08:01:24.993 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.145 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.146 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4799MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.146 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.147 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.458 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.458 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.494 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:25.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:01:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1156978640' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.919 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.926 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:01:25 np0005603622 nova_compute[221301]: 2026-01-31 08:01:25.988 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:01:26 np0005603622 nova_compute[221301]: 2026-01-31 08:01:26.042 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:01:26 np0005603622 nova_compute[221301]: 2026-01-31 08:01:26.043 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:26.148 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:01:26 np0005603622 nova_compute[221301]: 2026-01-31 08:01:26.148 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:26.149 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:01:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:26.150 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:26 np0005603622 nova_compute[221301]: 2026-01-31 08:01:26.671 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:26.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.044 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.045 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.045 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.079 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.079 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.216 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.217 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.242 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.335 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.336 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.349 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.350 221324 INFO nova.compute.claims [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.532 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:27.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:01:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1129637320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.955 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.961 221324 DEBUG nova.compute.provider_tree [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.988 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:27 np0005603622 nova_compute[221301]: 2026-01-31 08:01:27.994 221324 DEBUG nova.scheduler.client.report [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.046 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.048 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.148 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.149 221324 DEBUG nova.network.neutron [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.231 221324 INFO nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.264 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.419 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.421 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.421 221324 INFO nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Creating image(s)#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.452 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.482 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.511 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.515 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.585 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.586 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.586 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.587 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.614 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.619 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f985f921-eed3-43c3-85e4-d15c03ebf004_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:28 np0005603622 nova_compute[221301]: 2026-01-31 08:01:28.688 221324 DEBUG nova.policy [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c03bfebfc7e34f83b204997b98c8e690', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e7009d6ac414afab6cdf704bd382133', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:01:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:28.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:29 np0005603622 nova_compute[221301]: 2026-01-31 08:01:29.231 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f985f921-eed3-43c3-85e4-d15c03ebf004_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.612s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:29 np0005603622 nova_compute[221301]: 2026-01-31 08:01:29.316 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] resizing rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:01:29 np0005603622 nova_compute[221301]: 2026-01-31 08:01:29.433 221324 DEBUG nova.objects.instance [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lazy-loading 'migration_context' on Instance uuid f985f921-eed3-43c3-85e4-d15c03ebf004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:01:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:01:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:29.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:01:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:30 np0005603622 nova_compute[221301]: 2026-01-31 08:01:30.633 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:01:30 np0005603622 nova_compute[221301]: 2026-01-31 08:01:30.634 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Ensure instance console log exists: /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:01:30 np0005603622 nova_compute[221301]: 2026-01-31 08:01:30.634 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:30 np0005603622 nova_compute[221301]: 2026-01-31 08:01:30.635 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:30 np0005603622 nova_compute[221301]: 2026-01-31 08:01:30.635 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:30.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:30.856 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:30.857 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:30.858 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:31 np0005603622 nova_compute[221301]: 2026-01-31 08:01:31.683 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:31.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:32.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:32 np0005603622 nova_compute[221301]: 2026-01-31 08:01:32.997 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:33 np0005603622 nova_compute[221301]: 2026-01-31 08:01:33.596 221324 DEBUG nova.network.neutron [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Successfully created port: c44951c8-f6ae-4440-8a09-2e528fc0c506 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:01:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:33.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:34.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:36 np0005603622 nova_compute[221301]: 2026-01-31 08:01:36.685 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:36.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:37 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.566 221324 DEBUG nova.network.neutron [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Successfully updated port: c44951c8-f6ae-4440-8a09-2e528fc0c506 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:01:37 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.589 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:01:37 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.589 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquired lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:01:37 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.589 221324 DEBUG nova.network.neutron [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:01:37 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.811 221324 DEBUG nova.compute.manager [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-changed-c44951c8-f6ae-4440-8a09-2e528fc0c506 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:01:37 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.811 221324 DEBUG nova.compute.manager [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Refreshing instance network info cache due to event network-changed-c44951c8-f6ae-4440-8a09-2e528fc0c506. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:01:37 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.812 221324 DEBUG oslo_concurrency.lockutils [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:01:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:37.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:38 np0005603622 nova_compute[221301]: 2026-01-31 08:01:37.999 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:38 np0005603622 nova_compute[221301]: 2026-01-31 08:01:38.118 221324 DEBUG nova.network.neutron [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:01:38 np0005603622 podman[239594]: 2026-01-31 08:01:38.178326972 +0000 UTC m=+0.101052751 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:01:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:38.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:39.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.433 221324 DEBUG nova.network.neutron [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Updating instance_info_cache with network_info: [{"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.465 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Releasing lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.466 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Instance network_info: |[{"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.467 221324 DEBUG oslo_concurrency.lockutils [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.467 221324 DEBUG nova.network.neutron [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Refreshing network info cache for port c44951c8-f6ae-4440-8a09-2e528fc0c506 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.472 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Start _get_guest_xml network_info=[{"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.478 221324 WARNING nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.493 221324 DEBUG nova.virt.libvirt.host [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.495 221324 DEBUG nova.virt.libvirt.host [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.504 221324 DEBUG nova.virt.libvirt.host [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.505 221324 DEBUG nova.virt.libvirt.host [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.507 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.507 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.508 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.509 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.509 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.509 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.510 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.510 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.510 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.511 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.511 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.511 221324 DEBUG nova.virt.hardware [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.516 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:40.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:01:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1638455687' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.938 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.979 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:01:40 np0005603622 nova_compute[221301]: 2026-01-31 08:01:40.987 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:01:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3055071718' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.450 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.453 221324 DEBUG nova.virt.libvirt.vif [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:01:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=51,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/uYRaqYHa3qmKDs8j510SjH/x09/LJdT5DB1Lyp5jyC8Bx9QGRkfREVLYyTBjRyp3qZ0eWyDgxo/U7fGt44lzHTc5UaKeSpk2hZqOZ347xDdbXcrkvJa9lpPhooHBhSg==',key_name='tempest-keypair-1034345940',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e7009d6ac414afab6cdf704bd382133',ramdisk_id='',reservation_id='r-1eago0iq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-2117672251',owner_user_name='tempest-ServersTestFqdnHostnames-2117672251-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:01:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c03bfebfc7e34f83b204997b98c8e690',uuid=f985f921-eed3-43c3-85e4-d15c03ebf004,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.454 221324 DEBUG nova.network.os_vif_util [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Converting VIF {"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.455 221324 DEBUG nova.network.os_vif_util [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:28:10,bridge_name='br-int',has_traffic_filtering=True,id=c44951c8-f6ae-4440-8a09-2e528fc0c506,network=Network(d15ebcc3-fa46-4e00-8ac9-a188c7e792ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc44951c8-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.456 221324 DEBUG nova.objects.instance [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lazy-loading 'pci_devices' on Instance uuid f985f921-eed3-43c3-85e4-d15c03ebf004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.503 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <uuid>f985f921-eed3-43c3-85e4-d15c03ebf004</uuid>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <name>instance-00000033</name>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <nova:name>guest-instance-1.domain.com</nova:name>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:01:40</nova:creationTime>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:user uuid="c03bfebfc7e34f83b204997b98c8e690">tempest-ServersTestFqdnHostnames-2117672251-project-member</nova:user>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:project uuid="3e7009d6ac414afab6cdf704bd382133">tempest-ServersTestFqdnHostnames-2117672251</nova:project>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <nova:port uuid="c44951c8-f6ae-4440-8a09-2e528fc0c506">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <entry name="serial">f985f921-eed3-43c3-85e4-d15c03ebf004</entry>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <entry name="uuid">f985f921-eed3-43c3-85e4-d15c03ebf004</entry>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/f985f921-eed3-43c3-85e4-d15c03ebf004_disk">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/f985f921-eed3-43c3-85e4-d15c03ebf004_disk.config">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:03:28:10"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <target dev="tapc44951c8-f6"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/console.log" append="off"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:01:41 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:01:41 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:01:41 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:01:41 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.505 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Preparing to wait for external event network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.506 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.506 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.506 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.507 221324 DEBUG nova.virt.libvirt.vif [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:01:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=51,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/uYRaqYHa3qmKDs8j510SjH/x09/LJdT5DB1Lyp5jyC8Bx9QGRkfREVLYyTBjRyp3qZ0eWyDgxo/U7fGt44lzHTc5UaKeSpk2hZqOZ347xDdbXcrkvJa9lpPhooHBhSg==',key_name='tempest-keypair-1034345940',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e7009d6ac414afab6cdf704bd382133',ramdisk_id='',reservation_id='r-1eago0iq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-2117672251',owner_user_name='tempest-ServersTestFqdnHostnames-2117672251-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:01:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c03bfebfc7e34f83b204997b98c8e690',uuid=f985f921-eed3-43c3-85e4-d15c03ebf004,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.508 221324 DEBUG nova.network.os_vif_util [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Converting VIF {"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.509 221324 DEBUG nova.network.os_vif_util [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:28:10,bridge_name='br-int',has_traffic_filtering=True,id=c44951c8-f6ae-4440-8a09-2e528fc0c506,network=Network(d15ebcc3-fa46-4e00-8ac9-a188c7e792ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc44951c8-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.509 221324 DEBUG os_vif [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:28:10,bridge_name='br-int',has_traffic_filtering=True,id=c44951c8-f6ae-4440-8a09-2e528fc0c506,network=Network(d15ebcc3-fa46-4e00-8ac9-a188c7e792ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc44951c8-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.510 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.511 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.511 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.519 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.519 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc44951c8-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.520 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc44951c8-f6, col_values=(('external_ids', {'iface-id': 'c44951c8-f6ae-4440-8a09-2e528fc0c506', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:28:10', 'vm-uuid': 'f985f921-eed3-43c3-85e4-d15c03ebf004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.557 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:41 np0005603622 NetworkManager[49080]: <info>  [1769846501.5599] manager: (tapc44951c8-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.561 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.565 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.566 221324 INFO os_vif [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:28:10,bridge_name='br-int',has_traffic_filtering=True,id=c44951c8-f6ae-4440-8a09-2e528fc0c506,network=Network(d15ebcc3-fa46-4e00-8ac9-a188c7e792ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc44951c8-f6')#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.672 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.673 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.674 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] No VIF found with MAC fa:16:3e:03:28:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.675 221324 INFO nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Using config drive#033[00m
Jan 31 03:01:41 np0005603622 nova_compute[221301]: 2026-01-31 08:01:41.714 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:01:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:01:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:41.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:01:42 np0005603622 nova_compute[221301]: 2026-01-31 08:01:42.543 221324 INFO nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Creating config drive at /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/disk.config#033[00m
Jan 31 03:01:42 np0005603622 nova_compute[221301]: 2026-01-31 08:01:42.552 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpan5ocv8p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:42 np0005603622 nova_compute[221301]: 2026-01-31 08:01:42.697 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpan5ocv8p" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:42 np0005603622 nova_compute[221301]: 2026-01-31 08:01:42.745 221324 DEBUG nova.storage.rbd_utils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] rbd image f985f921-eed3-43c3-85e4-d15c03ebf004_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:01:42 np0005603622 nova_compute[221301]: 2026-01-31 08:01:42.756 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/disk.config f985f921-eed3-43c3-85e4-d15c03ebf004_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:01:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.002 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.038 221324 DEBUG oslo_concurrency.processutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/disk.config f985f921-eed3-43c3-85e4-d15c03ebf004_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.283s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.039 221324 INFO nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Deleting local config drive /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004/disk.config because it was imported into RBD.#033[00m
Jan 31 03:01:43 np0005603622 kernel: tapc44951c8-f6: entered promiscuous mode
Jan 31 03:01:43 np0005603622 NetworkManager[49080]: <info>  [1769846503.1048] manager: (tapc44951c8-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Jan 31 03:01:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:43Z|00146|binding|INFO|Claiming lport c44951c8-f6ae-4440-8a09-2e528fc0c506 for this chassis.
Jan 31 03:01:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:43Z|00147|binding|INFO|c44951c8-f6ae-4440-8a09-2e528fc0c506: Claiming fa:16:3e:03:28:10 10.100.0.8
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.104 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.111 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.117 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.149 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:28:10 10.100.0.8'], port_security=['fa:16:3e:03:28:10 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f985f921-eed3-43c3-85e4-d15c03ebf004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e7009d6ac414afab6cdf704bd382133', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15fe8336-4f1b-400f-a78c-e4c5f81bdd88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac1e6de-12a2-48ba-8b5e-ff38c9d2de74, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=c44951c8-f6ae-4440-8a09-2e528fc0c506) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.151 139793 INFO neutron.agent.ovn.metadata.agent [-] Port c44951c8-f6ae-4440-8a09-2e528fc0c506 in datapath d15ebcc3-fa46-4e00-8ac9-a188c7e792ac bound to our chassis#033[00m
Jan 31 03:01:43 np0005603622 systemd-machined[190406]: New machine qemu-21-instance-00000033.
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.155 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d15ebcc3-fa46-4e00-8ac9-a188c7e792ac#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.156 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:43Z|00148|binding|INFO|Setting lport c44951c8-f6ae-4440-8a09-2e528fc0c506 ovn-installed in OVS
Jan 31 03:01:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:43Z|00149|binding|INFO|Setting lport c44951c8-f6ae-4440-8a09-2e528fc0c506 up in Southbound
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.161 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 systemd[1]: Started Virtual Machine qemu-21-instance-00000033.
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.170 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[65fee758-d75f-40d5-938f-539aabb9e9fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.173 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd15ebcc3-f1 in ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.176 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd15ebcc3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.176 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dedd22c0-bd1e-43d4-b1a8-d3c1dbe032ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.178 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[987c4090-7e9b-4e5d-b457-a55d264eb010]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 systemd-udevd[239759]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:01:43 np0005603622 NetworkManager[49080]: <info>  [1769846503.1993] device (tapc44951c8-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:01:43 np0005603622 NetworkManager[49080]: <info>  [1769846503.1998] device (tapc44951c8-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.203 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[070e15b8-f907-482e-8343-aa09e3b50316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.227 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[37b92062-b911-4409-ab19-b49e689914f9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.257 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6b4fa8-3043-45ce-84d4-aefcfbf36a3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 podman[239758]: 2026-01-31 08:01:43.263353413 +0000 UTC m=+0.077628863 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:01:43 np0005603622 NetworkManager[49080]: <info>  [1769846503.2666] manager: (tapd15ebcc3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Jan 31 03:01:43 np0005603622 systemd-udevd[239773]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.266 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[060a72fb-19b7-427b-8bc4-063f28f9c979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.290 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7f57e812-07b6-478e-b3ab-bb7bcfe2452b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.293 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a647a27a-83c4-4de7-83e4-0ec828498b6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 NetworkManager[49080]: <info>  [1769846503.3119] device (tapd15ebcc3-f0): carrier: link connected
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.317 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7adb44c2-0c44-408c-b158-bb5fd1344240]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.331 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4e5eff1f-47b6-4efc-b993-38dc94ebccde]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd15ebcc3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:1e:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575465, 'reachable_time': 39684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239806, 'error': None, 'target': 'ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.343 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f188f76f-bb14-4910-bac5-dc6d7b6e4564]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:1e04'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575465, 'tstamp': 575465}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239807, 'error': None, 'target': 'ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.359 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[49f11073-c20e-41ba-bf9a-c29189a27d82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd15ebcc3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:1e:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575465, 'reachable_time': 39684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239808, 'error': None, 'target': 'ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.388 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2030e101-8216-4816-8a7b-37bac2e7c4aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.444 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[df84f7d4-1a98-4528-8afb-f1f31c889dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.446 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd15ebcc3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.447 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.448 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd15ebcc3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:43 np0005603622 kernel: tapd15ebcc3-f0: entered promiscuous mode
Jan 31 03:01:43 np0005603622 NetworkManager[49080]: <info>  [1769846503.4524] manager: (tapd15ebcc3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.455 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd15ebcc3-f0, col_values=(('external_ids', {'iface-id': '3915df68-1b49-4f6d-99ac-5f15b9b4102a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.456 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:43Z|00150|binding|INFO|Releasing lport 3915df68-1b49-4f6d-99ac-5f15b9b4102a from this chassis (sb_readonly=0)
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.465 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.466 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d15ebcc3-fa46-4e00-8ac9-a188c7e792ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d15ebcc3-fa46-4e00-8ac9-a188c7e792ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.467 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[86d9fcac-f091-4f45-9474-cae307a2d62d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.469 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/d15ebcc3-fa46-4e00-8ac9-a188c7e792ac.pid.haproxy
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID d15ebcc3-fa46-4e00-8ac9-a188c7e792ac
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:01:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:01:43.470 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'env', 'PROCESS_TAG=haproxy-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d15ebcc3-fa46-4e00-8ac9-a188c7e792ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.579 221324 DEBUG nova.network.neutron [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Updated VIF entry in instance network info cache for port c44951c8-f6ae-4440-8a09-2e528fc0c506. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.580 221324 DEBUG nova.network.neutron [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Updating instance_info_cache with network_info: [{"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.687 221324 DEBUG oslo_concurrency.lockutils [req-6355ac46-c4b5-431d-b3b5-655926315a38 req-d59b0f29-adb4-41ad-b535-43159e1790f3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.786 221324 DEBUG nova.compute.manager [req-4113a085-26e7-44a7-87ad-f412b0277824 req-3a4c59d0-f2bf-45ad-a3fa-b3fe9a050da0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.787 221324 DEBUG oslo_concurrency.lockutils [req-4113a085-26e7-44a7-87ad-f412b0277824 req-3a4c59d0-f2bf-45ad-a3fa-b3fe9a050da0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.787 221324 DEBUG oslo_concurrency.lockutils [req-4113a085-26e7-44a7-87ad-f412b0277824 req-3a4c59d0-f2bf-45ad-a3fa-b3fe9a050da0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.787 221324 DEBUG oslo_concurrency.lockutils [req-4113a085-26e7-44a7-87ad-f412b0277824 req-3a4c59d0-f2bf-45ad-a3fa-b3fe9a050da0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:43 np0005603622 nova_compute[221301]: 2026-01-31 08:01:43.788 221324 DEBUG nova.compute.manager [req-4113a085-26e7-44a7-87ad-f412b0277824 req-3a4c59d0-f2bf-45ad-a3fa-b3fe9a050da0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Processing event network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:01:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:01:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:43.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:01:43 np0005603622 podman[239859]: 2026-01-31 08:01:43.901024084 +0000 UTC m=+0.086047018 container create c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:01:43 np0005603622 podman[239859]: 2026-01-31 08:01:43.857109987 +0000 UTC m=+0.042132981 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:01:43 np0005603622 systemd[1]: Started libpod-conmon-c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0.scope.
Jan 31 03:01:43 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:01:43 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb993afaba96aa2c2b79173d3dac792e15d3b991268f42597a28e2f44231ec9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:01:44 np0005603622 podman[239859]: 2026-01-31 08:01:44.011544278 +0000 UTC m=+0.196567262 container init c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:01:44 np0005603622 podman[239859]: 2026-01-31 08:01:44.020229691 +0000 UTC m=+0.205252615 container start c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.050 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846504.048741, f985f921-eed3-43c3-85e4-d15c03ebf004 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.051 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] VM Started (Lifecycle Event)#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.054 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.059 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.064 221324 INFO nova.virt.libvirt.driver [-] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Instance spawned successfully.#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.065 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:01:44 np0005603622 neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac[239898]: [NOTICE]   (239903) : New worker (239905) forked
Jan 31 03:01:44 np0005603622 neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac[239898]: [NOTICE]   (239903) : Loading success.
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.138 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.144 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.219 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.220 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.221 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.222 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.223 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.224 221324 DEBUG nova.virt.libvirt.driver [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.288 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.289 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846504.0492127, f985f921-eed3-43c3-85e4-d15c03ebf004 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.289 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.343 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.349 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846504.067756, f985f921-eed3-43c3-85e4-d15c03ebf004 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.350 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.376 221324 INFO nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Took 15.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.377 221324 DEBUG nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.435 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.439 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.623 221324 INFO nova.compute.manager [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Took 17.31 seconds to build instance.#033[00m
Jan 31 03:01:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:44.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:44 np0005603622 nova_compute[221301]: 2026-01-31 08:01:44.846 221324 DEBUG oslo_concurrency.lockutils [None req-f0a8c87f-8751-4cc4-8808-480ce6c095af c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:45.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.000 221324 DEBUG nova.compute.manager [req-d27a6b33-1239-4567-bba6-aef61dd69100 req-b73a7346-069f-40a5-aa6b-0a5673c5f228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.001 221324 DEBUG oslo_concurrency.lockutils [req-d27a6b33-1239-4567-bba6-aef61dd69100 req-b73a7346-069f-40a5-aa6b-0a5673c5f228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.002 221324 DEBUG oslo_concurrency.lockutils [req-d27a6b33-1239-4567-bba6-aef61dd69100 req-b73a7346-069f-40a5-aa6b-0a5673c5f228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.002 221324 DEBUG oslo_concurrency.lockutils [req-d27a6b33-1239-4567-bba6-aef61dd69100 req-b73a7346-069f-40a5-aa6b-0a5673c5f228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.002 221324 DEBUG nova.compute.manager [req-d27a6b33-1239-4567-bba6-aef61dd69100 req-b73a7346-069f-40a5-aa6b-0a5673c5f228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] No waiting events found dispatching network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.003 221324 WARNING nova.compute.manager [req-d27a6b33-1239-4567-bba6-aef61dd69100 req-b73a7346-069f-40a5-aa6b-0a5673c5f228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received unexpected event network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:01:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e182 e182: 3 total, 3 up, 3 in
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.558 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:46.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.938 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:46 np0005603622 NetworkManager[49080]: <info>  [1769846506.9472] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 31 03:01:46 np0005603622 NetworkManager[49080]: <info>  [1769846506.9488] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 31 03:01:46 np0005603622 nova_compute[221301]: 2026-01-31 08:01:46.978 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:46Z|00151|binding|INFO|Releasing lport 3915df68-1b49-4f6d-99ac-5f15b9b4102a from this chassis (sb_readonly=0)
Jan 31 03:01:47 np0005603622 nova_compute[221301]: 2026-01-31 08:01:47.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e183 e183: 3 total, 3 up, 3 in
Jan 31 03:01:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:47.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:48 np0005603622 nova_compute[221301]: 2026-01-31 08:01:48.004 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:01:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e184 e184: 3 total, 3 up, 3 in
Jan 31 03:01:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:48.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:49.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:50.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:51 np0005603622 nova_compute[221301]: 2026-01-31 08:01:51.560 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:01:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:01:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:51.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:01:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:52.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:53 np0005603622 nova_compute[221301]: 2026-01-31 08:01:53.007 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:01:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:01:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:53.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:01:53 np0005603622 nova_compute[221301]: 2026-01-31 08:01:53.967 221324 DEBUG nova.compute.manager [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-changed-c44951c8-f6ae-4440-8a09-2e528fc0c506 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:01:53 np0005603622 nova_compute[221301]: 2026-01-31 08:01:53.967 221324 DEBUG nova.compute.manager [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Refreshing instance network info cache due to event network-changed-c44951c8-f6ae-4440-8a09-2e528fc0c506. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:01:53 np0005603622 nova_compute[221301]: 2026-01-31 08:01:53.967 221324 DEBUG oslo_concurrency.lockutils [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:01:53 np0005603622 nova_compute[221301]: 2026-01-31 08:01:53.967 221324 DEBUG oslo_concurrency.lockutils [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:01:53 np0005603622 nova_compute[221301]: 2026-01-31 08:01:53.968 221324 DEBUG nova.network.neutron [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Refreshing network info cache for port c44951c8-f6ae-4440-8a09-2e528fc0c506 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:01:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:54.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:01:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:01:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:55.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:01:56 np0005603622 nova_compute[221301]: 2026-01-31 08:01:56.563 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:01:56 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:56Z|00152|binding|INFO|Releasing lport 3915df68-1b49-4f6d-99ac-5f15b9b4102a from this chassis (sb_readonly=0)
Jan 31 03:01:56 np0005603622 nova_compute[221301]: 2026-01-31 08:01:56.633 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:01:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:56.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:01:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e185 e185: 3 total, 3 up, 3 in
Jan 31 03:01:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:01:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:57.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:01:58 np0005603622 nova_compute[221301]: 2026-01-31 08:01:58.009 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:01:58 np0005603622 nova_compute[221301]: 2026-01-31 08:01:58.438 221324 DEBUG nova.network.neutron [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Updated VIF entry in instance network info cache for port c44951c8-f6ae-4440-8a09-2e528fc0c506. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:01:58 np0005603622 nova_compute[221301]: 2026-01-31 08:01:58.439 221324 DEBUG nova.network.neutron [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Updating instance_info_cache with network_info: [{"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:01:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e186 e186: 3 total, 3 up, 3 in
Jan 31 03:01:58 np0005603622 nova_compute[221301]: 2026-01-31 08:01:58.780 221324 DEBUG oslo_concurrency.lockutils [req-070de036-8414-4772-8104-03f0474b52ff req-1ca1ac53-c3f8-4631-b0b3-b1a0be54f68a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f985f921-eed3-43c3-85e4-d15c03ebf004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:01:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:01:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:01:58.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:01:59 np0005603622 nova_compute[221301]: 2026-01-31 08:01:59.255 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:01:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:59Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:28:10 10.100.0.8
Jan 31 03:01:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:01:59Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:28:10 10.100.0.8
Jan 31 03:01:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:01:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:01:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:01:59.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:00.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:01 np0005603622 nova_compute[221301]: 2026-01-31 08:02:01.565 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:02:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:02:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:01.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:02:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:02:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 19K writes, 76K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s
Cumulative WAL: 19K writes, 6005 syncs, 3.18 writes per sync, written: 0.07 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 7493 writes, 27K keys, 7493 commit groups, 1.0 writes per commit group, ingest: 28.47 MB, 0.05 MB/s
Interval WAL: 7494 writes, 2938 syncs, 2.55 writes per sync, written: 0.03 GB, 0.05 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.520794) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522521034, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2788, "num_deletes": 516, "total_data_size": 5779002, "memory_usage": 5848208, "flush_reason": "Manual Compaction"}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522554878, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3794221, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30986, "largest_seqno": 33769, "table_properties": {"data_size": 3783169, "index_size": 6714, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 26079, "raw_average_key_size": 20, "raw_value_size": 3758898, "raw_average_value_size": 2887, "num_data_blocks": 289, "num_entries": 1302, "num_filter_entries": 1302, "num_deletions": 516, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846327, "oldest_key_time": 1769846327, "file_creation_time": 1769846522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 34130 microseconds, and 7166 cpu microseconds.
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.554937) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3794221 bytes OK
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.554964) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.559175) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.559201) EVENT_LOG_v1 {"time_micros": 1769846522559194, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.559224) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5765774, prev total WAL file size 5765774, number of live WAL files 2.
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.560561) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3705KB)], [60(9014KB)]
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522560669, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 13024642, "oldest_snapshot_seqno": -1}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5754 keys, 10745728 bytes, temperature: kUnknown
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522674886, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 10745728, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10705101, "index_size": 25157, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14405, "raw_key_size": 147915, "raw_average_key_size": 25, "raw_value_size": 10599649, "raw_average_value_size": 1842, "num_data_blocks": 1012, "num_entries": 5754, "num_filter_entries": 5754, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.675227) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 10745728 bytes
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.678115) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.9 rd, 94.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 8.8 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(6.3) write-amplify(2.8) OK, records in: 6803, records dropped: 1049 output_compression: NoCompression
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.678141) EVENT_LOG_v1 {"time_micros": 1769846522678128, "job": 36, "event": "compaction_finished", "compaction_time_micros": 114353, "compaction_time_cpu_micros": 47489, "output_level": 6, "num_output_files": 1, "total_output_size": 10745728, "num_input_records": 6803, "num_output_records": 5754, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522678940, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846522679950, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.560375) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.679981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.679986) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.679989) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.679991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:02:02.679993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:02:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:02.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:03 np0005603622 nova_compute[221301]: 2026-01-31 08:02:03.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:02:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:03.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:04.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:05.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:06 np0005603622 nova_compute[221301]: 2026-01-31 08:02:06.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:02:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:06.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e187 e187: 3 total, 3 up, 3 in
Jan 31 03:02:07 np0005603622 nova_compute[221301]: 2026-01-31 08:02:07.573 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:02:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:07.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:08 np0005603622 nova_compute[221301]: 2026-01-31 08:02:08.015 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:02:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:08.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:09 np0005603622 podman[239916]: 2026-01-31 08:02:09.170500878 +0000 UTC m=+0.092636515 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 03:02:09 np0005603622 nova_compute[221301]: 2026-01-31 08:02:09.504 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:02:09 np0005603622 nova_compute[221301]: 2026-01-31 08:02:09.505 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:02:09 np0005603622 nova_compute[221301]: 2026-01-31 08:02:09.506 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:02:09 np0005603622 nova_compute[221301]: 2026-01-31 08:02:09.506 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:02:09 np0005603622 nova_compute[221301]: 2026-01-31 08:02:09.506 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:02:09 np0005603622 nova_compute[221301]: 2026-01-31 08:02:09.509 221324 INFO nova.compute.manager [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Terminating instance
Jan 31 03:02:09 np0005603622 nova_compute[221301]: 2026-01-31 08:02:09.511 221324 DEBUG nova.compute.manager [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 03:02:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:09.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:10 np0005603622 kernel: tapc44951c8-f6 (unregistering): left promiscuous mode
Jan 31 03:02:10 np0005603622 NetworkManager[49080]: <info>  [1769846530.0474] device (tapc44951c8-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:02:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:02:10Z|00153|binding|INFO|Releasing lport c44951c8-f6ae-4440-8a09-2e528fc0c506 from this chassis (sb_readonly=0)
Jan 31 03:02:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:02:10Z|00154|binding|INFO|Setting lport c44951c8-f6ae-4440-8a09-2e528fc0c506 down in Southbound
Jan 31 03:02:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:02:10Z|00155|binding|INFO|Removing iface tapc44951c8-f6 ovn-installed in OVS
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.057 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.071 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:28:10 10.100.0.8'], port_security=['fa:16:3e:03:28:10 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f985f921-eed3-43c3-85e4-d15c03ebf004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e7009d6ac414afab6cdf704bd382133', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15fe8336-4f1b-400f-a78c-e4c5f81bdd88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac1e6de-12a2-48ba-8b5e-ff38c9d2de74, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=c44951c8-f6ae-4440-8a09-2e528fc0c506) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.075 139793 INFO neutron.agent.ovn.metadata.agent [-] Port c44951c8-f6ae-4440-8a09-2e528fc0c506 in datapath d15ebcc3-fa46-4e00-8ac9-a188c7e792ac unbound from our chassis#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.077 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d15ebcc3-fa46-4e00-8ac9-a188c7e792ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.079 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[70185cfe-7705-4e75-8342-4aee92cdfa42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.081 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac namespace which is not needed anymore#033[00m
Jan 31 03:02:10 np0005603622 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000033.scope: Deactivated successfully.
Jan 31 03:02:10 np0005603622 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000033.scope: Consumed 14.438s CPU time.
Jan 31 03:02:10 np0005603622 systemd-machined[190406]: Machine qemu-21-instance-00000033 terminated.
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.129 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.134 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.145 221324 INFO nova.virt.libvirt.driver [-] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Instance destroyed successfully.#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.145 221324 DEBUG nova.objects.instance [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lazy-loading 'resources' on Instance uuid f985f921-eed3-43c3-85e4-d15c03ebf004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.187 221324 DEBUG nova.virt.libvirt.vif [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:01:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=51,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/uYRaqYHa3qmKDs8j510SjH/x09/LJdT5DB1Lyp5jyC8Bx9QGRkfREVLYyTBjRyp3qZ0eWyDgxo/U7fGt44lzHTc5UaKeSpk2hZqOZ347xDdbXcrkvJa9lpPhooHBhSg==',key_name='tempest-keypair-1034345940',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:01:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e7009d6ac414afab6cdf704bd382133',ramdisk_id='',reservation_id='r-1eago0iq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-2117672251',owner_user_name='tempest-ServersTestFqdnHostnames-2117672251-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:01:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c03bfebfc7e34f83b204997b98c8e690',uuid=f985f921-eed3-43c3-85e4-d15c03ebf004,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.188 221324 DEBUG nova.network.os_vif_util [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Converting VIF {"id": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "address": "fa:16:3e:03:28:10", "network": {"id": "d15ebcc3-fa46-4e00-8ac9-a188c7e792ac", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-141290441-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e7009d6ac414afab6cdf704bd382133", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc44951c8-f6", "ovs_interfaceid": "c44951c8-f6ae-4440-8a09-2e528fc0c506", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.190 221324 DEBUG nova.network.os_vif_util [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:28:10,bridge_name='br-int',has_traffic_filtering=True,id=c44951c8-f6ae-4440-8a09-2e528fc0c506,network=Network(d15ebcc3-fa46-4e00-8ac9-a188c7e792ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc44951c8-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.191 221324 DEBUG os_vif [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:28:10,bridge_name='br-int',has_traffic_filtering=True,id=c44951c8-f6ae-4440-8a09-2e528fc0c506,network=Network(d15ebcc3-fa46-4e00-8ac9-a188c7e792ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc44951c8-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.199 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.200 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc44951c8-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.203 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.206 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.212 221324 INFO os_vif [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:28:10,bridge_name='br-int',has_traffic_filtering=True,id=c44951c8-f6ae-4440-8a09-2e528fc0c506,network=Network(d15ebcc3-fa46-4e00-8ac9-a188c7e792ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc44951c8-f6')#033[00m
Jan 31 03:02:10 np0005603622 neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac[239898]: [NOTICE]   (239903) : haproxy version is 2.8.14-c23fe91
Jan 31 03:02:10 np0005603622 neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac[239898]: [NOTICE]   (239903) : path to executable is /usr/sbin/haproxy
Jan 31 03:02:10 np0005603622 neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac[239898]: [WARNING]  (239903) : Exiting Master process...
Jan 31 03:02:10 np0005603622 neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac[239898]: [ALERT]    (239903) : Current worker (239905) exited with code 143 (Terminated)
Jan 31 03:02:10 np0005603622 neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac[239898]: [WARNING]  (239903) : All workers exited. Exiting... (0)
Jan 31 03:02:10 np0005603622 systemd[1]: libpod-c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0.scope: Deactivated successfully.
Jan 31 03:02:10 np0005603622 podman[239976]: 2026-01-31 08:02:10.269829501 +0000 UTC m=+0.067152342 container died c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Jan 31 03:02:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0-userdata-shm.mount: Deactivated successfully.
Jan 31 03:02:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay-fb993afaba96aa2c2b79173d3dac792e15d3b991268f42597a28e2f44231ec9a-merged.mount: Deactivated successfully.
Jan 31 03:02:10 np0005603622 podman[239976]: 2026-01-31 08:02:10.346588169 +0000 UTC m=+0.143911000 container cleanup c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:02:10 np0005603622 systemd[1]: libpod-conmon-c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0.scope: Deactivated successfully.
Jan 31 03:02:10 np0005603622 podman[240026]: 2026-01-31 08:02:10.438997278 +0000 UTC m=+0.066525515 container remove c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.444 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a4208315-4330-4324-8c32-1e77f0800be9]: (4, ('Sat Jan 31 08:02:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac (c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0)\nc18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0\nSat Jan 31 08:02:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac (c18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0)\nc18f595baab04277db07e7c93897970391d27d539d7be6bd06f84a22209506b0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.446 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aec21ccd-87c2-41b7-a7b9-dcd62be8cc7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.448 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd15ebcc3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.485 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 kernel: tapd15ebcc3-f0: left promiscuous mode
Jan 31 03:02:10 np0005603622 nova_compute[221301]: 2026-01-31 08:02:10.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.498 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ddd62abc-ad9f-49f0-8e1d-38c4fd06aba4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.514 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b6f4c4-58ed-4967-8601-0389a24f713f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.515 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8755f003-d471-42c6-a774-602f3c00f461]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.528 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a20218c8-c399-47b0-b7bc-6548f3d66378]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575459, 'reachable_time': 42073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240041, 'error': None, 'target': 'ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 systemd[1]: run-netns-ovnmeta\x2dd15ebcc3\x2dfa46\x2d4e00\x2d8ac9\x2da188c7e792ac.mount: Deactivated successfully.
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.531 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d15ebcc3-fa46-4e00-8ac9-a188c7e792ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:02:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:10.532 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[1e843b92-c414-4d5c-b582-789a1565dfd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:02:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:10.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:11 np0005603622 nova_compute[221301]: 2026-01-31 08:02:11.273 221324 DEBUG nova.compute.manager [req-529ea25e-7207-4c09-bd84-47e213b33945 req-932c4811-47e2-41f6-8a37-80c86bd68e19 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-vif-unplugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:02:11 np0005603622 nova_compute[221301]: 2026-01-31 08:02:11.274 221324 DEBUG oslo_concurrency.lockutils [req-529ea25e-7207-4c09-bd84-47e213b33945 req-932c4811-47e2-41f6-8a37-80c86bd68e19 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:11 np0005603622 nova_compute[221301]: 2026-01-31 08:02:11.274 221324 DEBUG oslo_concurrency.lockutils [req-529ea25e-7207-4c09-bd84-47e213b33945 req-932c4811-47e2-41f6-8a37-80c86bd68e19 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:11 np0005603622 nova_compute[221301]: 2026-01-31 08:02:11.274 221324 DEBUG oslo_concurrency.lockutils [req-529ea25e-7207-4c09-bd84-47e213b33945 req-932c4811-47e2-41f6-8a37-80c86bd68e19 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:11 np0005603622 nova_compute[221301]: 2026-01-31 08:02:11.274 221324 DEBUG nova.compute.manager [req-529ea25e-7207-4c09-bd84-47e213b33945 req-932c4811-47e2-41f6-8a37-80c86bd68e19 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] No waiting events found dispatching network-vif-unplugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:02:11 np0005603622 nova_compute[221301]: 2026-01-31 08:02:11.274 221324 DEBUG nova.compute.manager [req-529ea25e-7207-4c09-bd84-47e213b33945 req-932c4811-47e2-41f6-8a37-80c86bd68e19 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-vif-unplugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:02:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:11.676 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:02:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:11.679 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:02:11 np0005603622 nova_compute[221301]: 2026-01-31 08:02:11.690 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:11.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:12.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:13 np0005603622 nova_compute[221301]: 2026-01-31 08:02:13.047 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:13.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:13.999 221324 INFO nova.virt.libvirt.driver [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Deleting instance files /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004_del#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.000 221324 INFO nova.virt.libvirt.driver [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Deletion of /var/lib/nova/instances/f985f921-eed3-43c3-85e4-d15c03ebf004_del complete#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.078 221324 INFO nova.compute.manager [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Took 4.57 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.079 221324 DEBUG oslo.service.loopingcall [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.079 221324 DEBUG nova.compute.manager [-] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.080 221324 DEBUG nova.network.neutron [-] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:02:14 np0005603622 podman[240043]: 2026-01-31 08:02:14.153427393 +0000 UTC m=+0.066346451 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.429 221324 DEBUG nova.compute.manager [req-22e49c82-8d07-4c9c-9420-cf64b50f6957 req-4ab4b839-62c5-43bd-84f9-1c40490a6f47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.429 221324 DEBUG oslo_concurrency.lockutils [req-22e49c82-8d07-4c9c-9420-cf64b50f6957 req-4ab4b839-62c5-43bd-84f9-1c40490a6f47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.430 221324 DEBUG oslo_concurrency.lockutils [req-22e49c82-8d07-4c9c-9420-cf64b50f6957 req-4ab4b839-62c5-43bd-84f9-1c40490a6f47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.430 221324 DEBUG oslo_concurrency.lockutils [req-22e49c82-8d07-4c9c-9420-cf64b50f6957 req-4ab4b839-62c5-43bd-84f9-1c40490a6f47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.430 221324 DEBUG nova.compute.manager [req-22e49c82-8d07-4c9c-9420-cf64b50f6957 req-4ab4b839-62c5-43bd-84f9-1c40490a6f47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] No waiting events found dispatching network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:02:14 np0005603622 nova_compute[221301]: 2026-01-31 08:02:14.431 221324 WARNING nova.compute.manager [req-22e49c82-8d07-4c9c-9420-cf64b50f6957 req-4ab4b839-62c5-43bd-84f9-1c40490a6f47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received unexpected event network-vif-plugged-c44951c8-f6ae-4440-8a09-2e528fc0c506 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:02:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:14.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:15 np0005603622 nova_compute[221301]: 2026-01-31 08:02:15.243 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:15 np0005603622 nova_compute[221301]: 2026-01-31 08:02:15.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:15 np0005603622 nova_compute[221301]: 2026-01-31 08:02:15.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:02:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:15.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:15 np0005603622 nova_compute[221301]: 2026-01-31 08:02:15.889 221324 DEBUG nova.network.neutron [-] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:02:15 np0005603622 nova_compute[221301]: 2026-01-31 08:02:15.920 221324 INFO nova.compute.manager [-] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Took 1.84 seconds to deallocate network for instance.#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.033 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.034 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.128 221324 DEBUG oslo_concurrency.processutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.576 221324 DEBUG nova.compute.manager [req-e09d60e4-3246-4623-8548-d3d929127a8e req-95481ea1-7d54-4764-9ec8-48a794c5d611 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Received event network-vif-deleted-c44951c8-f6ae-4440-8a09-2e528fc0c506 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:02:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:02:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1654010770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.604 221324 DEBUG oslo_concurrency.processutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.612 221324 DEBUG nova.compute.provider_tree [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.640 221324 DEBUG nova.scheduler.client.report [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.694 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.723 221324 INFO nova.scheduler.client.report [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Deleted allocations for instance f985f921-eed3-43c3-85e4-d15c03ebf004#033[00m
Jan 31 03:02:16 np0005603622 nova_compute[221301]: 2026-01-31 08:02:16.801 221324 DEBUG oslo_concurrency.lockutils [None req-0e59d49d-8eba-4583-ab02-f23f8c0546d7 c03bfebfc7e34f83b204997b98c8e690 3e7009d6ac414afab6cdf704bd382133 - - default default] Lock "f985f921-eed3-43c3-85e4-d15c03ebf004" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:16.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:17.682 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:02:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:17.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:02:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:02:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:02:18 np0005603622 nova_compute[221301]: 2026-01-31 08:02:18.050 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:18.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:19.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:20 np0005603622 nova_compute[221301]: 2026-01-31 08:02:20.248 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:20.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:21 np0005603622 nova_compute[221301]: 2026-01-31 08:02:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:21.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:22 np0005603622 nova_compute[221301]: 2026-01-31 08:02:22.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:22 np0005603622 nova_compute[221301]: 2026-01-31 08:02:22.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:22 np0005603622 nova_compute[221301]: 2026-01-31 08:02:22.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:22.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:23 np0005603622 nova_compute[221301]: 2026-01-31 08:02:23.086 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:23 np0005603622 nova_compute[221301]: 2026-01-31 08:02:23.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:23.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:24 np0005603622 nova_compute[221301]: 2026-01-31 08:02:24.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:24 np0005603622 nova_compute[221301]: 2026-01-31 08:02:24.431 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:24 np0005603622 nova_compute[221301]: 2026-01-31 08:02:24.431 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:24 np0005603622 nova_compute[221301]: 2026-01-31 08:02:24.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:24 np0005603622 nova_compute[221301]: 2026-01-31 08:02:24.432 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:02:24 np0005603622 nova_compute[221301]: 2026-01-31 08:02:24.432 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:24.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:02:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/812959832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:02:24 np0005603622 nova_compute[221301]: 2026-01-31 08:02:24.902 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.074 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.076 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4774MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.077 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.077 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.143 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846530.1420002, f985f921-eed3-43c3-85e4-d15c03ebf004 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.144 221324 INFO nova.compute.manager [-] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.173 221324 DEBUG nova.compute.manager [None req-51dd806e-6066-48bf-ab2a-8969be968bab - - - - - -] [instance: f985f921-eed3-43c3-85e4-d15c03ebf004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:02:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.247 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.247 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.265 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.295 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.413 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.488 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:02:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2734964217' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.723 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.730 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.753 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.802 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:02:25 np0005603622 nova_compute[221301]: 2026-01-31 08:02:25.803 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:25.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:02:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:02:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e188 e188: 3 total, 3 up, 3 in
Jan 31 03:02:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:26.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e189 e189: 3 total, 3 up, 3 in
Jan 31 03:02:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:02:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.0 total, 600.0 interval
Cumulative writes: 6591 writes, 34K keys, 6591 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s
Cumulative WAL: 6591 writes, 6591 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1609 writes, 7959 keys, 1609 commit groups, 1.0 writes per commit group, ingest: 16.31 MB, 0.03 MB/s
Interval WAL: 1609 writes, 1609 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     48.6      0.86              0.11        18    0.048       0      0       0.0       0.0
  L6      1/0   10.25 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.6     81.2     67.2      2.24              0.43        17    0.132     86K   9922       0.0       0.0
 Sum      1/0   10.25 MB   0.0      0.2     0.0      0.1       0.2      0.1       0.0   4.6     58.8     62.1      3.10              0.53        35    0.089     86K   9922       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.8     87.3     89.7      0.59              0.13         8    0.074     25K   3110       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0     81.2     67.2      2.24              0.43        17    0.132     86K   9922       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     48.7      0.85              0.11        17    0.050       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 2400.0 total, 600.0 interval
Flush(GB): cumulative 0.041, interval 0.011
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.19 GB write, 0.08 MB/s write, 0.18 GB read, 0.08 MB/s read, 3.1 seconds
Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 0.6 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 20.24 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.00024 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(1154,19.56 MB,6.43325%) FilterBlock(35,248.55 KB,0.0798426%) IndexBlock(35,455.86 KB,0.146439%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 31 03:02:27 np0005603622 nova_compute[221301]: 2026-01-31 08:02:27.807 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:27 np0005603622 nova_compute[221301]: 2026-01-31 08:02:27.811 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:02:27 np0005603622 nova_compute[221301]: 2026-01-31 08:02:27.812 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:02:27 np0005603622 nova_compute[221301]: 2026-01-31 08:02:27.849 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:02:27 np0005603622 nova_compute[221301]: 2026-01-31 08:02:27.850 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:27.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:28 np0005603622 nova_compute[221301]: 2026-01-31 08:02:28.088 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e190 e190: 3 total, 3 up, 3 in
Jan 31 03:02:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:28.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:29.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:30 np0005603622 nova_compute[221301]: 2026-01-31 08:02:30.299 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:30 np0005603622 nova_compute[221301]: 2026-01-31 08:02:30.437 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:02:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:30.857 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:30.858 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:02:30.858 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:30.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:31.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:32.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:33 np0005603622 nova_compute[221301]: 2026-01-31 08:02:33.131 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:33.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:34.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:35 np0005603622 nova_compute[221301]: 2026-01-31 08:02:35.305 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:35.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e191 e191: 3 total, 3 up, 3 in
Jan 31 03:02:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:36.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e192 e192: 3 total, 3 up, 3 in
Jan 31 03:02:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:37.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:38 np0005603622 nova_compute[221301]: 2026-01-31 08:02:38.132 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:38.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:39.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:40 np0005603622 podman[240312]: 2026-01-31 08:02:40.191801987 +0000 UTC m=+0.113279149 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:02:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:40 np0005603622 nova_compute[221301]: 2026-01-31 08:02:40.309 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:40.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:41.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:42.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:43 np0005603622 nova_compute[221301]: 2026-01-31 08:02:43.134 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:43.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:44.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:45 np0005603622 podman[240338]: 2026-01-31 08:02:45.127566905 +0000 UTC m=+0.051773999 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:02:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:45 np0005603622 nova_compute[221301]: 2026-01-31 08:02:45.313 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:45.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:46.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e193 e193: 3 total, 3 up, 3 in
Jan 31 03:02:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:47.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:48 np0005603622 nova_compute[221301]: 2026-01-31 08:02:48.172 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:48.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:49.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:50 np0005603622 nova_compute[221301]: 2026-01-31 08:02:50.318 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:50 np0005603622 nova_compute[221301]: 2026-01-31 08:02:50.884 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:50 np0005603622 nova_compute[221301]: 2026-01-31 08:02:50.885 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:50 np0005603622 nova_compute[221301]: 2026-01-31 08:02:50.903 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:02:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:02:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:50.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.008 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.009 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.018 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.018 221324 INFO nova.compute.claims [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.253 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:02:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3080879142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.748 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.759 221324 DEBUG nova.compute.provider_tree [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.791 221324 DEBUG nova.scheduler.client.report [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.845 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.846 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:02:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:51.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.955 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:02:51 np0005603622 nova_compute[221301]: 2026-01-31 08:02:51.956 221324 DEBUG nova.network.neutron [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.017 221324 INFO nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.041 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.317 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.318 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.318 221324 INFO nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Creating image(s)#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.346 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.379 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.411 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.415 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.493 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.495 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.495 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.496 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.522 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.526 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:52 np0005603622 nova_compute[221301]: 2026-01-31 08:02:52.611 221324 DEBUG nova.policy [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4eb6c0e509f14fb2958ca76604db17c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '29c39093d6684600bef00c63da85e6b7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:02:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:52.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:53 np0005603622 nova_compute[221301]: 2026-01-31 08:02:53.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:53.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:54 np0005603622 nova_compute[221301]: 2026-01-31 08:02:54.738 221324 DEBUG nova.network.neutron [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Successfully created port: 9e93fdf5-6477-4b09-91c4-656c1d85b246 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:02:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:02:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:54.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:02:55 np0005603622 nova_compute[221301]: 2026-01-31 08:02:55.115 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.589s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:02:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:02:55 np0005603622 nova_compute[221301]: 2026-01-31 08:02:55.214 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] resizing rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:02:55 np0005603622 nova_compute[221301]: 2026-01-31 08:02:55.467 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:55.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.365 221324 DEBUG nova.objects.instance [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ae56bfd-f507-4765-a2ce-17d331c4f93d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.619 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.620 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Ensure instance console log exists: /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.621 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.621 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.622 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:02:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:02:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:56.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.960 221324 DEBUG nova.network.neutron [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Successfully updated port: 9e93fdf5-6477-4b09-91c4-656c1d85b246 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.993 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.993 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquired lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:02:56 np0005603622 nova_compute[221301]: 2026-01-31 08:02:56.993 221324 DEBUG nova.network.neutron [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:02:57 np0005603622 nova_compute[221301]: 2026-01-31 08:02:57.156 221324 DEBUG nova.compute.manager [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-changed-9e93fdf5-6477-4b09-91c4-656c1d85b246 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:02:57 np0005603622 nova_compute[221301]: 2026-01-31 08:02:57.156 221324 DEBUG nova.compute.manager [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Refreshing instance network info cache due to event network-changed-9e93fdf5-6477-4b09-91c4-656c1d85b246. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:02:57 np0005603622 nova_compute[221301]: 2026-01-31 08:02:57.157 221324 DEBUG oslo_concurrency.lockutils [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:02:57 np0005603622 nova_compute[221301]: 2026-01-31 08:02:57.448 221324 DEBUG nova.network.neutron [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:02:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:57.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:58 np0005603622 nova_compute[221301]: 2026-01-31 08:02:58.263 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:02:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:02:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:02:58.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:02:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e194 e194: 3 total, 3 up, 3 in
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.482 221324 DEBUG nova.network.neutron [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updating instance_info_cache with network_info: [{"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.521 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Releasing lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.521 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Instance network_info: |[{"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.522 221324 DEBUG oslo_concurrency.lockutils [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.522 221324 DEBUG nova.network.neutron [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Refreshing network info cache for port 9e93fdf5-6477-4b09-91c4-656c1d85b246 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.524 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Start _get_guest_xml network_info=[{"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.530 221324 WARNING nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.537 221324 DEBUG nova.virt.libvirt.host [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.537 221324 DEBUG nova.virt.libvirt.host [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.543 221324 DEBUG nova.virt.libvirt.host [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.545 221324 DEBUG nova.virt.libvirt.host [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.547 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.548 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.549 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.549 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.550 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.551 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.551 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.552 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.552 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.553 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.554 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.554 221324 DEBUG nova.virt.hardware [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:02:59 np0005603622 nova_compute[221301]: 2026-01-31 08:02:59.560 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:02:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:02:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:02:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:02:59.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:03:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1201131045' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:03:00 np0005603622 nova_compute[221301]: 2026-01-31 08:03:00.046 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:00 np0005603622 nova_compute[221301]: 2026-01-31 08:03:00.074 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:00 np0005603622 nova_compute[221301]: 2026-01-31 08:03:00.079 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:00 np0005603622 nova_compute[221301]: 2026-01-31 08:03:00.521 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:00.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:03:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2026034858' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.081 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.002s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.085 221324 DEBUG nova.virt.libvirt.vif [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-468493135',display_name='tempest-ServersTestJSON-server-468493135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-468493135',id=55,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF2XpWAlfsRwpJSOAVFK4YkZRqP2ohouZY8DQgWjLRw8h6ouHzysUnYolJQ1sKy2stcFyqywTXIV0uIGeGj4a5boQRzjMGwEv2bE3Qf/2+L+o6rHW9A8mrJeZiOPli4ICw==',key_name='tempest-keypair-1574727576',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='29c39093d6684600bef00c63da85e6b7',ramdisk_id='',reservation_id='r-fgutxw9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-254633607',owner_user_name='tempest-ServersTestJSON-254633607-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:02:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4eb6c0e509f14fb2958ca76604db17c0',uuid=5ae56bfd-f507-4765-a2ce-17d331c4f93d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.087 221324 DEBUG nova.network.os_vif_util [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Converting VIF {"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.089 221324 DEBUG nova.network.os_vif_util [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:0b:0d,bridge_name='br-int',has_traffic_filtering=True,id=9e93fdf5-6477-4b09-91c4-656c1d85b246,network=Network(f6748c1c-ecca-4a85-ad94-00568ceeba0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e93fdf5-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.091 221324 DEBUG nova.objects.instance [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ae56bfd-f507-4765-a2ce-17d331c4f93d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.114 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <uuid>5ae56bfd-f507-4765-a2ce-17d331c4f93d</uuid>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <name>instance-00000037</name>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServersTestJSON-server-468493135</nova:name>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:02:59</nova:creationTime>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:user uuid="4eb6c0e509f14fb2958ca76604db17c0">tempest-ServersTestJSON-254633607-project-member</nova:user>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:project uuid="29c39093d6684600bef00c63da85e6b7">tempest-ServersTestJSON-254633607</nova:project>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <nova:port uuid="9e93fdf5-6477-4b09-91c4-656c1d85b246">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <entry name="serial">5ae56bfd-f507-4765-a2ce-17d331c4f93d</entry>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <entry name="uuid">5ae56bfd-f507-4765-a2ce-17d331c4f93d</entry>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk.config">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:89:0b:0d"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <target dev="tap9e93fdf5-64"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/console.log" append="off"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:03:01 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:03:01 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:03:01 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:03:01 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.115 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Preparing to wait for external event network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.115 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.115 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.116 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.116 221324 DEBUG nova.virt.libvirt.vif [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-468493135',display_name='tempest-ServersTestJSON-server-468493135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-468493135',id=55,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF2XpWAlfsRwpJSOAVFK4YkZRqP2ohouZY8DQgWjLRw8h6ouHzysUnYolJQ1sKy2stcFyqywTXIV0uIGeGj4a5boQRzjMGwEv2bE3Qf/2+L+o6rHW9A8mrJeZiOPli4ICw==',key_name='tempest-keypair-1574727576',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='29c39093d6684600bef00c63da85e6b7',ramdisk_id='',reservation_id='r-fgutxw9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-254633607',owner_user_name='tempest-ServersTestJSON-254633607-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:02:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4eb6c0e509f14fb2958ca76604db17c0',uuid=5ae56bfd-f507-4765-a2ce-17d331c4f93d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.117 221324 DEBUG nova.network.os_vif_util [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Converting VIF {"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.117 221324 DEBUG nova.network.os_vif_util [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:0b:0d,bridge_name='br-int',has_traffic_filtering=True,id=9e93fdf5-6477-4b09-91c4-656c1d85b246,network=Network(f6748c1c-ecca-4a85-ad94-00568ceeba0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e93fdf5-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.118 221324 DEBUG os_vif [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:0b:0d,bridge_name='br-int',has_traffic_filtering=True,id=9e93fdf5-6477-4b09-91c4-656c1d85b246,network=Network(f6748c1c-ecca-4a85-ad94-00568ceeba0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e93fdf5-64') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.119 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.120 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.120 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.133 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.133 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e93fdf5-64, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.134 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9e93fdf5-64, col_values=(('external_ids', {'iface-id': '9e93fdf5-6477-4b09-91c4-656c1d85b246', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:0b:0d', 'vm-uuid': '5ae56bfd-f507-4765-a2ce-17d331c4f93d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.137 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.140 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:03:01 np0005603622 NetworkManager[49080]: <info>  [1769846581.1409] manager: (tap9e93fdf5-64): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.146 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.148 221324 INFO os_vif [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:0b:0d,bridge_name='br-int',has_traffic_filtering=True,id=9e93fdf5-6477-4b09-91c4-656c1d85b246,network=Network(f6748c1c-ecca-4a85-ad94-00568ceeba0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e93fdf5-64')#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.221 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.221 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.221 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] No VIF found with MAC fa:16:3e:89:0b:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.222 221324 INFO nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Using config drive#033[00m
Jan 31 03:03:01 np0005603622 nova_compute[221301]: 2026-01-31 08:03:01.247 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:01.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.117 221324 INFO nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Creating config drive at /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/disk.config#033[00m
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.124 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpi7gcvxh6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.252 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpi7gcvxh6" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.291 221324 DEBUG nova.storage.rbd_utils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] rbd image 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.295 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/disk.config 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.657 221324 DEBUG nova.network.neutron [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updated VIF entry in instance network info cache for port 9e93fdf5-6477-4b09-91c4-656c1d85b246. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.658 221324 DEBUG nova.network.neutron [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updating instance_info_cache with network_info: [{"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:02 np0005603622 nova_compute[221301]: 2026-01-31 08:03:02.702 221324 DEBUG oslo_concurrency.lockutils [req-6e861fb6-d318-40ae-80d6-34eb3d9c0d36 req-86e082bb-7d60-4327-a1b8-034fefce52ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:03:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:02.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:03 np0005603622 nova_compute[221301]: 2026-01-31 08:03:03.266 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:03.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:04 np0005603622 nova_compute[221301]: 2026-01-31 08:03:04.731 221324 DEBUG oslo_concurrency.processutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/disk.config 5ae56bfd-f507-4765-a2ce-17d331c4f93d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:04 np0005603622 nova_compute[221301]: 2026-01-31 08:03:04.733 221324 INFO nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Deleting local config drive /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d/disk.config because it was imported into RBD.#033[00m
Jan 31 03:03:04 np0005603622 kernel: tap9e93fdf5-64: entered promiscuous mode
Jan 31 03:03:04 np0005603622 NetworkManager[49080]: <info>  [1769846584.8060] manager: (tap9e93fdf5-64): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Jan 31 03:03:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:04Z|00156|binding|INFO|Claiming lport 9e93fdf5-6477-4b09-91c4-656c1d85b246 for this chassis.
Jan 31 03:03:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:04Z|00157|binding|INFO|9e93fdf5-6477-4b09-91c4-656c1d85b246: Claiming fa:16:3e:89:0b:0d 10.100.0.14
Jan 31 03:03:04 np0005603622 nova_compute[221301]: 2026-01-31 08:03:04.807 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:04 np0005603622 nova_compute[221301]: 2026-01-31 08:03:04.812 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:04 np0005603622 nova_compute[221301]: 2026-01-31 08:03:04.816 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:04 np0005603622 nova_compute[221301]: 2026-01-31 08:03:04.819 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.836 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:0b:0d 10.100.0.14'], port_security=['fa:16:3e:89:0b:0d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5ae56bfd-f507-4765-a2ce-17d331c4f93d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29c39093d6684600bef00c63da85e6b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd560041b-afb9-420f-903c-6f76e66a2643', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0047a36-b8b8-45b8-bab7-50d47452d2d5, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=9e93fdf5-6477-4b09-91c4-656c1d85b246) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.838 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 9e93fdf5-6477-4b09-91c4-656c1d85b246 in datapath f6748c1c-ecca-4a85-ad94-00568ceeba0f bound to our chassis#033[00m
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.839 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6748c1c-ecca-4a85-ad94-00568ceeba0f#033[00m
Jan 31 03:03:04 np0005603622 systemd-machined[190406]: New machine qemu-22-instance-00000037.
Jan 31 03:03:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:04Z|00158|binding|INFO|Setting lport 9e93fdf5-6477-4b09-91c4-656c1d85b246 ovn-installed in OVS
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.854 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[851925f5-6b9d-451b-8ced-13556265225f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:04Z|00159|binding|INFO|Setting lport 9e93fdf5-6477-4b09-91c4-656c1d85b246 up in Southbound
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.856 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6748c1c-e1 in ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:03:04 np0005603622 nova_compute[221301]: 2026-01-31 08:03:04.857 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:04 np0005603622 systemd[1]: Started Virtual Machine qemu-22-instance-00000037.
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.863 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6748c1c-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.863 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7db998d1-6c4a-45c2-b864-d3915819bde0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.864 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4bf5e4-6f3c-4a8e-bb18-dfd146fce2a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.878 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[35a9d7c5-e27c-47e1-a56e-967dc393405b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:04 np0005603622 systemd-udevd[240685]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.906 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5094ad7c-e5cd-45fa-9a9b-5107a64451e3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:04 np0005603622 NetworkManager[49080]: <info>  [1769846584.9263] device (tap9e93fdf5-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:03:04 np0005603622 NetworkManager[49080]: <info>  [1769846584.9305] device (tap9e93fdf5-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:03:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:04.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.960 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bf505a55-265b-4caf-a525-de6d11a83882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.966 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6d08e9-2d64-457c-9ae3-8c9ed72ff658]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:04 np0005603622 NetworkManager[49080]: <info>  [1769846584.9679] manager: (tapf6748c1c-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Jan 31 03:03:04 np0005603622 systemd-udevd[240692]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:04.998 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fc196bea-e73b-426e-b5ce-1040d586bf11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.002 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[71297e5e-23a3-4f82-bb95-4982a1e6193c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 NetworkManager[49080]: <info>  [1769846585.0245] device (tapf6748c1c-e0): carrier: link connected
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.027 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[416be8cb-fa41-4f8d-bb86-15614e03b095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.043 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b782fd83-785c-42fe-9af9-a58f58280348]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6748c1c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583636, 'reachable_time': 42729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240715, 'error': None, 'target': 'ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e195 e195: 3 total, 3 up, 3 in
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.065 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[03b018a4-5411-4a43-94c5-684b58193d8e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:94d8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583636, 'tstamp': 583636}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240716, 'error': None, 'target': 'ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.086 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d34655d2-cfb6-4836-a4cf-c72c8c9ed154]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6748c1c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:94:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583636, 'reachable_time': 42729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240717, 'error': None, 'target': 'ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.120 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8e39256e-afdd-48a8-b1b4-c2b5e86f3556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.178 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b88fa902-f4ce-4ab6-8664-f4375f89f248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.181 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6748c1c-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.181 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.181 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6748c1c-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:05 np0005603622 NetworkManager[49080]: <info>  [1769846585.1848] manager: (tapf6748c1c-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 31 03:03:05 np0005603622 nova_compute[221301]: 2026-01-31 08:03:05.183 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:05 np0005603622 kernel: tapf6748c1c-e0: entered promiscuous mode
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.191 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6748c1c-e0, col_values=(('external_ids', {'iface-id': '931fca9c-7a55-4ef4-9932-e142448d0ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:05 np0005603622 nova_compute[221301]: 2026-01-31 08:03:05.190 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:05Z|00160|binding|INFO|Releasing lport 931fca9c-7a55-4ef4-9932-e142448d0ff3 from this chassis (sb_readonly=0)
Jan 31 03:03:05 np0005603622 nova_compute[221301]: 2026-01-31 08:03:05.193 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:05 np0005603622 nova_compute[221301]: 2026-01-31 08:03:05.200 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.203 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6748c1c-ecca-4a85-ad94-00568ceeba0f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6748c1c-ecca-4a85-ad94-00568ceeba0f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.204 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a1821c45-63b0-4a79-ad8e-6f661988bd45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:05 np0005603622 nova_compute[221301]: 2026-01-31 08:03:05.204 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.205 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-f6748c1c-ecca-4a85-ad94-00568ceeba0f
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/f6748c1c-ecca-4a85-ad94-00568ceeba0f.pid.haproxy
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID f6748c1c-ecca-4a85-ad94-00568ceeba0f
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:03:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:05.207 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'env', 'PROCESS_TAG=haproxy-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6748c1c-ecca-4a85-ad94-00568ceeba0f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:03:05 np0005603622 podman[240767]: 2026-01-31 08:03:05.638873913 +0000 UTC m=+0.064921692 container create 9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 03:03:05 np0005603622 systemd[1]: Started libpod-conmon-9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367.scope.
Jan 31 03:03:05 np0005603622 podman[240767]: 2026-01-31 08:03:05.603360801 +0000 UTC m=+0.029408680 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:03:05 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:03:05 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea300a73df940e4737c8b192ae804d2f91e0f9e356216899691ee27cbcf0941c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:03:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:05 np0005603622 podman[240767]: 2026-01-31 08:03:05.733237794 +0000 UTC m=+0.159285653 container init 9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:03:05 np0005603622 podman[240767]: 2026-01-31 08:03:05.739746568 +0000 UTC m=+0.165794377 container start 9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:03:05 np0005603622 neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f[240782]: [NOTICE]   (240793) : New worker (240795) forked
Jan 31 03:03:05 np0005603622 neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f[240782]: [NOTICE]   (240793) : Loading success.
Jan 31 03:03:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:05.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.137 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.248 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846586.2474914, 5ae56bfd-f507-4765-a2ce-17d331c4f93d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.249 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] VM Started (Lifecycle Event)#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.278 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.284 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846586.2479599, 5ae56bfd-f507-4765-a2ce-17d331c4f93d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.284 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.307 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.312 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.339 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:03:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e196 e196: 3 total, 3 up, 3 in
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.676 221324 DEBUG nova.compute.manager [req-c90b2ff4-c99b-480e-b5a1-04e5085d1832 req-3a7847d8-10ce-4102-8ac5-42e6c55fb7b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.676 221324 DEBUG oslo_concurrency.lockutils [req-c90b2ff4-c99b-480e-b5a1-04e5085d1832 req-3a7847d8-10ce-4102-8ac5-42e6c55fb7b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.677 221324 DEBUG oslo_concurrency.lockutils [req-c90b2ff4-c99b-480e-b5a1-04e5085d1832 req-3a7847d8-10ce-4102-8ac5-42e6c55fb7b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.677 221324 DEBUG oslo_concurrency.lockutils [req-c90b2ff4-c99b-480e-b5a1-04e5085d1832 req-3a7847d8-10ce-4102-8ac5-42e6c55fb7b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.678 221324 DEBUG nova.compute.manager [req-c90b2ff4-c99b-480e-b5a1-04e5085d1832 req-3a7847d8-10ce-4102-8ac5-42e6c55fb7b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Processing event network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.678 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.683 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846586.6835048, 5ae56bfd-f507-4765-a2ce-17d331c4f93d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.684 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.687 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.692 221324 INFO nova.virt.libvirt.driver [-] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Instance spawned successfully.#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.692 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.706 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.710 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.721 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.722 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.722 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.723 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.723 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.724 221324 DEBUG nova.virt.libvirt.driver [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.770 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.809 221324 INFO nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Took 14.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.809 221324 DEBUG nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:06.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.970 221324 INFO nova.compute.manager [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Took 15.99 seconds to build instance.#033[00m
Jan 31 03:03:06 np0005603622 nova_compute[221301]: 2026-01-31 08:03:06.993 221324 DEBUG oslo_concurrency.lockutils [None req-0edaa45c-36f3-442e-87de-7d314ed9fa1e 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:07.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:08 np0005603622 nova_compute[221301]: 2026-01-31 08:03:08.268 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:08 np0005603622 nova_compute[221301]: 2026-01-31 08:03:08.846 221324 DEBUG nova.compute.manager [req-d7d69a26-e0c8-45ff-8c49-a7cd582040c3 req-1846fd79-17c8-40c0-93ea-5939ab23d0be fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:08 np0005603622 nova_compute[221301]: 2026-01-31 08:03:08.847 221324 DEBUG oslo_concurrency.lockutils [req-d7d69a26-e0c8-45ff-8c49-a7cd582040c3 req-1846fd79-17c8-40c0-93ea-5939ab23d0be fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:08 np0005603622 nova_compute[221301]: 2026-01-31 08:03:08.847 221324 DEBUG oslo_concurrency.lockutils [req-d7d69a26-e0c8-45ff-8c49-a7cd582040c3 req-1846fd79-17c8-40c0-93ea-5939ab23d0be fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:08 np0005603622 nova_compute[221301]: 2026-01-31 08:03:08.847 221324 DEBUG oslo_concurrency.lockutils [req-d7d69a26-e0c8-45ff-8c49-a7cd582040c3 req-1846fd79-17c8-40c0-93ea-5939ab23d0be fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:08 np0005603622 nova_compute[221301]: 2026-01-31 08:03:08.847 221324 DEBUG nova.compute.manager [req-d7d69a26-e0c8-45ff-8c49-a7cd582040c3 req-1846fd79-17c8-40c0-93ea-5939ab23d0be fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] No waiting events found dispatching network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:08 np0005603622 nova_compute[221301]: 2026-01-31 08:03:08.847 221324 WARNING nova.compute.manager [req-d7d69a26-e0c8-45ff-8c49-a7cd582040c3 req-1846fd79-17c8-40c0-93ea-5939ab23d0be fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received unexpected event network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:03:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:08.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:09.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e197 e197: 3 total, 3 up, 3 in
Jan 31 03:03:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:10.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:11 np0005603622 nova_compute[221301]: 2026-01-31 08:03:11.139 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:11 np0005603622 podman[240821]: 2026-01-31 08:03:11.184516798 +0000 UTC m=+0.114182693 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 31 03:03:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e198 e198: 3 total, 3 up, 3 in
Jan 31 03:03:11 np0005603622 nova_compute[221301]: 2026-01-31 08:03:11.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:11 np0005603622 NetworkManager[49080]: <info>  [1769846591.8790] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 31 03:03:11 np0005603622 NetworkManager[49080]: <info>  [1769846591.8799] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 31 03:03:11 np0005603622 nova_compute[221301]: 2026-01-31 08:03:11.896 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:11Z|00161|binding|INFO|Releasing lport 931fca9c-7a55-4ef4-9932-e142448d0ff3 from this chassis (sb_readonly=0)
Jan 31 03:03:11 np0005603622 nova_compute[221301]: 2026-01-31 08:03:11.916 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:11.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:12 np0005603622 nova_compute[221301]: 2026-01-31 08:03:12.727 221324 DEBUG nova.compute.manager [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-changed-9e93fdf5-6477-4b09-91c4-656c1d85b246 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:12 np0005603622 nova_compute[221301]: 2026-01-31 08:03:12.728 221324 DEBUG nova.compute.manager [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Refreshing instance network info cache due to event network-changed-9e93fdf5-6477-4b09-91c4-656c1d85b246. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:03:12 np0005603622 nova_compute[221301]: 2026-01-31 08:03:12.728 221324 DEBUG oslo_concurrency.lockutils [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:03:12 np0005603622 nova_compute[221301]: 2026-01-31 08:03:12.728 221324 DEBUG oslo_concurrency.lockutils [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:03:12 np0005603622 nova_compute[221301]: 2026-01-31 08:03:12.728 221324 DEBUG nova.network.neutron [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Refreshing network info cache for port 9e93fdf5-6477-4b09-91c4-656c1d85b246 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:03:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:12.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:13 np0005603622 nova_compute[221301]: 2026-01-31 08:03:13.270 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:13.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e199 e199: 3 total, 3 up, 3 in
Jan 31 03:03:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:14.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:15 np0005603622 nova_compute[221301]: 2026-01-31 08:03:15.822 221324 DEBUG nova.network.neutron [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updated VIF entry in instance network info cache for port 9e93fdf5-6477-4b09-91c4-656c1d85b246. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:03:15 np0005603622 nova_compute[221301]: 2026-01-31 08:03:15.823 221324 DEBUG nova.network.neutron [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updating instance_info_cache with network_info: [{"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:15 np0005603622 nova_compute[221301]: 2026-01-31 08:03:15.883 221324 DEBUG oslo_concurrency.lockutils [req-876de4ea-3eca-4983-ba51-18b8c695a7f0 req-f6fc579c-c7fb-4e6b-a592-f93ff0f28b48 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:03:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:15.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:16 np0005603622 podman[240849]: 2026-01-31 08:03:16.136785029 +0000 UTC m=+0.064325796 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 03:03:16 np0005603622 nova_compute[221301]: 2026-01-31 08:03:16.141 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:16 np0005603622 nova_compute[221301]: 2026-01-31 08:03:16.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:16 np0005603622 nova_compute[221301]: 2026-01-31 08:03:16.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:03:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:16.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e200 e200: 3 total, 3 up, 3 in
Jan 31 03:03:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:17.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:18 np0005603622 nova_compute[221301]: 2026-01-31 08:03:18.334 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:18.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e201 e201: 3 total, 3 up, 3 in
Jan 31 03:03:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:19.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:20.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:21Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:0b:0d 10.100.0.14
Jan 31 03:03:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:21Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:0b:0d 10.100.0.14
Jan 31 03:03:21 np0005603622 nova_compute[221301]: 2026-01-31 08:03:21.145 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:21.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:22 np0005603622 nova_compute[221301]: 2026-01-31 08:03:22.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e202 e202: 3 total, 3 up, 3 in
Jan 31 03:03:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:22.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:23 np0005603622 nova_compute[221301]: 2026-01-31 08:03:23.336 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:23 np0005603622 nova_compute[221301]: 2026-01-31 08:03:23.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:23 np0005603622 nova_compute[221301]: 2026-01-31 08:03:23.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:23 np0005603622 nova_compute[221301]: 2026-01-31 08:03:23.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:23.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:24.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.035 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.036 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.132 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.294 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.295 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.313 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.314 221324 INFO nova.compute.claims [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:03:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e203 e203: 3 total, 3 up, 3 in
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:25 np0005603622 nova_compute[221301]: 2026-01-31 08:03:25.617 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:25.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3635559240' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.091 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.101 221324 DEBUG nova.compute.provider_tree [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.146 221324 DEBUG nova.scheduler.client.report [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.152 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.187 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.188 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.267 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.268 221324 DEBUG nova.network.neutron [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.296 221324 INFO nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.331 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.458 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.459 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.459 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.459 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.460 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.526 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.528 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.529 221324 INFO nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Creating image(s)#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.561 221324 DEBUG nova.storage.rbd_utils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 070b0301-faed-40e8-877f-c185918ddfd0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.599 221324 DEBUG nova.storage.rbd_utils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 070b0301-faed-40e8-877f-c185918ddfd0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.631 221324 DEBUG nova.storage.rbd_utils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 070b0301-faed-40e8-877f-c185918ddfd0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.636 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "9eeb228435ed1e14c9f2dce3bbc2bc85a1a11e5f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.637 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "9eeb228435ed1e14c9f2dce3bbc2bc85a1a11e5f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.643 221324 DEBUG nova.policy [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '46ffd64a348845fab6cdc53249353575', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '521dcd459f144f2bb32de93d50ae0391', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:03:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e204 e204: 3 total, 3 up, 3 in
Jan 31 03:03:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2611978247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:26 np0005603622 nova_compute[221301]: 2026-01-31 08:03:26.904 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:26.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:27 np0005603622 nova_compute[221301]: 2026-01-31 08:03:27.021 221324 DEBUG nova.virt.libvirt.imagebackend [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/4f6468b7-082d-4995-a147-34ef0ccaa0d2/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/4f6468b7-082d-4995-a147-34ef0ccaa0d2/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:03:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:27.822 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:03:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:27.825 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:03:27 np0005603622 nova_compute[221301]: 2026-01-31 08:03:27.866 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:27 np0005603622 nova_compute[221301]: 2026-01-31 08:03:27.872 221324 DEBUG nova.virt.libvirt.imagebackend [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/4f6468b7-082d-4995-a147-34ef0ccaa0d2/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:03:27 np0005603622 nova_compute[221301]: 2026-01-31 08:03:27.872 221324 DEBUG nova.storage.rbd_utils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] cloning images/4f6468b7-082d-4995-a147-34ef0ccaa0d2@snap to None/070b0301-faed-40e8-877f-c185918ddfd0_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:03:27 np0005603622 nova_compute[221301]: 2026-01-31 08:03:27.935 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000037 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:03:27 np0005603622 nova_compute[221301]: 2026-01-31 08:03:27.935 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000037 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:03:27 np0005603622 nova_compute[221301]: 2026-01-31 08:03:27.969 221324 DEBUG nova.network.neutron [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Successfully created port: d708a459-b972-4057-8948-3f31e556c338 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:03:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:27.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.104 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.105 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4532MB free_disk=20.852169036865234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.105 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.105 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.214 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 5ae56bfd-f507-4765-a2ce-17d331c4f93d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.215 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 070b0301-faed-40e8-877f-c185918ddfd0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.215 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.215 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.295 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e205 e205: 3 total, 3 up, 3 in
Jan 31 03:03:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/475415568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.708 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.713 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.741 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.820 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:03:28 np0005603622 nova_compute[221301]: 2026-01-31 08:03:28.821 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:28.827 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:28.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:03:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:29.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.224 221324 DEBUG nova.network.neutron [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Successfully updated port: d708a459-b972-4057-8948-3f31e556c338 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:03:30 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.271 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "9eeb228435ed1e14c9f2dce3bbc2bc85a1a11e5f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.323 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "refresh_cache-070b0301-faed-40e8-877f-c185918ddfd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.323 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquired lock "refresh_cache-070b0301-faed-40e8-877f-c185918ddfd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.324 221324 DEBUG nova.network.neutron [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:03:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:03:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:03:30 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.456 221324 DEBUG nova.compute.manager [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received event network-changed-d708a459-b972-4057-8948-3f31e556c338 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.456 221324 DEBUG nova.compute.manager [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Refreshing instance network info cache due to event network-changed-d708a459-b972-4057-8948-3f31e556c338. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.457 221324 DEBUG oslo_concurrency.lockutils [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-070b0301-faed-40e8-877f-c185918ddfd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.463 221324 DEBUG nova.objects.instance [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lazy-loading 'migration_context' on Instance uuid 070b0301-faed-40e8-877f-c185918ddfd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.821 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.822 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:03:30 np0005603622 nova_compute[221301]: 2026-01-31 08:03:30.822 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:03:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:30.857 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:30.858 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:30.859 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:30.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:31 np0005603622 nova_compute[221301]: 2026-01-31 08:03:31.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:31 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 31 03:03:31 np0005603622 nova_compute[221301]: 2026-01-31 08:03:31.524 221324 DEBUG nova.network.neutron [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:03:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:03:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:03:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:03:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:32.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:32 np0005603622 nova_compute[221301]: 2026-01-31 08:03:32.331 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:03:32 np0005603622 nova_compute[221301]: 2026-01-31 08:03:32.332 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Ensure instance console log exists: /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:03:32 np0005603622 nova_compute[221301]: 2026-01-31 08:03:32.333 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:32 np0005603622 nova_compute[221301]: 2026-01-31 08:03:32.333 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:32 np0005603622 nova_compute[221301]: 2026-01-31 08:03:32.334 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:32 np0005603622 nova_compute[221301]: 2026-01-31 08:03:32.378 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:03:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:33.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:33 np0005603622 nova_compute[221301]: 2026-01-31 08:03:33.369 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:33 np0005603622 nova_compute[221301]: 2026-01-31 08:03:33.411 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:03:33 np0005603622 nova_compute[221301]: 2026-01-31 08:03:33.412 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:03:33 np0005603622 nova_compute[221301]: 2026-01-31 08:03:33.412 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:03:33 np0005603622 nova_compute[221301]: 2026-01-31 08:03:33.412 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5ae56bfd-f507-4765-a2ce-17d331c4f93d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:34.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.122 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.122 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.123 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.123 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.123 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.125 221324 INFO nova.compute.manager [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Terminating instance#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.126 221324 DEBUG nova.compute.manager [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.519 221324 DEBUG nova.network.neutron [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Updating instance_info_cache with network_info: [{"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.572 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Releasing lock "refresh_cache-070b0301-faed-40e8-877f-c185918ddfd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.573 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Instance network_info: |[{"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.574 221324 DEBUG oslo_concurrency.lockutils [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-070b0301-faed-40e8-877f-c185918ddfd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.574 221324 DEBUG nova.network.neutron [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Refreshing network info cache for port d708a459-b972-4057-8948-3f31e556c338 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.578 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Start _get_guest_xml network_info=[{"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:03:01Z,direct_url=<?>,disk_format='raw',id=4f6468b7-082d-4995-a147-34ef0ccaa0d2,min_disk=1,min_ram=0,name='tempest-test-snap-1154036698',owner='521dcd459f144f2bb32de93d50ae0391',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:03:17Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '4f6468b7-082d-4995-a147-34ef0ccaa0d2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.584 221324 WARNING nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.595 221324 DEBUG nova.virt.libvirt.host [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.596 221324 DEBUG nova.virt.libvirt.host [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.599 221324 DEBUG nova.virt.libvirt.host [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.599 221324 DEBUG nova.virt.libvirt.host [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.601 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.601 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:03:01Z,direct_url=<?>,disk_format='raw',id=4f6468b7-082d-4995-a147-34ef0ccaa0d2,min_disk=1,min_ram=0,name='tempest-test-snap-1154036698',owner='521dcd459f144f2bb32de93d50ae0391',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:03:17Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.601 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.601 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.602 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.602 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.602 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.602 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.603 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.603 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.603 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.603 221324 DEBUG nova.virt.hardware [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:03:34 np0005603622 nova_compute[221301]: 2026-01-31 08:03:34.606 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:35.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:03:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/632375609' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:03:35 np0005603622 nova_compute[221301]: 2026-01-31 08:03:35.662 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:35 np0005603622 nova_compute[221301]: 2026-01-31 08:03:35.693 221324 DEBUG nova.storage.rbd_utils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 070b0301-faed-40e8-877f-c185918ddfd0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:35 np0005603622 nova_compute[221301]: 2026-01-31 08:03:35.698 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:36.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.156 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:03:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/417049359' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.351 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.654s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.354 221324 DEBUG nova.virt.libvirt.vif [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:03:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-497297412',display_name='tempest-ImagesTestJSON-server-497297412',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-497297412',id=56,image_ref='4f6468b7-082d-4995-a147-34ef0ccaa0d2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='521dcd459f144f2bb32de93d50ae0391',ramdisk_id='',reservation_id='r-y8yj1je3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='e549bdcc-5097-4890-8754-89d9f1d15a13',image_min_disk='1',image_min_ram='0',image_owner_id='521dcd459f144f2bb32de93d50ae0391',image_owner_project_name='tempest-ImagesTestJSON-1780438391',image_owner_user_name='tempest-ImagesTestJSON-1780438391-project-member',image_user_id='46ffd64a348845fab6cdc53249353575',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1780438391',owner_user_name='tempest-ImagesTestJSON-1780438391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:03:26Z,user_data=None,user_id='46ffd64a348845fab6cdc53249353575',uuid=070b0301-faed-40e8-877f-c185918ddfd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.354 221324 DEBUG nova.network.os_vif_util [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converting VIF {"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.356 221324 DEBUG nova.network.os_vif_util [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:ac,bridge_name='br-int',has_traffic_filtering=True,id=d708a459-b972-4057-8948-3f31e556c338,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd708a459-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.357 221324 DEBUG nova.objects.instance [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lazy-loading 'pci_devices' on Instance uuid 070b0301-faed-40e8-877f-c185918ddfd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.378 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <uuid>070b0301-faed-40e8-877f-c185918ddfd0</uuid>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <name>instance-00000038</name>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <nova:name>tempest-ImagesTestJSON-server-497297412</nova:name>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:03:34</nova:creationTime>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:user uuid="46ffd64a348845fab6cdc53249353575">tempest-ImagesTestJSON-1780438391-project-member</nova:user>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:project uuid="521dcd459f144f2bb32de93d50ae0391">tempest-ImagesTestJSON-1780438391</nova:project>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="4f6468b7-082d-4995-a147-34ef0ccaa0d2"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <nova:port uuid="d708a459-b972-4057-8948-3f31e556c338">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <entry name="serial">070b0301-faed-40e8-877f-c185918ddfd0</entry>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <entry name="uuid">070b0301-faed-40e8-877f-c185918ddfd0</entry>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/070b0301-faed-40e8-877f-c185918ddfd0_disk">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/070b0301-faed-40e8-877f-c185918ddfd0_disk.config">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:de:87:ac"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <target dev="tapd708a459-b9"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/console.log" append="off"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:03:36 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:03:36 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:03:36 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:03:36 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.380 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Preparing to wait for external event network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.381 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.381 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.382 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.382 221324 DEBUG nova.virt.libvirt.vif [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:03:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-497297412',display_name='tempest-ImagesTestJSON-server-497297412',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-497297412',id=56,image_ref='4f6468b7-082d-4995-a147-34ef0ccaa0d2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='521dcd459f144f2bb32de93d50ae0391',ramdisk_id='',reservation_id='r-y8yj1je3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw
_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='e549bdcc-5097-4890-8754-89d9f1d15a13',image_min_disk='1',image_min_ram='0',image_owner_id='521dcd459f144f2bb32de93d50ae0391',image_owner_project_name='tempest-ImagesTestJSON-1780438391',image_owner_user_name='tempest-ImagesTestJSON-1780438391-project-member',image_user_id='46ffd64a348845fab6cdc53249353575',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1780438391',owner_user_name='tempest-ImagesTestJSON-1780438391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:03:26Z,user_data=None,user_id='46ffd64a348845fab6cdc53249353575',uuid=070b0301-faed-40e8-877f-c185918ddfd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.383 221324 DEBUG nova.network.os_vif_util [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converting VIF {"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.383 221324 DEBUG nova.network.os_vif_util [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:ac,bridge_name='br-int',has_traffic_filtering=True,id=d708a459-b972-4057-8948-3f31e556c338,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd708a459-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.384 221324 DEBUG os_vif [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:ac,bridge_name='br-int',has_traffic_filtering=True,id=d708a459-b972-4057-8948-3f31e556c338,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd708a459-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.385 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.385 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.386 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updating instance_info_cache with network_info: [{"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.392 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.392 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd708a459-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.393 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd708a459-b9, col_values=(('external_ids', {'iface-id': 'd708a459-b972-4057-8948-3f31e556c338', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:87:ac', 'vm-uuid': '070b0301-faed-40e8-877f-c185918ddfd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.395 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 NetworkManager[49080]: <info>  [1769846616.3959] manager: (tapd708a459-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.397 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.403 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.404 221324 INFO os_vif [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:ac,bridge_name='br-int',has_traffic_filtering=True,id=d708a459-b972-4057-8948-3f31e556c338,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd708a459-b9')#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.407 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-5ae56bfd-f507-4765-a2ce-17d331c4f93d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.407 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.481 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.481 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.482 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] No VIF found with MAC fa:16:3e:de:87:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.482 221324 INFO nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Using config drive#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.517 221324 DEBUG nova.storage.rbd_utils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 070b0301-faed-40e8-877f-c185918ddfd0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:36 np0005603622 kernel: tap9e93fdf5-64 (unregistering): left promiscuous mode
Jan 31 03:03:36 np0005603622 NetworkManager[49080]: <info>  [1769846616.6967] device (tap9e93fdf5-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:03:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:36Z|00162|binding|INFO|Releasing lport 9e93fdf5-6477-4b09-91c4-656c1d85b246 from this chassis (sb_readonly=0)
Jan 31 03:03:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:36Z|00163|binding|INFO|Setting lport 9e93fdf5-6477-4b09-91c4-656c1d85b246 down in Southbound
Jan 31 03:03:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:36Z|00164|binding|INFO|Removing iface tap9e93fdf5-64 ovn-installed in OVS
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.724 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.726 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:36.731 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:0b:0d 10.100.0.14'], port_security=['fa:16:3e:89:0b:0d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5ae56bfd-f507-4765-a2ce-17d331c4f93d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29c39093d6684600bef00c63da85e6b7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd560041b-afb9-420f-903c-6f76e66a2643', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0047a36-b8b8-45b8-bab7-50d47452d2d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=9e93fdf5-6477-4b09-91c4-656c1d85b246) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:03:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:36.732 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 9e93fdf5-6477-4b09-91c4-656c1d85b246 in datapath f6748c1c-ecca-4a85-ad94-00568ceeba0f unbound from our chassis#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.732 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:36.733 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6748c1c-ecca-4a85-ad94-00568ceeba0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:03:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:36.735 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[612d1464-93d7-4c5b-a244-b4b187de5c4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:36.736 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f namespace which is not needed anymore#033[00m
Jan 31 03:03:36 np0005603622 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 31 03:03:36 np0005603622 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Consumed 14.672s CPU time.
Jan 31 03:03:36 np0005603622 systemd-machined[190406]: Machine qemu-22-instance-00000037 terminated.
Jan 31 03:03:36 np0005603622 neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f[240782]: [NOTICE]   (240793) : haproxy version is 2.8.14-c23fe91
Jan 31 03:03:36 np0005603622 neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f[240782]: [NOTICE]   (240793) : path to executable is /usr/sbin/haproxy
Jan 31 03:03:36 np0005603622 neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f[240782]: [WARNING]  (240793) : Exiting Master process...
Jan 31 03:03:36 np0005603622 neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f[240782]: [ALERT]    (240793) : Current worker (240795) exited with code 143 (Terminated)
Jan 31 03:03:36 np0005603622 neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f[240782]: [WARNING]  (240793) : All workers exited. Exiting... (0)
Jan 31 03:03:36 np0005603622 systemd[1]: libpod-9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367.scope: Deactivated successfully.
Jan 31 03:03:36 np0005603622 podman[241470]: 2026-01-31 08:03:36.901846124 +0000 UTC m=+0.048844871 container died 9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:03:36 np0005603622 systemd[1]: var-lib-containers-storage-overlay-ea300a73df940e4737c8b192ae804d2f91e0f9e356216899691ee27cbcf0941c-merged.mount: Deactivated successfully.
Jan 31 03:03:36 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367-userdata-shm.mount: Deactivated successfully.
Jan 31 03:03:36 np0005603622 podman[241470]: 2026-01-31 08:03:36.941626551 +0000 UTC m=+0.088625298 container cleanup 9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:03:36 np0005603622 systemd[1]: libpod-conmon-9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367.scope: Deactivated successfully.
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.974 221324 INFO nova.virt.libvirt.driver [-] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Instance destroyed successfully.#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.975 221324 DEBUG nova.objects.instance [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lazy-loading 'resources' on Instance uuid 5ae56bfd-f507-4765-a2ce-17d331c4f93d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:36 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.997 221324 DEBUG nova.virt.libvirt.vif [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-468493135',display_name='tempest-ServersTestJSON-server-468493135',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-468493135',id=55,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF2XpWAlfsRwpJSOAVFK4YkZRqP2ohouZY8DQgWjLRw8h6ouHzysUnYolJQ1sKy2stcFyqywTXIV0uIGeGj4a5boQRzjMGwEv2bE3Qf/2+L+o6rHW9A8mrJeZiOPli4ICw==',key_name='tempest-keypair-1574727576',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:03:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='29c39093d6684600bef00c63da85e6b7',ramdisk_id='',reservation_id='r-fgutxw9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-254633607',owner_user_name='tempest-ServersTestJSON-254633607-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4eb6c0e509f14fb2958ca76604db17c0',uuid=5ae56bfd-f507-4765-a2ce-17d331c4f93d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:36.999 221324 DEBUG nova.network.os_vif_util [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Converting VIF {"id": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "address": "fa:16:3e:89:0b:0d", "network": {"id": "f6748c1c-ecca-4a85-ad94-00568ceeba0f", "bridge": "br-int", "label": "tempest-ServersTestJSON-1761582344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "29c39093d6684600bef00c63da85e6b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e93fdf5-64", "ovs_interfaceid": "9e93fdf5-6477-4b09-91c4-656c1d85b246", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.000 221324 DEBUG nova.network.os_vif_util [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:0b:0d,bridge_name='br-int',has_traffic_filtering=True,id=9e93fdf5-6477-4b09-91c4-656c1d85b246,network=Network(f6748c1c-ecca-4a85-ad94-00568ceeba0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e93fdf5-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.001 221324 DEBUG os_vif [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:0b:0d,bridge_name='br-int',has_traffic_filtering=True,id=9e93fdf5-6477-4b09-91c4-656c1d85b246,network=Network(f6748c1c-ecca-4a85-ad94-00568ceeba0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e93fdf5-64') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.004 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.004 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e93fdf5-64, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.006 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:03:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:37.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.011 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.014 221324 INFO os_vif [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:0b:0d,bridge_name='br-int',has_traffic_filtering=True,id=9e93fdf5-6477-4b09-91c4-656c1d85b246,network=Network(f6748c1c-ecca-4a85-ad94-00568ceeba0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e93fdf5-64')#033[00m
Jan 31 03:03:37 np0005603622 podman[241502]: 2026-01-31 08:03:37.036856285 +0000 UTC m=+0.075393743 container remove 9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.042 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5519ee8c-ab09-4e86-96c4-0d32d97b9171]: (4, ('Sat Jan 31 08:03:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f (9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367)\n9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367\nSat Jan 31 08:03:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f (9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367)\n9ebbf3cb149fc50fa85a1bced93f00069fb75ff9e5a21c0de1588e2d77eb3367\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.044 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[77280482-199d-42f8-96fc-61d54e79493c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.045 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6748c1c-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:37 np0005603622 kernel: tapf6748c1c-e0: left promiscuous mode
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.047 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.062 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[17a6bf51-7bee-4191-8972-ddfa76421752]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.075 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[04096e51-4aff-497e-93b0-7bfb957140da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.076 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b40ea3f3-b67f-49ac-9e76-13b15590d147]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.095 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[42587adf-af86-4468-bc86-a52c2cf6db0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583629, 'reachable_time': 36285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241551, 'error': None, 'target': 'ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.098 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6748c1c-ecca-4a85-ad94-00568ceeba0f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:03:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:37.099 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[3879312c-a773-4302-a729-a411fc6e053d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:37 np0005603622 systemd[1]: run-netns-ovnmeta\x2df6748c1c\x2decca\x2d4a85\x2dad94\x2d00568ceeba0f.mount: Deactivated successfully.
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.427 221324 INFO nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Creating config drive at /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/disk.config#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.431 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgtz0z436 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.573 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgtz0z436" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.608 221324 DEBUG nova.storage.rbd_utils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 070b0301-faed-40e8-877f-c185918ddfd0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.613 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/disk.config 070b0301-faed-40e8-877f-c185918ddfd0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.707 221324 DEBUG nova.compute.manager [req-3115f04f-e3e5-4d5b-ba17-5104b74986d8 req-7ead9222-eccd-4b8c-b2c2-6f9f0f0e682e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-vif-unplugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.708 221324 DEBUG oslo_concurrency.lockutils [req-3115f04f-e3e5-4d5b-ba17-5104b74986d8 req-7ead9222-eccd-4b8c-b2c2-6f9f0f0e682e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.708 221324 DEBUG oslo_concurrency.lockutils [req-3115f04f-e3e5-4d5b-ba17-5104b74986d8 req-7ead9222-eccd-4b8c-b2c2-6f9f0f0e682e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.709 221324 DEBUG oslo_concurrency.lockutils [req-3115f04f-e3e5-4d5b-ba17-5104b74986d8 req-7ead9222-eccd-4b8c-b2c2-6f9f0f0e682e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.709 221324 DEBUG nova.compute.manager [req-3115f04f-e3e5-4d5b-ba17-5104b74986d8 req-7ead9222-eccd-4b8c-b2c2-6f9f0f0e682e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] No waiting events found dispatching network-vif-unplugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:37 np0005603622 nova_compute[221301]: 2026-01-31 08:03:37.709 221324 DEBUG nova.compute.manager [req-3115f04f-e3e5-4d5b-ba17-5104b74986d8 req-7ead9222-eccd-4b8c-b2c2-6f9f0f0e682e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-vif-unplugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:03:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e206 e206: 3 total, 3 up, 3 in
Jan 31 03:03:37 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 31 03:03:38 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 31 03:03:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:38.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.044 221324 DEBUG oslo_concurrency.processutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/disk.config 070b0301-faed-40e8-877f-c185918ddfd0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.045 221324 INFO nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Deleting local config drive /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0/disk.config because it was imported into RBD.#033[00m
Jan 31 03:03:38 np0005603622 kernel: tapd708a459-b9: entered promiscuous mode
Jan 31 03:03:38 np0005603622 NetworkManager[49080]: <info>  [1769846618.1111] manager: (tapd708a459-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Jan 31 03:03:38 np0005603622 systemd-udevd[241450]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:03:38 np0005603622 NetworkManager[49080]: <info>  [1769846618.1236] device (tapd708a459-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:03:38 np0005603622 NetworkManager[49080]: <info>  [1769846618.1245] device (tapd708a459-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.149 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:38Z|00165|binding|INFO|Claiming lport d708a459-b972-4057-8948-3f31e556c338 for this chassis.
Jan 31 03:03:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:38Z|00166|binding|INFO|d708a459-b972-4057-8948-3f31e556c338: Claiming fa:16:3e:de:87:ac 10.100.0.14
Jan 31 03:03:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:38Z|00167|binding|INFO|Setting lport d708a459-b972-4057-8948-3f31e556c338 ovn-installed in OVS
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.162 221324 DEBUG nova.network.neutron [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Updated VIF entry in instance network info cache for port d708a459-b972-4057-8948-3f31e556c338. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.163 221324 DEBUG nova.network.neutron [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Updating instance_info_cache with network_info: [{"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:38Z|00168|binding|INFO|Setting lport d708a459-b972-4057-8948-3f31e556c338 up in Southbound
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.165 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.166 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:87:ac 10.100.0.14'], port_security=['fa:16:3e:de:87:ac 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '070b0301-faed-40e8-877f-c185918ddfd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24914779-babc-4c55-b38b-adf9bfc5c103', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '521dcd459f144f2bb32de93d50ae0391', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e123a0a-7228-4656-b140-3fc3dfcfddda', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a17edd6-cd7f-4fcb-84f3-df8148e78cb1, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d708a459-b972-4057-8948-3f31e556c338) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.167 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d708a459-b972-4057-8948-3f31e556c338 in datapath 24914779-babc-4c55-b38b-adf9bfc5c103 bound to our chassis#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.169 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24914779-babc-4c55-b38b-adf9bfc5c103#033[00m
Jan 31 03:03:38 np0005603622 systemd-machined[190406]: New machine qemu-23-instance-00000038.
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.180 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cba0ee3c-2d90-445e-91ca-0106d578fb2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.181 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24914779-b1 in ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.183 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24914779-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.183 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8083c2e4-51a1-465e-8ba8-417970459b87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.184 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c79f5c67-3383-4ac5-a302-f70e57193cb6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.186 221324 DEBUG oslo_concurrency.lockutils [req-9feb12e4-7952-4feb-a80e-57572ed80e99 req-c3b83d7c-50ff-4d25-88ac-42e5c1cf371d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-070b0301-faed-40e8-877f-c185918ddfd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:03:38 np0005603622 systemd[1]: Started Virtual Machine qemu-23-instance-00000038.
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.194 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[6f174ba8-9fad-41e0-8ccd-fce110c24429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.207 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[89fab1b2-f5b2-4e44-8ff3-f9420158248a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.237 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5d8ff5-ac12-4288-b952-955c9aba5543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 NetworkManager[49080]: <info>  [1769846618.2448] manager: (tap24914779-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.243 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c1fcbc63-3958-4d59-83cd-0cdfb7cbabcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.282 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b6977561-a6e8-4380-9aa8-083b4408322c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.286 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccb4e62-bdfc-431c-81de-b4d086564306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 NetworkManager[49080]: <info>  [1769846618.3118] device (tap24914779-b0): carrier: link connected
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.318 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[47eaa950-ba53-4f6d-9bcb-c647c792fd95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.335 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6615a0fc-c1e4-467b-b2f6-a208c7557b83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24914779-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:0b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586965, 'reachable_time': 27818, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241637, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.351 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb6f899-d352-4bc4-a12c-83ee226f8d29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:baf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 586965, 'tstamp': 586965}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241638, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.371 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.370 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d03f9c0c-cdb1-4852-93c0-22d8abd9d3bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24914779-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:0b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586965, 'reachable_time': 27818, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241639, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.406 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7af123-553f-48be-8004-42679c33cd82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.453 221324 DEBUG nova.compute.manager [req-2af564fe-8013-4c10-a4c0-fa82c882288b req-8683d076-9f7d-47c0-80f5-cbebe8247ba2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received event network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.454 221324 DEBUG oslo_concurrency.lockutils [req-2af564fe-8013-4c10-a4c0-fa82c882288b req-8683d076-9f7d-47c0-80f5-cbebe8247ba2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.454 221324 DEBUG oslo_concurrency.lockutils [req-2af564fe-8013-4c10-a4c0-fa82c882288b req-8683d076-9f7d-47c0-80f5-cbebe8247ba2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.454 221324 DEBUG oslo_concurrency.lockutils [req-2af564fe-8013-4c10-a4c0-fa82c882288b req-8683d076-9f7d-47c0-80f5-cbebe8247ba2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.455 221324 DEBUG nova.compute.manager [req-2af564fe-8013-4c10-a4c0-fa82c882288b req-8683d076-9f7d-47c0-80f5-cbebe8247ba2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Processing event network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.469 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6978f3ef-ab71-46ef-a324-11813baea740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.471 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24914779-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.473 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.474 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24914779-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.475 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 NetworkManager[49080]: <info>  [1769846618.4766] manager: (tap24914779-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 31 03:03:38 np0005603622 kernel: tap24914779-b0: entered promiscuous mode
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.477 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.478 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24914779-b0, col_values=(('external_ids', {'iface-id': '23cfbf86-f443-4dea-a9ae-1c6f9be9ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.481 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:38Z|00169|binding|INFO|Releasing lport 23cfbf86-f443-4dea-a9ae-1c6f9be9ee53 from this chassis (sb_readonly=0)
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.482 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24914779-babc-4c55-b38b-adf9bfc5c103.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24914779-babc-4c55-b38b-adf9bfc5c103.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.483 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b4da59da-f687-41c5-bdfc-69a524127800]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.484 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-24914779-babc-4c55-b38b-adf9bfc5c103
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/24914779-babc-4c55-b38b-adf9bfc5c103.pid.haproxy
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 24914779-babc-4c55-b38b-adf9bfc5c103
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:03:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:38.486 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'env', 'PROCESS_TAG=haproxy-24914779-babc-4c55-b38b-adf9bfc5c103', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24914779-babc-4c55-b38b-adf9bfc5c103.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.488 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.749 221324 INFO nova.virt.libvirt.driver [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Deleting instance files /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d_del#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.750 221324 INFO nova.virt.libvirt.driver [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Deletion of /var/lib/nova/instances/5ae56bfd-f507-4765-a2ce-17d331c4f93d_del complete#033[00m
Jan 31 03:03:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e207 e207: 3 total, 3 up, 3 in
Jan 31 03:03:38 np0005603622 podman[241688]: 2026-01-31 08:03:38.840623818 +0000 UTC m=+0.070394348 container create ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.850 221324 INFO nova.compute.manager [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Took 4.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.851 221324 DEBUG oslo.service.loopingcall [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.852 221324 DEBUG nova.compute.manager [-] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.852 221324 DEBUG nova.network.neutron [-] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:03:38 np0005603622 systemd[1]: Started libpod-conmon-ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772.scope.
Jan 31 03:03:38 np0005603622 podman[241688]: 2026-01-31 08:03:38.796605818 +0000 UTC m=+0.026376328 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:03:38 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:03:38 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aaaf80039331af55baa4f2604aecfb9df27a12424871bd051b785343a8e9b32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:03:38 np0005603622 podman[241688]: 2026-01-31 08:03:38.935072981 +0000 UTC m=+0.164843491 container init ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:03:38 np0005603622 podman[241688]: 2026-01-31 08:03:38.939768377 +0000 UTC m=+0.169538867 container start ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:03:38 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[241726]: [NOTICE]   (241733) : New worker (241735) forked
Jan 31 03:03:38 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[241726]: [NOTICE]   (241733) : Loading success.
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.975 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846618.9750094, 070b0301-faed-40e8-877f-c185918ddfd0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.976 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] VM Started (Lifecycle Event)#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.978 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.982 221324 DEBUG nova.virt.libvirt.driver [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.985 221324 INFO nova.virt.libvirt.driver [-] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Instance spawned successfully.#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.986 221324 INFO nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Took 12.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:03:38 np0005603622 nova_compute[221301]: 2026-01-31 08:03:38.986 221324 DEBUG nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:39.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.033 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.037 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.087 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.088 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846618.9790833, 070b0301-faed-40e8-877f-c185918ddfd0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.088 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.115 221324 INFO nova.compute.manager [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Took 13.87 seconds to build instance.#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.147 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.153 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846618.9807267, 070b0301-faed-40e8-877f-c185918ddfd0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.153 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.159 221324 DEBUG oslo_concurrency.lockutils [None req-0c3fdac3-63be-4689-bba1-7fa34432f736 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.179 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.182 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:03:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e208 e208: 3 total, 3 up, 3 in
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.939 221324 DEBUG nova.compute.manager [req-96494a69-8b2d-4b7e-a39c-707b36599d60 req-2ad32446-243a-4d46-8397-d0584fe88db9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.940 221324 DEBUG oslo_concurrency.lockutils [req-96494a69-8b2d-4b7e-a39c-707b36599d60 req-2ad32446-243a-4d46-8397-d0584fe88db9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.940 221324 DEBUG oslo_concurrency.lockutils [req-96494a69-8b2d-4b7e-a39c-707b36599d60 req-2ad32446-243a-4d46-8397-d0584fe88db9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.940 221324 DEBUG oslo_concurrency.lockutils [req-96494a69-8b2d-4b7e-a39c-707b36599d60 req-2ad32446-243a-4d46-8397-d0584fe88db9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.940 221324 DEBUG nova.compute.manager [req-96494a69-8b2d-4b7e-a39c-707b36599d60 req-2ad32446-243a-4d46-8397-d0584fe88db9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] No waiting events found dispatching network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:39 np0005603622 nova_compute[221301]: 2026-01-31 08:03:39.941 221324 WARNING nova.compute.manager [req-96494a69-8b2d-4b7e-a39c-707b36599d60 req-2ad32446-243a-4d46-8397-d0584fe88db9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received unexpected event network-vif-plugged-9e93fdf5-6477-4b09-91c4-656c1d85b246 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:03:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:40.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.634 221324 DEBUG nova.compute.manager [req-eb555a57-2032-44eb-9465-419b32d36840 req-6f0d39f2-ae93-4ba1-b458-70e8ec47f5b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received event network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.635 221324 DEBUG oslo_concurrency.lockutils [req-eb555a57-2032-44eb-9465-419b32d36840 req-6f0d39f2-ae93-4ba1-b458-70e8ec47f5b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.635 221324 DEBUG oslo_concurrency.lockutils [req-eb555a57-2032-44eb-9465-419b32d36840 req-6f0d39f2-ae93-4ba1-b458-70e8ec47f5b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.636 221324 DEBUG oslo_concurrency.lockutils [req-eb555a57-2032-44eb-9465-419b32d36840 req-6f0d39f2-ae93-4ba1-b458-70e8ec47f5b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.636 221324 DEBUG nova.compute.manager [req-eb555a57-2032-44eb-9465-419b32d36840 req-6f0d39f2-ae93-4ba1-b458-70e8ec47f5b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] No waiting events found dispatching network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.636 221324 WARNING nova.compute.manager [req-eb555a57-2032-44eb-9465-419b32d36840 req-6f0d39f2-ae93-4ba1-b458-70e8ec47f5b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received unexpected event network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:03:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.742 221324 DEBUG nova.network.neutron [-] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.781 221324 INFO nova.compute.manager [-] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Took 1.93 seconds to deallocate network for instance.#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.901 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:40 np0005603622 nova_compute[221301]: 2026-01-31 08:03:40.902 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:41.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:41 np0005603622 nova_compute[221301]: 2026-01-31 08:03:41.304 221324 DEBUG oslo_concurrency.processutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:41 np0005603622 nova_compute[221301]: 2026-01-31 08:03:41.756 221324 DEBUG oslo_concurrency.processutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:41 np0005603622 nova_compute[221301]: 2026-01-31 08:03:41.764 221324 DEBUG nova.compute.provider_tree [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:41 np0005603622 nova_compute[221301]: 2026-01-31 08:03:41.797 221324 DEBUG nova.scheduler.client.report [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:41 np0005603622 nova_compute[221301]: 2026-01-31 08:03:41.830 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:41 np0005603622 nova_compute[221301]: 2026-01-31 08:03:41.891 221324 INFO nova.scheduler.client.report [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Deleted allocations for instance 5ae56bfd-f507-4765-a2ce-17d331c4f93d#033[00m
Jan 31 03:03:41 np0005603622 nova_compute[221301]: 2026-01-31 08:03:41.988 221324 DEBUG oslo_concurrency.lockutils [None req-f79b5f35-254b-4444-a57c-9b9873e1fb08 4eb6c0e509f14fb2958ca76604db17c0 29c39093d6684600bef00c63da85e6b7 - - default default] Lock "5ae56bfd-f507-4765-a2ce-17d331c4f93d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.007 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:42.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:42 np0005603622 podman[241766]: 2026-01-31 08:03:42.163594175 +0000 UTC m=+0.087066916 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.183 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.184 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.184 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.185 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.185 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.187 221324 INFO nova.compute.manager [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Terminating instance#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.188 221324 DEBUG nova.compute.manager [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:03:42 np0005603622 kernel: tapd708a459-b9 (unregistering): left promiscuous mode
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.230 221324 DEBUG nova.compute.manager [req-c8d29981-45f3-482f-9876-3bcbd52102ff req-91e24d2c-221a-4ef8-a017-1425bf4fe258 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Received event network-vif-deleted-9e93fdf5-6477-4b09-91c4-656c1d85b246 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 NetworkManager[49080]: <info>  [1769846622.2345] device (tapd708a459-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.244 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:42Z|00170|binding|INFO|Releasing lport d708a459-b972-4057-8948-3f31e556c338 from this chassis (sb_readonly=0)
Jan 31 03:03:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:42Z|00171|binding|INFO|Setting lport d708a459-b972-4057-8948-3f31e556c338 down in Southbound
Jan 31 03:03:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:03:42Z|00172|binding|INFO|Removing iface tapd708a459-b9 ovn-installed in OVS
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.252 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:87:ac 10.100.0.14'], port_security=['fa:16:3e:de:87:ac 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '070b0301-faed-40e8-877f-c185918ddfd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24914779-babc-4c55-b38b-adf9bfc5c103', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '521dcd459f144f2bb32de93d50ae0391', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e123a0a-7228-4656-b140-3fc3dfcfddda', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a17edd6-cd7f-4fcb-84f3-df8148e78cb1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d708a459-b972-4057-8948-3f31e556c338) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.254 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d708a459-b972-4057-8948-3f31e556c338 in datapath 24914779-babc-4c55-b38b-adf9bfc5c103 unbound from our chassis#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.255 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.256 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24914779-babc-4c55-b38b-adf9bfc5c103, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.257 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5e7b8e-19c4-4e0f-8d16-86d005a7a84f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.260 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 namespace which is not needed anymore#033[00m
Jan 31 03:03:42 np0005603622 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000038.scope: Deactivated successfully.
Jan 31 03:03:42 np0005603622 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000038.scope: Consumed 4.246s CPU time.
Jan 31 03:03:42 np0005603622 systemd-machined[190406]: Machine qemu-23-instance-00000038 terminated.
Jan 31 03:03:42 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[241726]: [NOTICE]   (241733) : haproxy version is 2.8.14-c23fe91
Jan 31 03:03:42 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[241726]: [NOTICE]   (241733) : path to executable is /usr/sbin/haproxy
Jan 31 03:03:42 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[241726]: [WARNING]  (241733) : Exiting Master process...
Jan 31 03:03:42 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[241726]: [ALERT]    (241733) : Current worker (241735) exited with code 143 (Terminated)
Jan 31 03:03:42 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[241726]: [WARNING]  (241733) : All workers exited. Exiting... (0)
Jan 31 03:03:42 np0005603622 systemd[1]: libpod-ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772.scope: Deactivated successfully.
Jan 31 03:03:42 np0005603622 podman[241838]: 2026-01-31 08:03:42.417658009 +0000 UTC m=+0.051487952 container died ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.456 221324 INFO nova.virt.libvirt.driver [-] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Instance destroyed successfully.#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.457 221324 DEBUG nova.objects.instance [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lazy-loading 'resources' on Instance uuid 070b0301-faed-40e8-877f-c185918ddfd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:03:42 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772-userdata-shm.mount: Deactivated successfully.
Jan 31 03:03:42 np0005603622 systemd[1]: var-lib-containers-storage-overlay-6aaaf80039331af55baa4f2604aecfb9df27a12424871bd051b785343a8e9b32-merged.mount: Deactivated successfully.
Jan 31 03:03:42 np0005603622 podman[241838]: 2026-01-31 08:03:42.475230022 +0000 UTC m=+0.109059975 container cleanup ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:03:42 np0005603622 systemd[1]: libpod-conmon-ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772.scope: Deactivated successfully.
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.499 221324 DEBUG nova.virt.libvirt.vif [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:03:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-497297412',display_name='tempest-ImagesTestJSON-server-497297412',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-497297412',id=56,image_ref='4f6468b7-082d-4995-a147-34ef0ccaa0d2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:03:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='521dcd459f144f2bb32de93d50ae0391',ramdisk_id='',reservation_id='r-y8yj1je3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='e549bdcc-5097-4890-8754-89d9f1d15a13',image_min_disk='1',image_min_ram='0',image_owner_id='521dcd459f144f2bb32de93d50ae0391',image_owner_project_name='tempest-ImagesTestJSON-1780438391',image_owner_user_name='tempest-ImagesTestJSON-1780438391-project-member',image_user_id='46ffd64a348845fab6cdc53249353575',owner_project_name='tempest-ImagesTestJSON-1780438391',owner_user_name='tempest-ImagesTestJSON-1780438391-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:03:39Z,user_data=None,user_id='46ffd64a348845fab6cdc53249353575',uuid=070b0301-faed-40e8-877f-c185918ddfd0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.499 221324 DEBUG nova.network.os_vif_util [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converting VIF {"id": "d708a459-b972-4057-8948-3f31e556c338", "address": "fa:16:3e:de:87:ac", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd708a459-b9", "ovs_interfaceid": "d708a459-b972-4057-8948-3f31e556c338", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.500 221324 DEBUG nova.network.os_vif_util [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:ac,bridge_name='br-int',has_traffic_filtering=True,id=d708a459-b972-4057-8948-3f31e556c338,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd708a459-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.501 221324 DEBUG os_vif [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:ac,bridge_name='br-int',has_traffic_filtering=True,id=d708a459-b972-4057-8948-3f31e556c338,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd708a459-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.503 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.503 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd708a459-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.510 221324 INFO os_vif [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:87:ac,bridge_name='br-int',has_traffic_filtering=True,id=d708a459-b972-4057-8948-3f31e556c338,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd708a459-b9')#033[00m
Jan 31 03:03:42 np0005603622 podman[241908]: 2026-01-31 08:03:42.546504774 +0000 UTC m=+0.052485999 container remove ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.554 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4a070178-2ac3-4da6-9d8d-ac51a5be27c6]: (4, ('Sat Jan 31 08:03:42 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 (ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772)\nffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772\nSat Jan 31 08:03:42 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 (ffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772)\nffeda309806590b81a57e224a0ae7e651534d4c3b37b66ac8e7c8ebc09300772\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.556 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2f3808a5-e8d4-4b3e-8960-677fc2134912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.557 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24914779-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.559 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 kernel: tap24914779-b0: left promiscuous mode
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.564 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[91f8771b-113d-4549-bfc3-70d8b13bbbcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.583 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4e5b5697-b5dc-44f1-8549-e12d414d7c7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.585 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ebecd440-e62c-4114-b3df-bbda2143c190]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.596 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[480ce1fc-8443-4a0f-8b3b-2adf554a3bfa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 586957, 'reachable_time': 25499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241937, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 systemd[1]: run-netns-ovnmeta\x2d24914779\x2dbabc\x2d4c55\x2db38b\x2dadf9bfc5c103.mount: Deactivated successfully.
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.600 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:03:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:03:42.601 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[81c52e57-0d45-4a98-9404-8cf823d26221]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.814 221324 DEBUG nova.compute.manager [req-3d7cb9b5-c722-4b6c-8be3-4af69955c4dc req-5afb6426-2f1f-4a2a-97eb-0be99d03713f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received event network-vif-unplugged-d708a459-b972-4057-8948-3f31e556c338 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.814 221324 DEBUG oslo_concurrency.lockutils [req-3d7cb9b5-c722-4b6c-8be3-4af69955c4dc req-5afb6426-2f1f-4a2a-97eb-0be99d03713f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.815 221324 DEBUG oslo_concurrency.lockutils [req-3d7cb9b5-c722-4b6c-8be3-4af69955c4dc req-5afb6426-2f1f-4a2a-97eb-0be99d03713f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.815 221324 DEBUG oslo_concurrency.lockutils [req-3d7cb9b5-c722-4b6c-8be3-4af69955c4dc req-5afb6426-2f1f-4a2a-97eb-0be99d03713f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.816 221324 DEBUG nova.compute.manager [req-3d7cb9b5-c722-4b6c-8be3-4af69955c4dc req-5afb6426-2f1f-4a2a-97eb-0be99d03713f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] No waiting events found dispatching network-vif-unplugged-d708a459-b972-4057-8948-3f31e556c338 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.816 221324 DEBUG nova.compute.manager [req-3d7cb9b5-c722-4b6c-8be3-4af69955c4dc req-5afb6426-2f1f-4a2a-97eb-0be99d03713f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received event network-vif-unplugged-d708a459-b972-4057-8948-3f31e556c338 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.928 221324 INFO nova.virt.libvirt.driver [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Deleting instance files /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0_del#033[00m
Jan 31 03:03:42 np0005603622 nova_compute[221301]: 2026-01-31 08:03:42.929 221324 INFO nova.virt.libvirt.driver [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Deletion of /var/lib/nova/instances/070b0301-faed-40e8-877f-c185918ddfd0_del complete#033[00m
Jan 31 03:03:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:03:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:43.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:03:43 np0005603622 nova_compute[221301]: 2026-01-31 08:03:43.039 221324 INFO nova.compute.manager [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Took 0.85 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:03:43 np0005603622 nova_compute[221301]: 2026-01-31 08:03:43.040 221324 DEBUG oslo.service.loopingcall [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:03:43 np0005603622 nova_compute[221301]: 2026-01-31 08:03:43.040 221324 DEBUG nova.compute.manager [-] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:03:43 np0005603622 nova_compute[221301]: 2026-01-31 08:03:43.040 221324 DEBUG nova.network.neutron [-] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:03:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:03:43 np0005603622 nova_compute[221301]: 2026-01-31 08:03:43.374 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:44.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.153 221324 DEBUG nova.network.neutron [-] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.176 221324 INFO nova.compute.manager [-] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Took 1.14 seconds to deallocate network for instance.#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.245 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.246 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.308 221324 DEBUG oslo_concurrency.processutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.340 221324 DEBUG nova.compute.manager [req-b327e6c8-4ef0-417d-abea-6293d2852eb2 req-512a06b2-cc31-4bb6-a738-02f3ba653b46 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received event network-vif-deleted-d708a459-b972-4057-8948-3f31e556c338 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/481556654' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.805 221324 DEBUG oslo_concurrency.processutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.813 221324 DEBUG nova.compute.provider_tree [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.831 221324 DEBUG nova.scheduler.client.report [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.887 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.959 221324 INFO nova.scheduler.client.report [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Deleted allocations for instance 070b0301-faed-40e8-877f-c185918ddfd0#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.965 221324 DEBUG nova.compute.manager [req-55ec1273-01ad-4433-a718-d85e480385f6 req-679f5e11-ef55-462e-8880-56f22058190f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received event network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.965 221324 DEBUG oslo_concurrency.lockutils [req-55ec1273-01ad-4433-a718-d85e480385f6 req-679f5e11-ef55-462e-8880-56f22058190f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "070b0301-faed-40e8-877f-c185918ddfd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.965 221324 DEBUG oslo_concurrency.lockutils [req-55ec1273-01ad-4433-a718-d85e480385f6 req-679f5e11-ef55-462e-8880-56f22058190f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.966 221324 DEBUG oslo_concurrency.lockutils [req-55ec1273-01ad-4433-a718-d85e480385f6 req-679f5e11-ef55-462e-8880-56f22058190f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.966 221324 DEBUG nova.compute.manager [req-55ec1273-01ad-4433-a718-d85e480385f6 req-679f5e11-ef55-462e-8880-56f22058190f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] No waiting events found dispatching network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:03:44 np0005603622 nova_compute[221301]: 2026-01-31 08:03:44.966 221324 WARNING nova.compute.manager [req-55ec1273-01ad-4433-a718-d85e480385f6 req-679f5e11-ef55-462e-8880-56f22058190f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Received unexpected event network-vif-plugged-d708a459-b972-4057-8948-3f31e556c338 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:03:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:45.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:45 np0005603622 nova_compute[221301]: 2026-01-31 08:03:45.039 221324 DEBUG oslo_concurrency.lockutils [None req-867cd7c2-a446-4401-8ccf-9c41d8900b71 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "070b0301-faed-40e8-877f-c185918ddfd0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:46.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:47.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e209 e209: 3 total, 3 up, 3 in
Jan 31 03:03:47 np0005603622 podman[241961]: 2026-01-31 08:03:47.146227041 +0000 UTC m=+0.062799225 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:03:47 np0005603622 nova_compute[221301]: 2026-01-31 08:03:47.506 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e210 e210: 3 total, 3 up, 3 in
Jan 31 03:03:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:48.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:48 np0005603622 nova_compute[221301]: 2026-01-31 08:03:48.238 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:48 np0005603622 nova_compute[221301]: 2026-01-31 08:03:48.308 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:48 np0005603622 nova_compute[221301]: 2026-01-31 08:03:48.407 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:49.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:50.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:51.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:51 np0005603622 nova_compute[221301]: 2026-01-31 08:03:51.973 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846616.971751, 5ae56bfd-f507-4765-a2ce-17d331c4f93d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:51 np0005603622 nova_compute[221301]: 2026-01-31 08:03:51.974 221324 INFO nova.compute.manager [-] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:03:51 np0005603622 nova_compute[221301]: 2026-01-31 08:03:51.996 221324 DEBUG nova.compute.manager [None req-e5f0fb8b-cab7-48be-a4de-6d355e0e69f9 - - - - - -] [instance: 5ae56bfd-f507-4765-a2ce-17d331c4f93d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:52.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:52 np0005603622 nova_compute[221301]: 2026-01-31 08:03:52.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:53.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:53 np0005603622 nova_compute[221301]: 2026-01-31 08:03:53.408 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:54.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:54 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1393906496' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:54 np0005603622 nova_compute[221301]: 2026-01-31 08:03:54.797 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:54 np0005603622 nova_compute[221301]: 2026-01-31 08:03:54.798 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:54 np0005603622 nova_compute[221301]: 2026-01-31 08:03:54.834 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:03:54 np0005603622 nova_compute[221301]: 2026-01-31 08:03:54.958 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:54 np0005603622 nova_compute[221301]: 2026-01-31 08:03:54.959 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:54 np0005603622 nova_compute[221301]: 2026-01-31 08:03:54.965 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:03:54 np0005603622 nova_compute[221301]: 2026-01-31 08:03:54.966 221324 INFO nova.compute.claims [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:03:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:55.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.097 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:03:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2529161677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.579 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.587 221324 DEBUG nova.compute.provider_tree [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.612 221324 DEBUG nova.scheduler.client.report [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.656 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.657 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:03:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.740 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.740 221324 DEBUG nova.network.neutron [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:03:55 np0005603622 nova_compute[221301]: 2026-01-31 08:03:55.950 221324 INFO nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.009 221324 DEBUG nova.policy [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '46ffd64a348845fab6cdc53249353575', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '521dcd459f144f2bb32de93d50ae0391', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:03:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:56.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.068 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:03:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e211 e211: 3 total, 3 up, 3 in
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.310 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.311 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.312 221324 INFO nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Creating image(s)#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.548 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.585 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.621 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.625 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.711 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.712 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.713 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.713 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.750 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.757 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:03:56 np0005603622 nova_compute[221301]: 2026-01-31 08:03:56.803 221324 DEBUG nova.network.neutron [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Successfully created port: 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:03:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:57.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:57 np0005603622 nova_compute[221301]: 2026-01-31 08:03:57.454 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846622.4536254, 070b0301-faed-40e8-877f-c185918ddfd0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:03:57 np0005603622 nova_compute[221301]: 2026-01-31 08:03:57.455 221324 INFO nova.compute.manager [-] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:03:57 np0005603622 nova_compute[221301]: 2026-01-31 08:03:57.509 221324 DEBUG nova.compute.manager [None req-7beb0a1d-9572-453c-afc6-e5b6faba9fb0 - - - - - -] [instance: 070b0301-faed-40e8-877f-c185918ddfd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:03:57 np0005603622 nova_compute[221301]: 2026-01-31 08:03:57.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:03:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:03:58.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:03:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e212 e212: 3 total, 3 up, 3 in
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.410 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.563 221324 DEBUG nova.network.neutron [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Successfully updated port: 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.577 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "refresh_cache-734f32c9-a26a-41c8-bd90-f34e16a6503c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.577 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquired lock "refresh_cache-734f32c9-a26a-41c8-bd90-f34e16a6503c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.577 221324 DEBUG nova.network.neutron [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.680 221324 DEBUG nova.compute.manager [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received event network-changed-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.681 221324 DEBUG nova.compute.manager [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Refreshing instance network info cache due to event network-changed-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.681 221324 DEBUG oslo_concurrency.lockutils [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-734f32c9-a26a-41c8-bd90-f34e16a6503c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:03:58 np0005603622 nova_compute[221301]: 2026-01-31 08:03:58.955 221324 DEBUG nova.network.neutron [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:03:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:03:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:03:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:03:59.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:03:59 np0005603622 nova_compute[221301]: 2026-01-31 08:03:59.778 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:03:59 np0005603622 nova_compute[221301]: 2026-01-31 08:03:59.855 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] resizing rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:04:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:00.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.331 221324 DEBUG nova.network.neutron [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Updating instance_info_cache with network_info: [{"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.359 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Releasing lock "refresh_cache-734f32c9-a26a-41c8-bd90-f34e16a6503c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.360 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Instance network_info: |[{"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.360 221324 DEBUG oslo_concurrency.lockutils [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-734f32c9-a26a-41c8-bd90-f34e16a6503c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.360 221324 DEBUG nova.network.neutron [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Refreshing network info cache for port 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:04:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.784 221324 DEBUG nova.objects.instance [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lazy-loading 'migration_context' on Instance uuid 734f32c9-a26a-41c8-bd90-f34e16a6503c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.798 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.798 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Ensure instance console log exists: /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.798 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.799 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.799 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.801 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Start _get_guest_xml network_info=[{"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.805 221324 WARNING nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.811 221324 DEBUG nova.virt.libvirt.host [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.812 221324 DEBUG nova.virt.libvirt.host [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.816 221324 DEBUG nova.virt.libvirt.host [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.816 221324 DEBUG nova.virt.libvirt.host [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.817 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.818 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.818 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.818 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.818 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.819 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.819 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.819 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.819 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.819 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.820 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.820 221324 DEBUG nova.virt.hardware [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:04:00 np0005603622 nova_compute[221301]: 2026-01-31 08:04:00.822 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:01.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:04:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1659773761' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:04:01 np0005603622 nova_compute[221301]: 2026-01-31 08:04:01.576 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.754s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:01 np0005603622 nova_compute[221301]: 2026-01-31 08:04:01.618 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:01 np0005603622 nova_compute[221301]: 2026-01-31 08:04:01.624 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:04:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1657072717' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:04:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:02.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.059 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.062 221324 DEBUG nova.virt.libvirt.vif [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-393664750',display_name='tempest-ImagesTestJSON-server-393664750',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-393664750',id=59,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='521dcd459f144f2bb32de93d50ae0391',ramdisk_id='',reservation_id='r-w4t0oqj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1780438391',owner_user_name='tempest-ImagesTestJSON-1780438391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:03:56Z,user_data=None,user_id='46ffd64a348845fab6cdc53249353575',uuid=734f32c9-a26a-41c8-bd90-f34e16a6503c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.063 221324 DEBUG nova.network.os_vif_util [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converting VIF {"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.064 221324 DEBUG nova.network.os_vif_util [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:32:ed,bridge_name='br-int',has_traffic_filtering=True,id=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb7c5ba-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.067 221324 DEBUG nova.objects.instance [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lazy-loading 'pci_devices' on Instance uuid 734f32c9-a26a-41c8-bd90-f34e16a6503c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.099 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <uuid>734f32c9-a26a-41c8-bd90-f34e16a6503c</uuid>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <name>instance-0000003b</name>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <nova:name>tempest-ImagesTestJSON-server-393664750</nova:name>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:04:00</nova:creationTime>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:user uuid="46ffd64a348845fab6cdc53249353575">tempest-ImagesTestJSON-1780438391-project-member</nova:user>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:project uuid="521dcd459f144f2bb32de93d50ae0391">tempest-ImagesTestJSON-1780438391</nova:project>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <nova:port uuid="9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <entry name="serial">734f32c9-a26a-41c8-bd90-f34e16a6503c</entry>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <entry name="uuid">734f32c9-a26a-41c8-bd90-f34e16a6503c</entry>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/734f32c9-a26a-41c8-bd90-f34e16a6503c_disk">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/734f32c9-a26a-41c8-bd90-f34e16a6503c_disk.config">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:49:32:ed"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <target dev="tap9bb7c5ba-8f"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/console.log" append="off"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:04:02 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:04:02 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:04:02 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:04:02 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.100 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Preparing to wait for external event network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.101 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.101 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.102 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.103 221324 DEBUG nova.virt.libvirt.vif [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-393664750',display_name='tempest-ImagesTestJSON-server-393664750',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-393664750',id=59,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='521dcd459f144f2bb32de93d50ae0391',ramdisk_id='',reservation_id='r-w4t0oqj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1780438391',owner_user_name='tempest-ImagesTestJSON-1780438391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:03:56Z,user_data=None,user_id='46ffd64a348845fab6cdc53249353575',uuid=734f32c9-a26a-41c8-bd90-f34e16a6503c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.103 221324 DEBUG nova.network.os_vif_util [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converting VIF {"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.104 221324 DEBUG nova.network.os_vif_util [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:32:ed,bridge_name='br-int',has_traffic_filtering=True,id=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb7c5ba-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.104 221324 DEBUG os_vif [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:32:ed,bridge_name='br-int',has_traffic_filtering=True,id=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb7c5ba-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.105 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.105 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.106 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.109 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.109 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bb7c5ba-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.110 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9bb7c5ba-8f, col_values=(('external_ids', {'iface-id': '9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:32:ed', 'vm-uuid': '734f32c9-a26a-41c8-bd90-f34e16a6503c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.112 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:02 np0005603622 NetworkManager[49080]: <info>  [1769846642.1131] manager: (tap9bb7c5ba-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.114 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.121 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.122 221324 INFO os_vif [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:32:ed,bridge_name='br-int',has_traffic_filtering=True,id=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb7c5ba-8f')#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.224 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.226 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.227 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] No VIF found with MAC fa:16:3e:49:32:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.228 221324 INFO nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Using config drive#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.262 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.699 221324 DEBUG nova.network.neutron [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Updated VIF entry in instance network info cache for port 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.699 221324 DEBUG nova.network.neutron [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Updating instance_info_cache with network_info: [{"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.726 221324 DEBUG oslo_concurrency.lockutils [req-98d9be7c-73f7-4227-9f6f-1b6baba662c1 req-835ec10d-6b9f-41bd-af2e-acb5aefca572 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-734f32c9-a26a-41c8-bd90-f34e16a6503c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.805 221324 INFO nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Creating config drive at /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/disk.config#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.808 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3juxozg0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.934 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3juxozg0" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.963 221324 DEBUG nova.storage.rbd_utils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] rbd image 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:02 np0005603622 nova_compute[221301]: 2026-01-31 08:04:02.967 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/disk.config 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:03.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:03 np0005603622 nova_compute[221301]: 2026-01-31 08:04:03.442 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:04.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.443 221324 DEBUG oslo_concurrency.processutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/disk.config 734f32c9-a26a-41c8-bd90-f34e16a6503c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.444 221324 INFO nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Deleting local config drive /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c/disk.config because it was imported into RBD.#033[00m
Jan 31 03:04:04 np0005603622 kernel: tap9bb7c5ba-8f: entered promiscuous mode
Jan 31 03:04:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:04Z|00173|binding|INFO|Claiming lport 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 for this chassis.
Jan 31 03:04:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:04Z|00174|binding|INFO|9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9: Claiming fa:16:3e:49:32:ed 10.100.0.7
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.504 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 NetworkManager[49080]: <info>  [1769846644.5079] manager: (tap9bb7c5ba-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.511 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.521 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:32:ed 10.100.0.7'], port_security=['fa:16:3e:49:32:ed 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '734f32c9-a26a-41c8-bd90-f34e16a6503c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24914779-babc-4c55-b38b-adf9bfc5c103', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '521dcd459f144f2bb32de93d50ae0391', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3e123a0a-7228-4656-b140-3fc3dfcfddda', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a17edd6-cd7f-4fcb-84f3-df8148e78cb1, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.523 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 in datapath 24914779-babc-4c55-b38b-adf9bfc5c103 bound to our chassis#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.525 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24914779-babc-4c55-b38b-adf9bfc5c103#033[00m
Jan 31 03:04:04 np0005603622 systemd-udevd[242307]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.535 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7e2e5563-b0be-4de9-8f57-213f6b5bfebc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.536 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24914779-b1 in ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:04:04 np0005603622 systemd-machined[190406]: New machine qemu-24-instance-0000003b.
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.539 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24914779-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.539 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[09401cf9-e4ba-4c8d-918e-a9aa6d5cef55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.541 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9dce3f-6b6c-4590-858a-60ae73bf3640]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 NetworkManager[49080]: <info>  [1769846644.5480] device (tap9bb7c5ba-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.546 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:04Z|00175|binding|INFO|Setting lport 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 ovn-installed in OVS
Jan 31 03:04:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:04Z|00176|binding|INFO|Setting lport 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 up in Southbound
Jan 31 03:04:04 np0005603622 NetworkManager[49080]: <info>  [1769846644.5490] device (tap9bb7c5ba-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:04:04 np0005603622 systemd[1]: Started Virtual Machine qemu-24-instance-0000003b.
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.556 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[e73266ca-52fd-4a51-99ff-7f9e5044956f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.568 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc8ed3f-ff61-4637-8e9f-63744955359c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.597 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b495864e-05b9-4cee-81bc-4d55cb8272aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.602 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1862677d-d86b-4f5c-9bc6-8ce2cc6d0637]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 NetworkManager[49080]: <info>  [1769846644.6036] manager: (tap24914779-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.630 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6fea4b28-ff5d-41da-bdf3-1ed888c4794c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.634 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a3cfcf9e-4dda-49b0-9e67-2ad88b24b74e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 NetworkManager[49080]: <info>  [1769846644.6566] device (tap24914779-b0): carrier: link connected
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.661 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[83fa15df-9a3b-46d4-9b99-023c53104fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.676 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc0f415-56d5-4fb9-a5e9-2aca08c5aa17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24914779-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:0b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589599, 'reachable_time': 19072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242340, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.691 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c8fad4b2-c78f-44ae-8e02-f6535d4d474e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:baf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589599, 'tstamp': 589599}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242341, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.707 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3f933a5f-029b-4e60-8a1b-f1ed4766994d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24914779-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:0b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589599, 'reachable_time': 19072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242342, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.738 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[833a736a-82dd-4dcb-a8d7-9a704f235b09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.799 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6f90a511-7ebe-4225-9320-b6ab90687646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.801 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24914779-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.801 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.802 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24914779-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.804 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 NetworkManager[49080]: <info>  [1769846644.8050] manager: (tap24914779-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 31 03:04:04 np0005603622 kernel: tap24914779-b0: entered promiscuous mode
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.808 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24914779-b0, col_values=(('external_ids', {'iface-id': '23cfbf86-f443-4dea-a9ae-1c6f9be9ee53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.810 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:04Z|00177|binding|INFO|Releasing lport 23cfbf86-f443-4dea-a9ae-1c6f9be9ee53 from this chassis (sb_readonly=0)
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.816 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24914779-babc-4c55-b38b-adf9bfc5c103.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24914779-babc-4c55-b38b-adf9bfc5c103.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.819 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.819 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[891670ce-0f38-4e62-aa39-111fec7bc673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.820 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-24914779-babc-4c55-b38b-adf9bfc5c103
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/24914779-babc-4c55-b38b-adf9bfc5c103.pid.haproxy
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 24914779-babc-4c55-b38b-adf9bfc5c103
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:04:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:04.822 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'env', 'PROCESS_TAG=haproxy-24914779-babc-4c55-b38b-adf9bfc5c103', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24914779-babc-4c55-b38b-adf9bfc5c103.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.965 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846644.9646268, 734f32c9-a26a-41c8-bd90-f34e16a6503c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.965 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] VM Started (Lifecycle Event)#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.992 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.996 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846644.9648223, 734f32c9-a26a-41c8-bd90-f34e16a6503c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:04 np0005603622 nova_compute[221301]: 2026-01-31 08:04:04.996 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.013 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.017 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.036 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:04:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:05.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.059 221324 DEBUG nova.compute.manager [req-d09a5ff7-ebba-478f-8937-bc29092b4d23 req-3f876334-97c0-4e4e-8ba6-3628363b1cf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received event network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.059 221324 DEBUG oslo_concurrency.lockutils [req-d09a5ff7-ebba-478f-8937-bc29092b4d23 req-3f876334-97c0-4e4e-8ba6-3628363b1cf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.060 221324 DEBUG oslo_concurrency.lockutils [req-d09a5ff7-ebba-478f-8937-bc29092b4d23 req-3f876334-97c0-4e4e-8ba6-3628363b1cf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.060 221324 DEBUG oslo_concurrency.lockutils [req-d09a5ff7-ebba-478f-8937-bc29092b4d23 req-3f876334-97c0-4e4e-8ba6-3628363b1cf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.060 221324 DEBUG nova.compute.manager [req-d09a5ff7-ebba-478f-8937-bc29092b4d23 req-3f876334-97c0-4e4e-8ba6-3628363b1cf0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Processing event network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.061 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.064 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846645.0641744, 734f32c9-a26a-41c8-bd90-f34e16a6503c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.064 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.066 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.069 221324 INFO nova.virt.libvirt.driver [-] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Instance spawned successfully.
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.069 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.095 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.100 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.100 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.100 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.101 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.101 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.102 221324 DEBUG nova.virt.libvirt.driver [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.106 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.136 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] During sync_power_state the instance has a pending task (spawning). Skip.
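The sync above compares the database against libvirt ("DB power_state: 0, VM power_state: 1"). Nova encodes these as small integers in nova.compute.power_state; a minimal decoding sketch (the constants mirror that module, the `name` helper and its labels are illustrative):

```python
# Integer power-state codes as used in nova.compute.power_state.
# "DB power_state: 0, VM power_state: 1" in the log above therefore means
# NOSTATE recorded in the database vs RUNNING reported by the hypervisor.
NOSTATE = 0x00
RUNNING = 0x01
PAUSED = 0x03
SHUTDOWN = 0x04
CRASHED = 0x06
SUSPENDED = 0x07

STATE_NAMES = {
    NOSTATE: "pending",
    RUNNING: "running",
    PAUSED: "paused",
    SHUTDOWN: "shutdown",
    CRASHED: "crashed",
    SUSPENDED: "suspended",
}

def name(code: int) -> str:
    """Human-readable label for a nova power-state code."""
    return STATE_NAMES.get(code, "unknown")
```

Because the instance still has a pending task (spawning), the manager logs the mismatch and skips reconciliation rather than forcing a state change.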
Jan 31 03:04:05 np0005603622 podman[242416]: 2026-01-31 08:04:05.180595403 +0000 UTC m=+0.049651683 container create df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.206 221324 INFO nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Took 8.90 seconds to spawn the instance on the hypervisor.
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.207 221324 DEBUG nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:04:05 np0005603622 systemd[1]: Started libpod-conmon-df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070.scope.
Jan 31 03:04:05 np0005603622 podman[242416]: 2026-01-31 08:04:05.151065121 +0000 UTC m=+0.020121421 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:04:05 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:04:05 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9c080e27c399be099839ecc39e37d61633dff66de49eb96b1f09c4b9cd6239/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:04:05 np0005603622 podman[242416]: 2026-01-31 08:04:05.293972053 +0000 UTC m=+0.163028323 container init df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:04:05 np0005603622 podman[242416]: 2026-01-31 08:04:05.306248812 +0000 UTC m=+0.175305062 container start df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.307 221324 INFO nova.compute.manager [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Took 10.39 seconds to build instance.
Jan 31 03:04:05 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [NOTICE]   (242435) : New worker (242437) forked
Jan 31 03:04:05 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [NOTICE]   (242435) : Loading success.
Jan 31 03:04:05 np0005603622 nova_compute[221301]: 2026-01-31 08:04:05.327 221324 DEBUG oslo_concurrency.lockutils [None req-6863f4c7-03c1-49ea-be50-100f591f0702 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
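The "Took N seconds" lines above are the easiest timing signal in this journal. A minimal sketch for pulling them out of a dump (the `extract_timings` helper and its regex are illustrative and match only the two phrasings seen in this log):

```python
import re

# Matches nova's "Took 8.90 seconds to spawn the instance on the hypervisor."
# and "Took 10.39 seconds to build instance." messages from the log above.
TIMING_RE = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] "
    r"Took (?P<secs>\d+\.\d+) seconds to "
    r"(?P<what>spawn the instance on the hypervisor|build instance)"
)

def extract_timings(lines):
    """Yield (instance_uuid, phase, seconds) for each nova timing line."""
    for line in lines:
        m = TIMING_RE.search(line)
        if m:
            phase = "spawn" if m.group("what").startswith("spawn") else "build"
            yield m.group("uuid"), phase, float(m.group("secs"))
```

For this instance the build total (10.39 s, matching the 10.530 s lock hold) includes the 8.90 s hypervisor spawn plus network and scheduling overhead.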
Jan 31 03:04:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e213 e213: 3 total, 3 up, 3 in
Jan 31 03:04:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:06.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
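The radosgw "beast:" access lines like the one above recur every second in this section (haproxy health probes from 192.168.122.100/.102). A small parser to tabulate client, request, status, and latency (a sketch; `parse_beast` is illustrative and assumes the exact field layout shown here):

```python
import re

# Parses radosgw beast access-log lines of the form seen above:
#   beast: 0x7f51...: 192.168.122.102 - anonymous [31/Jan/2026:08:04:06.052 +0000]
#   "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
BEAST_RE = re.compile(
    r'beast: 0x[0-9a-f]+: (?P<ip>\S+) - (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<req>[^"]+)" (?P<status>\d+) .* latency=(?P<latency>[\d.]+)s'
)

def parse_beast(line):
    """Return (client_ip, request, http_status, latency_seconds), or None."""
    m = BEAST_RE.search(line)
    if not m:
        return None
    return (m.group("ip"), m.group("req"),
            int(m.group("status")), float(m.group("latency")))
```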
Jan 31 03:04:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e214 e214: 3 total, 3 up, 3 in
Jan 31 03:04:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:07.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:07 np0005603622 nova_compute[221301]: 2026-01-31 08:04:07.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.807551) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647807658, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1878, "num_deletes": 262, "total_data_size": 4214693, "memory_usage": 4272400, "flush_reason": "Manual Compaction"}
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647820160, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1881148, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33774, "largest_seqno": 35647, "table_properties": {"data_size": 1874307, "index_size": 3787, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 17237, "raw_average_key_size": 22, "raw_value_size": 1859739, "raw_average_value_size": 2375, "num_data_blocks": 164, "num_entries": 783, "num_filter_entries": 783, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846523, "oldest_key_time": 1769846523, "file_creation_time": 1769846647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 12875 microseconds, and 5276 cpu microseconds.
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.820429) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1881148 bytes OK
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.820500) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.822053) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.822081) EVENT_LOG_v1 {"time_micros": 1769846647822072, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.822112) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 4205977, prev total WAL file size 4205977, number of live WAL files 2.
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.823589) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303030' seq:72057594037927935, type:22 .. '6D6772737461740031323533' seq:0, type:0; will stop at (end)
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1837KB)], [63(10MB)]
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647823658, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 12626876, "oldest_snapshot_seqno": -1}
Jan 31 03:04:07 np0005603622 nova_compute[221301]: 2026-01-31 08:04:07.855 221324 DEBUG nova.compute.manager [req-829c91a5-7d3c-4569-af6a-145f725a393a req-a885bec8-2ad4-4bde-8642-e1277fac16d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received event network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:04:07 np0005603622 nova_compute[221301]: 2026-01-31 08:04:07.856 221324 DEBUG oslo_concurrency.lockutils [req-829c91a5-7d3c-4569-af6a-145f725a393a req-a885bec8-2ad4-4bde-8642-e1277fac16d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:04:07 np0005603622 nova_compute[221301]: 2026-01-31 08:04:07.856 221324 DEBUG oslo_concurrency.lockutils [req-829c91a5-7d3c-4569-af6a-145f725a393a req-a885bec8-2ad4-4bde-8642-e1277fac16d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:04:07 np0005603622 nova_compute[221301]: 2026-01-31 08:04:07.856 221324 DEBUG oslo_concurrency.lockutils [req-829c91a5-7d3c-4569-af6a-145f725a393a req-a885bec8-2ad4-4bde-8642-e1277fac16d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:04:07 np0005603622 nova_compute[221301]: 2026-01-31 08:04:07.857 221324 DEBUG nova.compute.manager [req-829c91a5-7d3c-4569-af6a-145f725a393a req-a885bec8-2ad4-4bde-8642-e1277fac16d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] No waiting events found dispatching network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:04:07 np0005603622 nova_compute[221301]: 2026-01-31 08:04:07.857 221324 WARNING nova.compute.manager [req-829c91a5-7d3c-4569-af6a-145f725a393a req-a885bec8-2ad4-4bde-8642-e1277fac16d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received unexpected event network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 for instance with vm_state active and task_state None.
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6056 keys, 9605367 bytes, temperature: kUnknown
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647893174, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 9605367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9564900, "index_size": 24229, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 154849, "raw_average_key_size": 25, "raw_value_size": 9456340, "raw_average_value_size": 1561, "num_data_blocks": 977, "num_entries": 6056, "num_filter_entries": 6056, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.893881) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 9605367 bytes
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.895520) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.3 rd, 137.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 10.2 +0.0 blob) out(9.2 +0.0 blob), read-write-amplify(11.8) write-amplify(5.1) OK, records in: 6537, records dropped: 481 output_compression: NoCompression
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.895550) EVENT_LOG_v1 {"time_micros": 1769846647895535, "job": 38, "event": "compaction_finished", "compaction_time_micros": 69632, "compaction_time_cpu_micros": 25402, "output_level": 6, "num_output_files": 1, "total_output_size": 9605367, "num_input_records": 6537, "num_output_records": 6056, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647896219, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846647898092, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.823502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.898272) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.898297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.898299) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.898302) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:07.898304) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:08.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:08 np0005603622 nova_compute[221301]: 2026-01-31 08:04:08.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:04:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:09.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:09 np0005603622 nova_compute[221301]: 2026-01-31 08:04:09.651 221324 DEBUG nova.compute.manager [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:04:09 np0005603622 nova_compute[221301]: 2026-01-31 08:04:09.726 221324 INFO nova.compute.manager [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] instance snapshotting
Jan 31 03:04:10 np0005603622 nova_compute[221301]: 2026-01-31 08:04:10.050 221324 INFO nova.virt.libvirt.driver [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Beginning live snapshot process
Jan 31 03:04:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:10.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:10 np0005603622 nova_compute[221301]: 2026-01-31 08:04:10.274 221324 DEBUG nova.virt.libvirt.imagebackend [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Jan 31 03:04:10 np0005603622 nova_compute[221301]: 2026-01-31 08:04:10.663 221324 DEBUG nova.storage.rbd_utils [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] creating snapshot(623b893a25e643f88623935782467a92) on rbd image(734f32c9-a26a-41c8-bd90-f34e16a6503c_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 31 03:04:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:04:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:11.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:04:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e215 e215: 3 total, 3 up, 3 in
Jan 31 03:04:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:12.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:12 np0005603622 nova_compute[221301]: 2026-01-31 08:04:12.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:04:12 np0005603622 nova_compute[221301]: 2026-01-31 08:04:12.442 221324 DEBUG nova.storage.rbd_utils [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] cloning vms/734f32c9-a26a-41c8-bd90-f34e16a6503c_disk@623b893a25e643f88623935782467a92 to images/0c74ddc7-49a4-4f2d-83ce-f79f5ae16288 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 31 03:04:12 np0005603622 nova_compute[221301]: 2026-01-31 08:04:12.986 221324 DEBUG nova.storage.rbd_utils [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] flattening images/0c74ddc7-49a4-4f2d-83ce-f79f5ae16288 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 31 03:04:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e216 e216: 3 total, 3 up, 3 in
Jan 31 03:04:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:13.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:13 np0005603622 podman[242551]: 2026-01-31 08:04:13.220295384 +0000 UTC m=+0.146079769 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:04:13 np0005603622 nova_compute[221301]: 2026-01-31 08:04:13.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:04:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:14.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:14 np0005603622 nova_compute[221301]: 2026-01-31 08:04:14.067 221324 DEBUG nova.storage.rbd_utils [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] removing snapshot(623b893a25e643f88623935782467a92) on rbd image(734f32c9-a26a-41c8-bd90-f34e16a6503c_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 31 03:04:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:04:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1239223878' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:04:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:04:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1239223878' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:04:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e217 e217: 3 total, 3 up, 3 in
Jan 31 03:04:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:15.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:15 np0005603622 nova_compute[221301]: 2026-01-31 08:04:15.115 221324 DEBUG nova.storage.rbd_utils [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] creating snapshot(snap) on rbd image(0c74ddc7-49a4-4f2d-83ce-f79f5ae16288) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:04:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:16.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e218 e218: 3 total, 3 up, 3 in
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Failed to snapshot image: nova.exception.ImageNotFound: Image 0c74ddc7-49a4-4f2d-83ce-f79f5ae16288 could not be found.
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver glanceclient.exc.HTTPNotFound: HTTP 404 Not Found: No image found with ID 0c74ddc7-49a4-4f2d-83ce-f79f5ae16288
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver 
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver 
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3082, in snapshot
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     self._image_api.update(context, image_id, metadata,
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1243, in update
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return session.update(context, image_id, image_info, data=data,
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 693, in update
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     _reraise_translated_image_exception(image_id)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     raise new_exc.with_traceback(exc_trace)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver nova.exception.ImageNotFound: Image 0c74ddc7-49a4-4f2d-83ce-f79f5ae16288 could not be found.
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.515 221324 ERROR nova.virt.libvirt.driver #033[00m
Jan 31 03:04:16 np0005603622 nova_compute[221301]: 2026-01-31 08:04:16.611 221324 DEBUG nova.storage.rbd_utils [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] removing snapshot(snap) on rbd image(0c74ddc7-49a4-4f2d-83ce-f79f5ae16288) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:04:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:17.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:17 np0005603622 nova_compute[221301]: 2026-01-31 08:04:17.118 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e219 e219: 3 total, 3 up, 3 in
Jan 31 03:04:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:18.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:18 np0005603622 podman[242649]: 2026-01-31 08:04:18.165704851 +0000 UTC m=+0.080206702 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 03:04:18 np0005603622 nova_compute[221301]: 2026-01-31 08:04:18.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:18 np0005603622 nova_compute[221301]: 2026-01-31 08:04:18.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:04:18 np0005603622 nova_compute[221301]: 2026-01-31 08:04:18.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:18 np0005603622 nova_compute[221301]: 2026-01-31 08:04:18.472 221324 WARNING nova.compute.manager [None req-35e54e37-dc63-41f6-9357-246f20871c40 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Image not found during snapshot: nova.exception.ImageNotFound: Image 0c74ddc7-49a4-4f2d-83ce-f79f5ae16288 could not be found.#033[00m
Jan 31 03:04:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:19.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e220 e220: 3 total, 3 up, 3 in
Jan 31 03:04:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e221 e221: 3 total, 3 up, 3 in
Jan 31 03:04:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:20Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:32:ed 10.100.0.7
Jan 31 03:04:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:20Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:32:ed 10.100.0.7
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.721 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.722 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.722 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.722 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.723 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.724 221324 INFO nova.compute.manager [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Terminating instance#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.725 221324 DEBUG nova.compute.manager [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:04:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:20 np0005603622 kernel: tap9bb7c5ba-8f (unregistering): left promiscuous mode
Jan 31 03:04:20 np0005603622 NetworkManager[49080]: <info>  [1769846660.8066] device (tap9bb7c5ba-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:04:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:20Z|00178|binding|INFO|Releasing lport 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 from this chassis (sb_readonly=0)
Jan 31 03:04:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:20Z|00179|binding|INFO|Setting lport 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 down in Southbound
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.851 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:20Z|00180|binding|INFO|Removing iface tap9bb7c5ba-8f ovn-installed in OVS
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.854 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:20.861 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:32:ed 10.100.0.7'], port_security=['fa:16:3e:49:32:ed 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '734f32c9-a26a-41c8-bd90-f34e16a6503c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24914779-babc-4c55-b38b-adf9bfc5c103', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '521dcd459f144f2bb32de93d50ae0391', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3e123a0a-7228-4656-b140-3fc3dfcfddda', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a17edd6-cd7f-4fcb-84f3-df8148e78cb1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:20.864 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 in datapath 24914779-babc-4c55-b38b-adf9bfc5c103 unbound from our chassis#033[00m
Jan 31 03:04:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:20.867 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24914779-babc-4c55-b38b-adf9bfc5c103, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:04:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:20.869 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e17706e8-c46b-4472-8bb9-d445e6421461]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:20.870 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 namespace which is not needed anymore#033[00m
Jan 31 03:04:20 np0005603622 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Jan 31 03:04:20 np0005603622 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000003b.scope: Consumed 13.499s CPU time.
Jan 31 03:04:20 np0005603622 systemd-machined[190406]: Machine qemu-24-instance-0000003b terminated.
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.963 221324 INFO nova.virt.libvirt.driver [-] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Instance destroyed successfully.#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.965 221324 DEBUG nova.objects.instance [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lazy-loading 'resources' on Instance uuid 734f32c9-a26a-41c8-bd90-f34e16a6503c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.988 221324 DEBUG nova.virt.libvirt.vif [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-393664750',display_name='tempest-ImagesTestJSON-server-393664750',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-393664750',id=59,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='521dcd459f144f2bb32de93d50ae0391',ramdisk_id='',reservation_id='r-w4t0oqj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1780438391',owner_user_name='tempest-ImagesTestJSON-1780438391-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:18Z,user_data=None,user_id='46ffd64a348845fab6cdc53249353575',uuid=734f32c9-a26a-41c8-bd90-f34e16a6503c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.989 221324 DEBUG nova.network.os_vif_util [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converting VIF {"id": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "address": "fa:16:3e:49:32:ed", "network": {"id": "24914779-babc-4c55-b38b-adf9bfc5c103", "bridge": "br-int", "label": "tempest-ImagesTestJSON-721321856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "521dcd459f144f2bb32de93d50ae0391", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb7c5ba-8f", "ovs_interfaceid": "9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.990 221324 DEBUG nova.network.os_vif_util [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:32:ed,bridge_name='br-int',has_traffic_filtering=True,id=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb7c5ba-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.990 221324 DEBUG os_vif [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:32:ed,bridge_name='br-int',has_traffic_filtering=True,id=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb7c5ba-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.993 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb7c5ba-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.997 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:20 np0005603622 nova_compute[221301]: 2026-01-31 08:04:20.998 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.001 221324 INFO os_vif [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:32:ed,bridge_name='br-int',has_traffic_filtering=True,id=9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9,network=Network(24914779-babc-4c55-b38b-adf9bfc5c103),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb7c5ba-8f')#033[00m
Jan 31 03:04:21 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [NOTICE]   (242435) : haproxy version is 2.8.14-c23fe91
Jan 31 03:04:21 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [NOTICE]   (242435) : path to executable is /usr/sbin/haproxy
Jan 31 03:04:21 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [WARNING]  (242435) : Exiting Master process...
Jan 31 03:04:21 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [WARNING]  (242435) : Exiting Master process...
Jan 31 03:04:21 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [ALERT]    (242435) : Current worker (242437) exited with code 143 (Terminated)
Jan 31 03:04:21 np0005603622 neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103[242431]: [WARNING]  (242435) : All workers exited. Exiting... (0)
Jan 31 03:04:21 np0005603622 systemd[1]: libpod-df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070.scope: Deactivated successfully.
Jan 31 03:04:21 np0005603622 podman[242703]: 2026-01-31 08:04:21.050202478 +0000 UTC m=+0.073758699 container died df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:04:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:21.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:21 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070-userdata-shm.mount: Deactivated successfully.
Jan 31 03:04:21 np0005603622 systemd[1]: var-lib-containers-storage-overlay-4f9c080e27c399be099839ecc39e37d61633dff66de49eb96b1f09c4b9cd6239-merged.mount: Deactivated successfully.
Jan 31 03:04:21 np0005603622 podman[242703]: 2026-01-31 08:04:21.136209875 +0000 UTC m=+0.159766086 container cleanup df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:04:21 np0005603622 systemd[1]: libpod-conmon-df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070.scope: Deactivated successfully.
Jan 31 03:04:21 np0005603622 podman[242749]: 2026-01-31 08:04:21.212739697 +0000 UTC m=+0.061398268 container remove df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.217 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[56360d6d-e598-42bb-a6a7-1b38f615557e]: (4, ('Sat Jan 31 08:04:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 (df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070)\ndf3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070\nSat Jan 31 08:04:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 (df3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070)\ndf3feec3d03e8c0f6fa9108b63c2273c8cf96519dc47abdc61e186a94d7c0070\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.219 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[29dac015-f316-4379-ac6e-1adba5b381ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.220 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24914779-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:21 np0005603622 kernel: tap24914779-b0: left promiscuous mode
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.224 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.226 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a038bd87-2e4b-47be-9f6d-8f8665debd5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.230 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.255 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec0d8b0-4ca6-4d03-b793-9d52f035f72f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.257 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4a2ce1-a5f8-42ec-a857-680e8a153c72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.270 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[640a0aa2-6f2c-4482-b6f6-c5b7c216b863]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589593, 'reachable_time': 43953, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242764, 'error': None, 'target': 'ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:21 np0005603622 systemd[1]: run-netns-ovnmeta\x2d24914779\x2dbabc\x2d4c55\x2db38b\x2dadf9bfc5c103.mount: Deactivated successfully.
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.273 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24914779-babc-4c55-b38b-adf9bfc5c103 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:04:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:21.273 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[5474317d-dd06-4246-8799-93ab28032eb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.500 221324 DEBUG nova.compute.manager [req-ffb46a42-b8ca-4e5f-a6b4-c0e8af8fd5b9 req-a6ec40b9-936c-4231-939f-a3dbd3443431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received event network-vif-unplugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.500 221324 DEBUG oslo_concurrency.lockutils [req-ffb46a42-b8ca-4e5f-a6b4-c0e8af8fd5b9 req-a6ec40b9-936c-4231-939f-a3dbd3443431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.501 221324 DEBUG oslo_concurrency.lockutils [req-ffb46a42-b8ca-4e5f-a6b4-c0e8af8fd5b9 req-a6ec40b9-936c-4231-939f-a3dbd3443431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.501 221324 DEBUG oslo_concurrency.lockutils [req-ffb46a42-b8ca-4e5f-a6b4-c0e8af8fd5b9 req-a6ec40b9-936c-4231-939f-a3dbd3443431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.501 221324 DEBUG nova.compute.manager [req-ffb46a42-b8ca-4e5f-a6b4-c0e8af8fd5b9 req-a6ec40b9-936c-4231-939f-a3dbd3443431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] No waiting events found dispatching network-vif-unplugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.502 221324 DEBUG nova.compute.manager [req-ffb46a42-b8ca-4e5f-a6b4-c0e8af8fd5b9 req-a6ec40b9-936c-4231-939f-a3dbd3443431 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received event network-vif-unplugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.590 221324 INFO nova.virt.libvirt.driver [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Deleting instance files /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c_del#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.591 221324 INFO nova.virt.libvirt.driver [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Deletion of /var/lib/nova/instances/734f32c9-a26a-41c8-bd90-f34e16a6503c_del complete#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.647 221324 INFO nova.compute.manager [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Took 0.92 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.648 221324 DEBUG oslo.service.loopingcall [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.648 221324 DEBUG nova.compute.manager [-] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:04:21 np0005603622 nova_compute[221301]: 2026-01-31 08:04:21.649 221324 DEBUG nova.network.neutron [-] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:04:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:22 np0005603622 nova_compute[221301]: 2026-01-31 08:04:22.784 221324 DEBUG nova.network.neutron [-] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:22 np0005603622 nova_compute[221301]: 2026-01-31 08:04:22.819 221324 INFO nova.compute.manager [-] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Took 1.17 seconds to deallocate network for instance.#033[00m
Jan 31 03:04:22 np0005603622 nova_compute[221301]: 2026-01-31 08:04:22.949 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:22 np0005603622 nova_compute[221301]: 2026-01-31 08:04:22.949 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:22 np0005603622 nova_compute[221301]: 2026-01-31 08:04:22.957 221324 DEBUG nova.compute.manager [req-685241a8-f68e-4ee6-9356-49fc6b1a9dfc req-1b6f4a90-eae1-4e66-8d0b-3fac8495245c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received event network-vif-deleted-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e222 e222: 3 total, 3 up, 3 in
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.028 221324 DEBUG oslo_concurrency.processutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:23.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.399 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:04:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2278436089' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.448 221324 DEBUG oslo_concurrency.processutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.449 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.459 221324 DEBUG nova.compute.provider_tree [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.478 221324 DEBUG nova.scheduler.client.report [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.503 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.684 221324 INFO nova.scheduler.client.report [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Deleted allocations for instance 734f32c9-a26a-41c8-bd90-f34e16a6503c#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.875 221324 DEBUG nova.compute.manager [req-ba694733-e85a-4765-8dd2-f293df4212bc req-cd0e9cee-f922-4fe2-9521-3ae93c3364ad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received event network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.876 221324 DEBUG oslo_concurrency.lockutils [req-ba694733-e85a-4765-8dd2-f293df4212bc req-cd0e9cee-f922-4fe2-9521-3ae93c3364ad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.876 221324 DEBUG oslo_concurrency.lockutils [req-ba694733-e85a-4765-8dd2-f293df4212bc req-cd0e9cee-f922-4fe2-9521-3ae93c3364ad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.876 221324 DEBUG oslo_concurrency.lockutils [req-ba694733-e85a-4765-8dd2-f293df4212bc req-cd0e9cee-f922-4fe2-9521-3ae93c3364ad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.876 221324 DEBUG nova.compute.manager [req-ba694733-e85a-4765-8dd2-f293df4212bc req-cd0e9cee-f922-4fe2-9521-3ae93c3364ad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] No waiting events found dispatching network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.877 221324 WARNING nova.compute.manager [req-ba694733-e85a-4765-8dd2-f293df4212bc req-cd0e9cee-f922-4fe2-9521-3ae93c3364ad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Received unexpected event network-vif-plugged-9bb7c5ba-8f57-48dd-8747-3fa6b21adcf9 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:04:23 np0005603622 nova_compute[221301]: 2026-01-31 08:04:23.982 221324 DEBUG oslo_concurrency.lockutils [None req-0d0c12f6-defd-4292-af90-593047684d0e 46ffd64a348845fab6cdc53249353575 521dcd459f144f2bb32de93d50ae0391 - - default default] Lock "734f32c9-a26a-41c8-bd90-f34e16a6503c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:24.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:24 np0005603622 nova_compute[221301]: 2026-01-31 08:04:24.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:24 np0005603622 nova_compute[221301]: 2026-01-31 08:04:24.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:25.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:26 np0005603622 nova_compute[221301]: 2026-01-31 08:04:25.999 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:26.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:26 np0005603622 nova_compute[221301]: 2026-01-31 08:04:26.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:27.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:27 np0005603622 nova_compute[221301]: 2026-01-31 08:04:27.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:27 np0005603622 nova_compute[221301]: 2026-01-31 08:04:27.451 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:27 np0005603622 nova_compute[221301]: 2026-01-31 08:04:27.451 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:27 np0005603622 nova_compute[221301]: 2026-01-31 08:04:27.452 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:27 np0005603622 nova_compute[221301]: 2026-01-31 08:04:27.452 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:04:27 np0005603622 nova_compute[221301]: 2026-01-31 08:04:27.452 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e223 e223: 3 total, 3 up, 3 in
Jan 31 03:04:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:04:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2212245067' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:04:27 np0005603622 nova_compute[221301]: 2026-01-31 08:04:27.939 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e224 e224: 3 total, 3 up, 3 in
Jan 31 03:04:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:28.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.124 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.126 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4694MB free_disk=20.876415252685547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.127 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.127 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.188 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.189 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.217 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:28 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.490 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:04:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1857452514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.652 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.657 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.678 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.719 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:04:28 np0005603622 nova_compute[221301]: 2026-01-31 08:04:28.719 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:29.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:29 np0005603622 nova_compute[221301]: 2026-01-31 08:04:29.720 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:29 np0005603622 nova_compute[221301]: 2026-01-31 08:04:29.721 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:04:29 np0005603622 nova_compute[221301]: 2026-01-31 08:04:29.721 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:04:29 np0005603622 nova_compute[221301]: 2026-01-31 08:04:29.752 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:04:29 np0005603622 nova_compute[221301]: 2026-01-31 08:04:29.753 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e225 e225: 3 total, 3 up, 3 in
Jan 31 03:04:29 np0005603622 nova_compute[221301]: 2026-01-31 08:04:29.976 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:29 np0005603622 nova_compute[221301]: 2026-01-31 08:04:29.977 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.025 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:04:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:30.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.145 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.145 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.154 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.154 221324 INFO nova.compute.claims [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.313 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:04:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3875097700' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.768 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.773 221324 DEBUG nova.compute.provider_tree [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:04:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:30.858 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:30.859 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:30.859 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.860 221324 DEBUG nova.scheduler.client.report [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:04:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e226 e226: 3 total, 3 up, 3 in
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.949 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:30 np0005603622 nova_compute[221301]: 2026-01-31 08:04:30.951 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.052 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.055 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.056 221324 DEBUG nova.network.neutron [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.085 221324 INFO nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:04:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:31.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.117 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.276 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.279 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.279 221324 INFO nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Creating image(s)#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.334 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.375 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.418 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.424 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.507 221324 DEBUG nova.policy [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.511 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.512 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.512 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.513 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.548 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:31 np0005603622 nova_compute[221301]: 2026-01-31 08:04:31.553 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:32.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.424 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.468 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.915s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.575 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.583 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] resizing rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.849 221324 DEBUG nova.objects.instance [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.874 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.875 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Ensure instance console log exists: /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.877 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.877 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:32 np0005603622 nova_compute[221301]: 2026-01-31 08:04:32.878 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:33.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:33 np0005603622 nova_compute[221301]: 2026-01-31 08:04:33.192 221324 DEBUG nova.network.neutron [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Successfully created port: a9638137-8ca3-40db-8068-46e2bd256380 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:04:33 np0005603622 nova_compute[221301]: 2026-01-31 08:04:33.535 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:34.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:34 np0005603622 nova_compute[221301]: 2026-01-31 08:04:34.838 221324 DEBUG nova.network.neutron [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Successfully updated port: a9638137-8ca3-40db-8068-46e2bd256380 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:04:34 np0005603622 nova_compute[221301]: 2026-01-31 08:04:34.863 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:04:34 np0005603622 nova_compute[221301]: 2026-01-31 08:04:34.863 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:04:34 np0005603622 nova_compute[221301]: 2026-01-31 08:04:34.863 221324 DEBUG nova.network.neutron [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:04:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:35.036 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:04:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:35.039 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:04:35 np0005603622 nova_compute[221301]: 2026-01-31 08:04:35.046 221324 DEBUG nova.compute.manager [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-changed-a9638137-8ca3-40db-8068-46e2bd256380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:35 np0005603622 nova_compute[221301]: 2026-01-31 08:04:35.046 221324 DEBUG nova.compute.manager [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Refreshing instance network info cache due to event network-changed-a9638137-8ca3-40db-8068-46e2bd256380. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:04:35 np0005603622 nova_compute[221301]: 2026-01-31 08:04:35.047 221324 DEBUG oslo_concurrency.lockutils [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:04:35 np0005603622 nova_compute[221301]: 2026-01-31 08:04:35.047 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:35.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:35 np0005603622 nova_compute[221301]: 2026-01-31 08:04:35.204 221324 DEBUG nova.network.neutron [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:04:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:35 np0005603622 nova_compute[221301]: 2026-01-31 08:04:35.961 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846660.9599981, 734f32c9-a26a-41c8-bd90-f34e16a6503c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:35 np0005603622 nova_compute[221301]: 2026-01-31 08:04:35.962 221324 INFO nova.compute.manager [-] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.004 221324 DEBUG nova.compute.manager [None req-c08e5fa2-beb4-4cbd-9ffd-72b9f358d6e1 - - - - - -] [instance: 734f32c9-a26a-41c8-bd90-f34e16a6503c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:36.042 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:36.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.099 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.763 221324 DEBUG nova.network.neutron [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.793 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.793 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Instance network_info: |[{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.794 221324 DEBUG oslo_concurrency.lockutils [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.794 221324 DEBUG nova.network.neutron [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Refreshing network info cache for port a9638137-8ca3-40db-8068-46e2bd256380 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.798 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Start _get_guest_xml network_info=[{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.803 221324 WARNING nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.820 221324 DEBUG nova.virt.libvirt.host [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.821 221324 DEBUG nova.virt.libvirt.host [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.825 221324 DEBUG nova.virt.libvirt.host [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.825 221324 DEBUG nova.virt.libvirt.host [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.826 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.826 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.827 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.827 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.827 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.827 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.828 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.828 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.828 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.828 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.828 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.829 221324 DEBUG nova.virt.hardware [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:04:36 np0005603622 nova_compute[221301]: 2026-01-31 08:04:36.831 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:37.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:04:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1685615940' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.312 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.342 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.347 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:04:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2982982436' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.843 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.847 221324 DEBUG nova.virt.libvirt.vif [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:04:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.849 221324 DEBUG nova.network.os_vif_util [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.851 221324 DEBUG nova.network.os_vif_util [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:7a:b9,bridge_name='br-int',has_traffic_filtering=True,id=a9638137-8ca3-40db-8068-46e2bd256380,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9638137-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.853 221324 DEBUG nova.objects.instance [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.880 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <uuid>8653e14e-eb16-4c07-bc3c-ff1133b80425</uuid>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <name>instance-0000003d</name>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:04:36</nova:creationTime>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <entry name="serial">8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <entry name="uuid">8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:ad:7a:b9"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <target dev="tapa9638137-8c"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log" append="off"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:04:37 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:04:37 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:04:37 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:04:37 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.882 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Preparing to wait for external event network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.882 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.883 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.883 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.884 221324 DEBUG nova.virt.libvirt.vif [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:04:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.884 221324 DEBUG nova.network.os_vif_util [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.885 221324 DEBUG nova.network.os_vif_util [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:7a:b9,bridge_name='br-int',has_traffic_filtering=True,id=a9638137-8ca3-40db-8068-46e2bd256380,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9638137-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.885 221324 DEBUG os_vif [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:7a:b9,bridge_name='br-int',has_traffic_filtering=True,id=a9638137-8ca3-40db-8068-46e2bd256380,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9638137-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.886 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.887 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.887 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.891 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.891 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9638137-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.892 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa9638137-8c, col_values=(('external_ids', {'iface-id': 'a9638137-8ca3-40db-8068-46e2bd256380', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:7a:b9', 'vm-uuid': '8653e14e-eb16-4c07-bc3c-ff1133b80425'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.894 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:37 np0005603622 NetworkManager[49080]: <info>  [1769846677.8965] manager: (tapa9638137-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.902 221324 INFO os_vif [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:7a:b9,bridge_name='br-int',has_traffic_filtering=True,id=a9638137-8ca3-40db-8068-46e2bd256380,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9638137-8c')#033[00m
Jan 31 03:04:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e227 e227: 3 total, 3 up, 3 in
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.962 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.963 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.963 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:ad:7a:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.964 221324 INFO nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Using config drive#033[00m
Jan 31 03:04:37 np0005603622 nova_compute[221301]: 2026-01-31 08:04:37.994 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:38.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.465 221324 DEBUG nova.network.neutron [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updated VIF entry in instance network info cache for port a9638137-8ca3-40db-8068-46e2bd256380. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.466 221324 DEBUG nova.network.neutron [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.493 221324 DEBUG oslo_concurrency.lockutils [req-7ec20fdd-ea22-430a-a1cc-39216d43f035 req-0632c710-469a-4dd9-bd53-05c22594e686 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.545 221324 INFO nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Creating config drive at /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/disk.config#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.549 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpk6zsjdnr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.580 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.687 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpk6zsjdnr" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.720 221324 DEBUG nova.storage.rbd_utils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.725 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/disk.config 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.931 221324 DEBUG oslo_concurrency.processutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/disk.config 8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.932 221324 INFO nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Deleting local config drive /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/disk.config because it was imported into RBD.#033[00m
Jan 31 03:04:38 np0005603622 kernel: tapa9638137-8c: entered promiscuous mode
Jan 31 03:04:38 np0005603622 NetworkManager[49080]: <info>  [1769846678.9922] manager: (tapa9638137-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Jan 31 03:04:38 np0005603622 nova_compute[221301]: 2026-01-31 08:04:38.994 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:38Z|00181|binding|INFO|Claiming lport a9638137-8ca3-40db-8068-46e2bd256380 for this chassis.
Jan 31 03:04:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:38Z|00182|binding|INFO|a9638137-8ca3-40db-8068-46e2bd256380: Claiming fa:16:3e:ad:7a:b9 10.100.0.10
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.010 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:7a:b9 10.100.0.10'], port_security=['fa:16:3e:ad:7a:b9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8653e14e-eb16-4c07-bc3c-ff1133b80425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '68e717e2-a49f-445d-8e1a-284e91505fef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=a9638137-8ca3-40db-8068-46e2bd256380) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.011 139793 INFO neutron.agent.ovn.metadata.agent [-] Port a9638137-8ca3-40db-8068-46e2bd256380 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.013 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:04:39 np0005603622 systemd-udevd[243159]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.027 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ef80c638-1761-46a4-882c-9909613cf863]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.028 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap455fab34-b1 in ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.031 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap455fab34-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.031 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1726c15f-690f-4332-a23d-49306719e72e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 systemd-machined[190406]: New machine qemu-25-instance-0000003d.
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.033 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f6faa21b-1be5-41f6-bd53-e93e4a41c56d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 NetworkManager[49080]: <info>  [1769846679.0395] device (tapa9638137-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:04:39 np0005603622 NetworkManager[49080]: <info>  [1769846679.0406] device (tapa9638137-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.046 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.046 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[09b6c205-9ffb-4762-b80c-17892e3284e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 systemd[1]: Started Virtual Machine qemu-25-instance-0000003d.
Jan 31 03:04:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:39Z|00183|binding|INFO|Setting lport a9638137-8ca3-40db-8068-46e2bd256380 ovn-installed in OVS
Jan 31 03:04:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:39Z|00184|binding|INFO|Setting lport a9638137-8ca3-40db-8068-46e2bd256380 up in Southbound
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.055 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.063 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[edb9ea5a-f9ab-4910-bd65-4d5a555cfcb4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.096 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e04b74-2f92-4260-8932-b148a729bfbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.101 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cce9a3f9-c52d-4198-9c73-64abf8e714bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 NetworkManager[49080]: <info>  [1769846679.1028] manager: (tap455fab34-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Jan 31 03:04:39 np0005603622 systemd-udevd[243162]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:04:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:39.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.131 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3ced298f-b9df-49ab-ae7d-ea7c43c756d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.136 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[66884b76-b3b7-4ffc-b5b3-d5cfa8d72494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 NetworkManager[49080]: <info>  [1769846679.1661] device (tap455fab34-b0): carrier: link connected
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.170 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2cd2fe-00df-4d05-a109-540d1f6a5226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.191 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d8869190-1c86-4c91-b4af-c38e02720513]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593050, 'reachable_time': 33303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243191, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.211 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0422a8-58fa-47f7-8f38-a493f9ffb443]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:8f98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593050, 'tstamp': 593050}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243192, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.236 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0e549af8-6ec2-4a8b-8899-92bbe2b87b2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593050, 'reachable_time': 33303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243193, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.276 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee7deb2-eb40-4aa0-b589-8fe296c00c0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.348 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[885a1aa8-ead1-4b3c-a6be-90a47cde8160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.349 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.350 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.350 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.354 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:39 np0005603622 kernel: tap455fab34-b0: entered promiscuous mode
Jan 31 03:04:39 np0005603622 NetworkManager[49080]: <info>  [1769846679.3549] manager: (tap455fab34-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.359 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:04:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:39Z|00185|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.362 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.364 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.365 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[00d7618f-717e-4818-9129-ca9f8fd8230a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.367 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:04:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:04:39.368 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'env', 'PROCESS_TAG=haproxy-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/455fab34-b015-4d97-a96d-f7ebd7f7555f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.369 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.511 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846679.5104668, 8653e14e-eb16-4c07-bc3c-ff1133b80425 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.512 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] VM Started (Lifecycle Event)#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.542 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.547 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846679.5107262, 8653e14e-eb16-4c07-bc3c-ff1133b80425 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.547 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.566 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.570 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.592 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:04:39 np0005603622 podman[243267]: 2026-01-31 08:04:39.756947441 +0000 UTC m=+0.057471252 container create 31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:04:39 np0005603622 systemd[1]: Started libpod-conmon-31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5.scope.
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.817 221324 DEBUG nova.compute.manager [req-5df83cf2-c9b6-45ec-9474-734503a44f89 req-a41ed4e4-0679-4c32-a14f-ef5d200c7d77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.818 221324 DEBUG oslo_concurrency.lockutils [req-5df83cf2-c9b6-45ec-9474-734503a44f89 req-a41ed4e4-0679-4c32-a14f-ef5d200c7d77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:39 np0005603622 podman[243267]: 2026-01-31 08:04:39.726739221 +0000 UTC m=+0.027263102 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.818 221324 DEBUG oslo_concurrency.lockutils [req-5df83cf2-c9b6-45ec-9474-734503a44f89 req-a41ed4e4-0679-4c32-a14f-ef5d200c7d77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.819 221324 DEBUG oslo_concurrency.lockutils [req-5df83cf2-c9b6-45ec-9474-734503a44f89 req-a41ed4e4-0679-4c32-a14f-ef5d200c7d77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.819 221324 DEBUG nova.compute.manager [req-5df83cf2-c9b6-45ec-9474-734503a44f89 req-a41ed4e4-0679-4c32-a14f-ef5d200c7d77 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Processing event network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.820 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.826 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846679.8260446, 8653e14e-eb16-4c07-bc3c-ff1133b80425 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:04:39 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.828 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.831 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:04:39 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af1f110a156dbffb72e14c8ff53936f91595b6e44bf50f737a0aeb140d8db8db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.837 221324 INFO nova.virt.libvirt.driver [-] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Instance spawned successfully.#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.837 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.856 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:39 np0005603622 podman[243267]: 2026-01-31 08:04:39.857509248 +0000 UTC m=+0.158033149 container init 31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:04:39 np0005603622 podman[243267]: 2026-01-31 08:04:39.867498986 +0000 UTC m=+0.168022787 container start 31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.867 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.872 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.872 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.873 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.874 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.874 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.875 221324 DEBUG nova.virt.libvirt.driver [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:04:39 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[243283]: [NOTICE]   (243287) : New worker (243289) forked
Jan 31 03:04:39 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[243283]: [NOTICE]   (243287) : Loading success.
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.911 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.967 221324 INFO nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Took 8.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:04:39 np0005603622 nova_compute[221301]: 2026-01-31 08:04:39.967 221324 DEBUG nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:04:40 np0005603622 nova_compute[221301]: 2026-01-31 08:04:40.069 221324 INFO nova.compute.manager [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Took 9.96 seconds to build instance.#033[00m
Jan 31 03:04:40 np0005603622 nova_compute[221301]: 2026-01-31 08:04:40.100 221324 DEBUG oslo_concurrency.lockutils [None req-a6bcd212-f390-4fce-b300-f9b4911eedf5 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:04:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:40.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:04:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:41.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:42 np0005603622 nova_compute[221301]: 2026-01-31 08:04:42.007 221324 DEBUG nova.compute.manager [req-3069e148-a960-472a-9aff-fc2d5705e654 req-169fac12-8983-40a7-9833-c6e369aca317 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:42 np0005603622 nova_compute[221301]: 2026-01-31 08:04:42.007 221324 DEBUG oslo_concurrency.lockutils [req-3069e148-a960-472a-9aff-fc2d5705e654 req-169fac12-8983-40a7-9833-c6e369aca317 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:04:42 np0005603622 nova_compute[221301]: 2026-01-31 08:04:42.009 221324 DEBUG oslo_concurrency.lockutils [req-3069e148-a960-472a-9aff-fc2d5705e654 req-169fac12-8983-40a7-9833-c6e369aca317 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:04:42 np0005603622 nova_compute[221301]: 2026-01-31 08:04:42.009 221324 DEBUG oslo_concurrency.lockutils [req-3069e148-a960-472a-9aff-fc2d5705e654 req-169fac12-8983-40a7-9833-c6e369aca317 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:04:42 np0005603622 nova_compute[221301]: 2026-01-31 08:04:42.010 221324 DEBUG nova.compute.manager [req-3069e148-a960-472a-9aff-fc2d5705e654 req-169fac12-8983-40a7-9833-c6e369aca317 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] No waiting events found dispatching network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:04:42 np0005603622 nova_compute[221301]: 2026-01-31 08:04:42.010 221324 WARNING nova.compute.manager [req-3069e148-a960-472a-9aff-fc2d5705e654 req-169fac12-8983-40a7-9833-c6e369aca317 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received unexpected event network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:04:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:42.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:42 np0005603622 nova_compute[221301]: 2026-01-31 08:04:42.896 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:43.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:04:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:04:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:04:43 np0005603622 nova_compute[221301]: 2026-01-31 08:04:43.572 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:43 np0005603622 nova_compute[221301]: 2026-01-31 08:04:43.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:43 np0005603622 NetworkManager[49080]: <info>  [1769846683.8497] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 31 03:04:43 np0005603622 NetworkManager[49080]: <info>  [1769846683.8509] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 31 03:04:43 np0005603622 nova_compute[221301]: 2026-01-31 08:04:43.910 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:43Z|00186|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:04:43 np0005603622 nova_compute[221301]: 2026-01-31 08:04:43.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:04:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:44.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:04:44 np0005603622 podman[243429]: 2026-01-31 08:04:44.210718884 +0000 UTC m=+0.123058601 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:04:44 np0005603622 nova_compute[221301]: 2026-01-31 08:04:44.390 221324 DEBUG nova.compute.manager [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-changed-a9638137-8ca3-40db-8068-46e2bd256380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:04:44 np0005603622 nova_compute[221301]: 2026-01-31 08:04:44.391 221324 DEBUG nova.compute.manager [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Refreshing instance network info cache due to event network-changed-a9638137-8ca3-40db-8068-46e2bd256380. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:04:44 np0005603622 nova_compute[221301]: 2026-01-31 08:04:44.392 221324 DEBUG oslo_concurrency.lockutils [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:04:44 np0005603622 nova_compute[221301]: 2026-01-31 08:04:44.392 221324 DEBUG oslo_concurrency.lockutils [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:04:44 np0005603622 nova_compute[221301]: 2026-01-31 08:04:44.393 221324 DEBUG nova.network.neutron [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Refreshing network info cache for port a9638137-8ca3-40db-8068-46e2bd256380 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:04:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:45.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:46.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e228 e228: 3 total, 3 up, 3 in
Jan 31 03:04:46 np0005603622 nova_compute[221301]: 2026-01-31 08:04:46.638 221324 DEBUG nova.network.neutron [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updated VIF entry in instance network info cache for port a9638137-8ca3-40db-8068-46e2bd256380. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:04:46 np0005603622 nova_compute[221301]: 2026-01-31 08:04:46.639 221324 DEBUG nova.network.neutron [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:04:46 np0005603622 nova_compute[221301]: 2026-01-31 08:04:46.672 221324 DEBUG oslo_concurrency.lockutils [req-d558ee3e-8474-4256-8ede-d7d05ed36c82 req-3313560a-b64b-478c-b4da-0c03eda557c0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:04:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:47.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:47 np0005603622 nova_compute[221301]: 2026-01-31 08:04:47.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:48.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:48 np0005603622 nova_compute[221301]: 2026-01-31 08:04:48.619 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:48 np0005603622 nova_compute[221301]: 2026-01-31 08:04:48.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e229 e229: 3 total, 3 up, 3 in
Jan 31 03:04:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:49.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:49 np0005603622 podman[243455]: 2026-01-31 08:04:49.159421309 +0000 UTC m=+0.079822281 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:04:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:04:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:04:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:50.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e229 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e230 e230: 3 total, 3 up, 3 in
Jan 31 03:04:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:51.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:52.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:52 np0005603622 nova_compute[221301]: 2026-01-31 08:04:52.904 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e231 e231: 3 total, 3 up, 3 in
Jan 31 03:04:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:53.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:53 np0005603622 nova_compute[221301]: 2026-01-31 08:04:53.653 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:54.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:54Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ad:7a:b9 10.100.0.10
Jan 31 03:04:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:04:54Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:7a:b9 10.100.0.10
Jan 31 03:04:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:55.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:04:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:56.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:04:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:57.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:04:57 np0005603622 nova_compute[221301]: 2026-01-31 08:04:57.908 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e232 e232: 3 total, 3 up, 3 in
Jan 31 03:04:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:04:58.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:58 np0005603622 nova_compute[221301]: 2026-01-31 08:04:58.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:04:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:04:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:04:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:04:59.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.648766) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699648949, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1043, "num_deletes": 259, "total_data_size": 1840675, "memory_usage": 1874800, "flush_reason": "Manual Compaction"}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699658248, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1210284, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35652, "largest_seqno": 36690, "table_properties": {"data_size": 1205423, "index_size": 2385, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11182, "raw_average_key_size": 20, "raw_value_size": 1195517, "raw_average_value_size": 2213, "num_data_blocks": 104, "num_entries": 540, "num_filter_entries": 540, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846648, "oldest_key_time": 1769846648, "file_creation_time": 1769846699, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 9575 microseconds, and 3645 cpu microseconds.
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.658350) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1210284 bytes OK
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.658376) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.659653) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.659669) EVENT_LOG_v1 {"time_micros": 1769846699659663, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.659693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 1835419, prev total WAL file size 1835419, number of live WAL files 2.
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.660217) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1181KB)], [66(9380KB)]
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699660267, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 10815651, "oldest_snapshot_seqno": -1}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6066 keys, 8862090 bytes, temperature: kUnknown
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699703639, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 8862090, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8821864, "index_size": 23988, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 155944, "raw_average_key_size": 25, "raw_value_size": 8713452, "raw_average_value_size": 1436, "num_data_blocks": 958, "num_entries": 6066, "num_filter_entries": 6066, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846699, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.703903) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 8862090 bytes
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.705393) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 248.9 rd, 203.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 9.2 +0.0 blob) out(8.5 +0.0 blob), read-write-amplify(16.3) write-amplify(7.3) OK, records in: 6596, records dropped: 530 output_compression: NoCompression
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.705409) EVENT_LOG_v1 {"time_micros": 1769846699705401, "job": 40, "event": "compaction_finished", "compaction_time_micros": 43454, "compaction_time_cpu_micros": 18767, "output_level": 6, "num_output_files": 1, "total_output_size": 8862090, "num_input_records": 6596, "num_output_records": 6066, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699705697, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846699706579, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.660159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.706651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.706655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.706660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.706661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:04:59 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:04:59.706665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:05:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:00.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:01.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e233 e233: 3 total, 3 up, 3 in
Jan 31 03:05:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:02.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:02 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:02Z|00187|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:05:02 np0005603622 nova_compute[221301]: 2026-01-31 08:05:02.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:02 np0005603622 nova_compute[221301]: 2026-01-31 08:05:02.909 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:03 np0005603622 nova_compute[221301]: 2026-01-31 08:05:03.016 221324 DEBUG oslo_concurrency.lockutils [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-8653e14e-eb16-4c07-bc3c-ff1133b80425-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:03 np0005603622 nova_compute[221301]: 2026-01-31 08:05:03.017 221324 DEBUG oslo_concurrency.lockutils [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-8653e14e-eb16-4c07-bc3c-ff1133b80425-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:03 np0005603622 nova_compute[221301]: 2026-01-31 08:05:03.018 221324 DEBUG nova.objects.instance [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:03.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:03 np0005603622 nova_compute[221301]: 2026-01-31 08:05:03.696 221324 DEBUG nova.objects.instance [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:03 np0005603622 nova_compute[221301]: 2026-01-31 08:05:03.732 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:03 np0005603622 nova_compute[221301]: 2026-01-31 08:05:03.929 221324 DEBUG nova.network.neutron [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:05:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:04.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:04 np0005603622 nova_compute[221301]: 2026-01-31 08:05:04.276 221324 DEBUG nova.policy [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:05:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e234 e234: 3 total, 3 up, 3 in
Jan 31 03:05:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:05.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:06.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e235 e235: 3 total, 3 up, 3 in
Jan 31 03:05:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:07.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:07 np0005603622 nova_compute[221301]: 2026-01-31 08:05:07.221 221324 DEBUG nova.network.neutron [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Successfully created port: e23aacbf-a775-4d28-941e-3359aa4807a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:05:07 np0005603622 nova_compute[221301]: 2026-01-31 08:05:07.912 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:08.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:08 np0005603622 nova_compute[221301]: 2026-01-31 08:05:08.735 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:09.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:09 np0005603622 nova_compute[221301]: 2026-01-31 08:05:09.288 221324 DEBUG nova.network.neutron [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Successfully updated port: e23aacbf-a775-4d28-941e-3359aa4807a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:05:09 np0005603622 nova_compute[221301]: 2026-01-31 08:05:09.368 221324 DEBUG oslo_concurrency.lockutils [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:05:09 np0005603622 nova_compute[221301]: 2026-01-31 08:05:09.369 221324 DEBUG oslo_concurrency.lockutils [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:05:09 np0005603622 nova_compute[221301]: 2026-01-31 08:05:09.369 221324 DEBUG nova.network.neutron [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:05:09 np0005603622 nova_compute[221301]: 2026-01-31 08:05:09.697 221324 DEBUG nova.compute.manager [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-changed-e23aacbf-a775-4d28-941e-3359aa4807a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:09 np0005603622 nova_compute[221301]: 2026-01-31 08:05:09.697 221324 DEBUG nova.compute.manager [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Refreshing instance network info cache due to event network-changed-e23aacbf-a775-4d28-941e-3359aa4807a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:05:09 np0005603622 nova_compute[221301]: 2026-01-31 08:05:09.698 221324 DEBUG oslo_concurrency.lockutils [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:05:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:05:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:10.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:05:10 np0005603622 nova_compute[221301]: 2026-01-31 08:05:10.367 221324 WARNING nova.network.neutron [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:05:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:11.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:12.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:12 np0005603622 nova_compute[221301]: 2026-01-31 08:05:12.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:12 np0005603622 nova_compute[221301]: 2026-01-31 08:05:12.916 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e236 e236: 3 total, 3 up, 3 in
Jan 31 03:05:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:13.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:13 np0005603622 nova_compute[221301]: 2026-01-31 08:05:13.737 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:05:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:14.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:05:15 np0005603622 podman[243525]: 2026-01-31 08:05:15.169343831 +0000 UTC m=+0.085464313 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:05:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:05:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:15.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.602 221324 DEBUG nova.network.neutron [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.637 221324 DEBUG oslo_concurrency.lockutils [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.639 221324 DEBUG oslo_concurrency.lockutils [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.639 221324 DEBUG nova.network.neutron [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Refreshing network info cache for port e23aacbf-a775-4d28-941e-3359aa4807a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.643 221324 DEBUG nova.virt.libvirt.vif [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.643 221324 DEBUG nova.network.os_vif_util [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.644 221324 DEBUG nova.network.os_vif_util [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.645 221324 DEBUG os_vif [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.646 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.646 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.647 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.650 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.650 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape23aacbf-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.651 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape23aacbf-a7, col_values=(('external_ids', {'iface-id': 'e23aacbf-a775-4d28-941e-3359aa4807a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:cd:72', 'vm-uuid': '8653e14e-eb16-4c07-bc3c-ff1133b80425'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.653 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 NetworkManager[49080]: <info>  [1769846715.6544] manager: (tape23aacbf-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.655 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.667 221324 INFO os_vif [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7')#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.669 221324 DEBUG nova.virt.libvirt.vif [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.669 221324 DEBUG nova.network.os_vif_util [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.670 221324 DEBUG nova.network.os_vif_util [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.675 221324 DEBUG nova.virt.libvirt.guest [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] attach device xml: <interface type="ethernet">
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:33:cd:72"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <target dev="tape23aacbf-a7"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]: </interface>
Jan 31 03:05:15 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:05:15 np0005603622 kernel: tape23aacbf-a7: entered promiscuous mode
Jan 31 03:05:15 np0005603622 NetworkManager[49080]: <info>  [1769846715.6963] manager: (tape23aacbf-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 31 03:05:15 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:15Z|00188|binding|INFO|Claiming lport e23aacbf-a775-4d28-941e-3359aa4807a3 for this chassis.
Jan 31 03:05:15 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:15Z|00189|binding|INFO|e23aacbf-a775-4d28-941e-3359aa4807a3: Claiming fa:16:3e:33:cd:72 10.100.0.14
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.698 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:15Z|00190|binding|INFO|Setting lport e23aacbf-a775-4d28-941e-3359aa4807a3 ovn-installed in OVS
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.713 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:15Z|00191|binding|INFO|Setting lport e23aacbf-a775-4d28-941e-3359aa4807a3 up in Southbound
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.716 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:cd:72 10.100.0.14'], port_security=['fa:16:3e:33:cd:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8653e14e-eb16-4c07-bc3c-ff1133b80425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=e23aacbf-a775-4d28-941e-3359aa4807a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.717 139793 INFO neutron.agent.ovn.metadata.agent [-] Port e23aacbf-a775-4d28-941e-3359aa4807a3 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.718 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:05:15 np0005603622 systemd-udevd[243557]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.735 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9c826e82-2f2b-4b32-98f8-fb02c05af30e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:15 np0005603622 NetworkManager[49080]: <info>  [1769846715.7408] device (tape23aacbf-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:05:15 np0005603622 NetworkManager[49080]: <info>  [1769846715.7419] device (tape23aacbf-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:05:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.760 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f17d10cc-2994-4753-8930-90e5b2f0b135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.763 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1b790b57-922d-4506-a2a8-1bebecf105b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.793 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9e45d4-7ff7-4d48-8beb-28d499aa5520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.807 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ab87af6f-5a00-4d78-ae75-19f4d2daca1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593050, 'reachable_time': 33303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243565, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.812 221324 DEBUG nova.virt.libvirt.driver [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.813 221324 DEBUG nova.virt.libvirt.driver [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.813 221324 DEBUG nova.virt.libvirt.driver [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:ad:7a:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.813 221324 DEBUG nova.virt.libvirt.driver [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:33:cd:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.825 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2ae20ff1-5e02-41b2-aa95-c1bf56d376e4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593064, 'tstamp': 593064}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243566, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593068, 'tstamp': 593068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243566, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.828 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.829 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.831 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.831 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.832 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.832 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:15.832 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.876 221324 DEBUG nova.virt.libvirt.guest [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:05:15</nova:creationTime>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:05:15 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    <nova:port uuid="e23aacbf-a775-4d28-941e-3359aa4807a3">
Jan 31 03:05:15 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:15 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:05:15 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:05:15 np0005603622 nova_compute[221301]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:05:15 np0005603622 nova_compute[221301]: 2026-01-31 08:05:15.926 221324 DEBUG oslo_concurrency.lockutils [None req-ada4bb61-69d0-42fe-b888-1b69bbb70d2b 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-8653e14e-eb16-4c07-bc3c-ff1133b80425-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 12.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:16.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:16 np0005603622 nova_compute[221301]: 2026-01-31 08:05:16.239 221324 DEBUG nova.compute.manager [req-5aaad87f-ec5f-4b93-b9fb-6502d2c3f90e req-be1e1779-78f8-43f2-a3bd-a7a9f4c827f5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-plugged-e23aacbf-a775-4d28-941e-3359aa4807a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:16 np0005603622 nova_compute[221301]: 2026-01-31 08:05:16.240 221324 DEBUG oslo_concurrency.lockutils [req-5aaad87f-ec5f-4b93-b9fb-6502d2c3f90e req-be1e1779-78f8-43f2-a3bd-a7a9f4c827f5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:16 np0005603622 nova_compute[221301]: 2026-01-31 08:05:16.240 221324 DEBUG oslo_concurrency.lockutils [req-5aaad87f-ec5f-4b93-b9fb-6502d2c3f90e req-be1e1779-78f8-43f2-a3bd-a7a9f4c827f5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:16 np0005603622 nova_compute[221301]: 2026-01-31 08:05:16.240 221324 DEBUG oslo_concurrency.lockutils [req-5aaad87f-ec5f-4b93-b9fb-6502d2c3f90e req-be1e1779-78f8-43f2-a3bd-a7a9f4c827f5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:16 np0005603622 nova_compute[221301]: 2026-01-31 08:05:16.241 221324 DEBUG nova.compute.manager [req-5aaad87f-ec5f-4b93-b9fb-6502d2c3f90e req-be1e1779-78f8-43f2-a3bd-a7a9f4c827f5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] No waiting events found dispatching network-vif-plugged-e23aacbf-a775-4d28-941e-3359aa4807a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:05:16 np0005603622 nova_compute[221301]: 2026-01-31 08:05:16.241 221324 WARNING nova.compute.manager [req-5aaad87f-ec5f-4b93-b9fb-6502d2c3f90e req-be1e1779-78f8-43f2-a3bd-a7a9f4c827f5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received unexpected event network-vif-plugged-e23aacbf-a775-4d28-941e-3359aa4807a3 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:05:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:17.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e237 e237: 3 total, 3 up, 3 in
Jan 31 03:05:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:18.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.413 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.413 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.434 221324 DEBUG nova.compute.manager [req-c9902fa7-61ab-4501-8989-0adc29bae5a3 req-33cd6448-a790-48ae-beca-966d263157ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-plugged-e23aacbf-a775-4d28-941e-3359aa4807a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.434 221324 DEBUG oslo_concurrency.lockutils [req-c9902fa7-61ab-4501-8989-0adc29bae5a3 req-33cd6448-a790-48ae-beca-966d263157ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.435 221324 DEBUG oslo_concurrency.lockutils [req-c9902fa7-61ab-4501-8989-0adc29bae5a3 req-33cd6448-a790-48ae-beca-966d263157ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.435 221324 DEBUG oslo_concurrency.lockutils [req-c9902fa7-61ab-4501-8989-0adc29bae5a3 req-33cd6448-a790-48ae-beca-966d263157ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.436 221324 DEBUG nova.compute.manager [req-c9902fa7-61ab-4501-8989-0adc29bae5a3 req-33cd6448-a790-48ae-beca-966d263157ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] No waiting events found dispatching network-vif-plugged-e23aacbf-a775-4d28-941e-3359aa4807a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.436 221324 WARNING nova.compute.manager [req-c9902fa7-61ab-4501-8989-0adc29bae5a3 req-33cd6448-a790-48ae-beca-966d263157ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received unexpected event network-vif-plugged-e23aacbf-a775-4d28-941e-3359aa4807a3 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:05:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:18Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:cd:72 10.100.0.14
Jan 31 03:05:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:18Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:cd:72 10.100.0.14
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.685 221324 DEBUG nova.network.neutron [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updated VIF entry in instance network info cache for port e23aacbf-a775-4d28-941e-3359aa4807a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.685 221324 DEBUG nova.network.neutron [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.705 221324 DEBUG oslo_concurrency.lockutils [req-8386ac40-7ab9-426f-82b9-9d8e2666bd34 req-cea39cdd-a5a7-4050-8fb0-9638d317e8c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:05:18 np0005603622 nova_compute[221301]: 2026-01-31 08:05:18.739 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.075 221324 DEBUG oslo_concurrency.lockutils [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-8653e14e-eb16-4c07-bc3c-ff1133b80425-e23aacbf-a775-4d28-941e-3359aa4807a3" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.075 221324 DEBUG oslo_concurrency.lockutils [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-8653e14e-eb16-4c07-bc3c-ff1133b80425-e23aacbf-a775-4d28-941e-3359aa4807a3" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.127 221324 DEBUG nova.objects.instance [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.157 221324 DEBUG nova.virt.libvirt.vif [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.157 221324 DEBUG nova.network.os_vif_util [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.158 221324 DEBUG nova.network.os_vif_util [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.162 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.166 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.169 221324 DEBUG nova.virt.libvirt.driver [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Attempting to detach device tape23aacbf-a7 from instance 8653e14e-eb16-4c07-bc3c-ff1133b80425 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.170 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:33:cd:72"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <target dev="tape23aacbf-a7"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: </interface>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.179 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:05:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:19.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.184 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface>not found in domain: <domain type='kvm' id='25'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <name>instance-0000003d</name>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <uuid>8653e14e-eb16-4c07-bc3c-ff1133b80425</uuid>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:05:15</nova:creationTime>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:port uuid="e23aacbf-a775-4d28-941e-3359aa4807a3">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <memory unit='KiB'>131072</memory>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <resource>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <partition>/machine</partition>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </resource>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <sysinfo type='smbios'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='serial'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='uuid'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <boot dev='hd'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <smbios mode='sysinfo'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <vmcoreinfo state='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <feature policy='require' name='x2apic'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <feature policy='require' name='vme'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <clock offset='utc'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <timer name='hpet' present='no'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <on_reboot>restart</on_reboot>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <on_crash>destroy</on_crash>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <disk type='network' device='disk'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk' index='2'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target dev='vda' bus='virtio'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='virtio-disk0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <disk type='network' device='cdrom'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config' index='1'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target dev='sda' bus='sata'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <readonly/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='sata0-0-0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pcie.0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='1' port='0x10'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='2' port='0x11'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='3' port='0x12'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='4' port='0x13'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='5' port='0x14'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='6' port='0x15'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='7' port='0x16'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='8' port='0x17'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.8'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='9' port='0x18'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.9'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='10' port='0x19'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.10'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='11' port='0x1a'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.11'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='12' port='0x1b'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.12'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='13' port='0x1c'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.13'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='14' port='0x1d'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.14'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='15' port='0x1e'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.15'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='16' port='0x1f'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.16'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='17' port='0x20'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.17'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='18' port='0x21'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.18'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='19' port='0x22'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.19'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='20' port='0x23'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.20'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='21' port='0x24'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.21'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='22' port='0x25'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.22'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='23' port='0x26'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.23'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='24' port='0x27'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.24'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='25' port='0x28'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.25'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-pci-bridge'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.26'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='usb'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='sata' index='0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='ide'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:ad:7a:b9'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target dev='tapa9638137-8c'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='net0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:33:cd:72'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target dev='tape23aacbf-a7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='net1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <serial type='pty'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target type='isa-serial' port='0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <model name='isa-serial'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </target>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target type='serial' port='0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </console>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <input type='tablet' bus='usb'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='input0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <input type='mouse' bus='ps2'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='input1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <input type='keyboard' bus='ps2'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='input2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <listen type='address' address='::0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <audio id='1' type='none'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='video0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <watchdog model='itco' action='reset'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='watchdog0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </watchdog>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <memballoon model='virtio'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <stats period='10'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='balloon0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <rng model='virtio'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='rng0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <label>system_u:system_r:svirt_t:s0:c452,c767</label>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c452,c767</imagelabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <label>+107:+107</label>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.185 221324 INFO nova.virt.libvirt.driver [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tape23aacbf-a7 from instance 8653e14e-eb16-4c07-bc3c-ff1133b80425 from the persistent domain config.#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.186 221324 DEBUG nova.virt.libvirt.driver [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] (1/8): Attempting to detach device tape23aacbf-a7 with device alias net1 from instance 8653e14e-eb16-4c07-bc3c-ff1133b80425 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.187 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:33:cd:72"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <target dev="tape23aacbf-a7"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: </interface>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:05:19 np0005603622 kernel: tape23aacbf-a7 (unregistering): left promiscuous mode
Jan 31 03:05:19 np0005603622 NetworkManager[49080]: <info>  [1769846719.2530] device (tape23aacbf-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.263 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:19Z|00192|binding|INFO|Releasing lport e23aacbf-a775-4d28-941e-3359aa4807a3 from this chassis (sb_readonly=0)
Jan 31 03:05:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:19Z|00193|binding|INFO|Setting lport e23aacbf-a775-4d28-941e-3359aa4807a3 down in Southbound
Jan 31 03:05:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:19Z|00194|binding|INFO|Removing iface tape23aacbf-a7 ovn-installed in OVS
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.266 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769846719.2637792, 8653e14e-eb16-4c07-bc3c-ff1133b80425 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.268 221324 DEBUG nova.virt.libvirt.driver [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Start waiting for the detach event from libvirt for device tape23aacbf-a7 with device alias net1 for instance 8653e14e-eb16-4c07-bc3c-ff1133b80425 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.270 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.271 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:cd:72 10.100.0.14'], port_security=['fa:16:3e:33:cd:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8653e14e-eb16-4c07-bc3c-ff1133b80425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=e23aacbf-a775-4d28-941e-3359aa4807a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.273 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.272 139793 INFO neutron.agent.ovn.metadata.agent [-] Port e23aacbf-a775-4d28-941e-3359aa4807a3 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.273 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.278 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface>not found in domain: <domain type='kvm' id='25'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <name>instance-0000003d</name>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <uuid>8653e14e-eb16-4c07-bc3c-ff1133b80425</uuid>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:05:15</nova:creationTime>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:port uuid="e23aacbf-a775-4d28-941e-3359aa4807a3">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <memory unit='KiB'>131072</memory>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <resource>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <partition>/machine</partition>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </resource>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <sysinfo type='smbios'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='serial'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='uuid'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <boot dev='hd'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <smbios mode='sysinfo'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <vmcoreinfo state='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <feature policy='require' name='x2apic'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <feature policy='require' name='vme'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <clock offset='utc'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <timer name='hpet' present='no'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <on_reboot>restart</on_reboot>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <on_crash>destroy</on_crash>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <disk type='network' device='disk'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk' index='2'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target dev='vda' bus='virtio'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='virtio-disk0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <disk type='network' device='cdrom'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config' index='1'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target dev='sda' bus='sata'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <readonly/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='sata0-0-0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pcie.0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='1' port='0x10'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='2' port='0x11'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='3' port='0x12'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='4' port='0x13'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='5' port='0x14'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='6' port='0x15'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='7' port='0x16'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='8' port='0x17'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.8'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='9' port='0x18'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.9'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='10' port='0x19'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.10'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='11' port='0x1a'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.11'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='12' port='0x1b'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.12'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='13' port='0x1c'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.13'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='14' port='0x1d'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.14'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='15' port='0x1e'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.15'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='16' port='0x1f'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.16'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='17' port='0x20'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.17'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='18' port='0x21'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.18'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='19' port='0x22'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.19'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='20' port='0x23'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.20'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='21' port='0x24'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.21'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='22' port='0x25'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.22'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='23' port='0x26'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.23'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='24' port='0x27'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.24'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target chassis='25' port='0x28'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.25'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model name='pcie-pci-bridge'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='pci.26'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='usb'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <controller type='sata' index='0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='ide'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:ad:7a:b9'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target dev='tapa9638137-8c'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='net0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <serial type='pty'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target type='isa-serial' port='0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:        <model name='isa-serial'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      </target>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <target type='serial' port='0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </console>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <input type='tablet' bus='usb'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='input0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <input type='mouse' bus='ps2'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='input1'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <input type='keyboard' bus='ps2'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='input2'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <listen type='address' address='::0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <audio id='1' type='none'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='video0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <watchdog model='itco' action='reset'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='watchdog0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </watchdog>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <memballoon model='virtio'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <stats period='10'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='balloon0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <rng model='virtio'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <alias name='rng0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <label>system_u:system_r:svirt_t:s0:c452,c767</label>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c452,c767</imagelabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <label>+107:+107</label>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.280 221324 INFO nova.virt.libvirt.driver [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tape23aacbf-a7 from instance 8653e14e-eb16-4c07-bc3c-ff1133b80425 from the live domain config.#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.281 221324 DEBUG nova.virt.libvirt.vif [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.281 221324 DEBUG nova.network.os_vif_util [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.282 221324 DEBUG nova.network.os_vif_util [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.283 221324 DEBUG os_vif [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.285 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.286 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape23aacbf-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.288 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.290 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.292 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[016b0fc6-ec51-4c96-bba0-5d7e5b3a4f36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.295 221324 INFO os_vif [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7')#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.296 221324 DEBUG nova.virt.libvirt.guest [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:05:19</nova:creationTime>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:05:19 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:19 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:05:19 np0005603622 nova_compute[221301]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.327 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcd534b-0aa0-4451-ad5f-19104a7926b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.330 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b090aa-4fb7-453f-9c42-1f6a5e02f4b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.355 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b576b9-a8cb-41bf-a2d6-77ca94d4824d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:19 np0005603622 podman[243568]: 2026-01-31 08:05:19.370042026 +0000 UTC m=+0.078275330 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.372 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a321345a-8fac-4004-b1be-e88388b88cd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593050, 'reachable_time': 33303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243596, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.388 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[44ece322-58c8-4535-9b1e-8b33addd2270]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593064, 'tstamp': 593064}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243599, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593068, 'tstamp': 593068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243599, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.390 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:19 np0005603622 nova_compute[221301]: 2026-01-31 08:05:19.392 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.394 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.395 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.395 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:19.396 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:20.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:21.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.306 221324 DEBUG oslo_concurrency.lockutils [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.307 221324 DEBUG oslo_concurrency.lockutils [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.308 221324 DEBUG nova.network.neutron [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.656 221324 DEBUG nova.compute.manager [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-deleted-e23aacbf-a775-4d28-941e-3359aa4807a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.657 221324 INFO nova.compute.manager [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Neutron deleted interface e23aacbf-a775-4d28-941e-3359aa4807a3; detaching it from the instance and deleting it from the info cache
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.657 221324 DEBUG nova.network.neutron [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.693 221324 DEBUG nova.objects.instance [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lazy-loading 'system_metadata' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.764 221324 DEBUG nova.objects.instance [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lazy-loading 'flavor' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.793 221324 DEBUG nova.virt.libvirt.vif [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.793 221324 DEBUG nova.network.os_vif_util [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converting VIF {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.794 221324 DEBUG nova.network.os_vif_util [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.798 221324 DEBUG nova.virt.libvirt.guest [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.802 221324 DEBUG nova.virt.libvirt.guest [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface>not found in domain: <domain type='kvm' id='25'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <name>instance-0000003d</name>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <uuid>8653e14e-eb16-4c07-bc3c-ff1133b80425</uuid>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:05:19</nova:creationTime>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <memory unit='KiB'>131072</memory>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <resource>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <partition>/machine</partition>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </resource>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <sysinfo type='smbios'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='serial'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='uuid'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <boot dev='hd'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <smbios mode='sysinfo'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <vmcoreinfo state='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <feature policy='require' name='x2apic'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <feature policy='require' name='vme'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <clock offset='utc'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <timer name='hpet' present='no'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <on_reboot>restart</on_reboot>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <on_crash>destroy</on_crash>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <disk type='network' device='disk'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk' index='2'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target dev='vda' bus='virtio'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='virtio-disk0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <disk type='network' device='cdrom'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config' index='1'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target dev='sda' bus='sata'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <readonly/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='sata0-0-0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pcie.0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='1' port='0x10'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='2' port='0x11'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='3' port='0x12'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='4' port='0x13'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='5' port='0x14'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='6' port='0x15'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='7' port='0x16'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='8' port='0x17'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.8'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='9' port='0x18'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.9'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='10' port='0x19'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.10'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='11' port='0x1a'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.11'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='12' port='0x1b'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.12'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='13' port='0x1c'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.13'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='14' port='0x1d'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.14'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='15' port='0x1e'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.15'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='16' port='0x1f'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.16'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='17' port='0x20'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.17'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='18' port='0x21'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.18'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='19' port='0x22'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.19'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='20' port='0x23'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.20'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='21' port='0x24'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.21'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='22' port='0x25'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.22'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='23' port='0x26'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.23'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='24' port='0x27'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.24'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='25' port='0x28'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.25'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-pci-bridge'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.26'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='usb'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='sata' index='0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='ide'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:ad:7a:b9'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target dev='tapa9638137-8c'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='net0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <serial type='pty'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target type='isa-serial' port='0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <model name='isa-serial'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </target>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target type='serial' port='0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </console>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <input type='tablet' bus='usb'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='input0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <input type='mouse' bus='ps2'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='input1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <input type='keyboard' bus='ps2'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='input2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <listen type='address' address='::0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <audio id='1' type='none'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='video0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <watchdog model='itco' action='reset'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='watchdog0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </watchdog>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <memballoon model='virtio'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <stats period='10'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='balloon0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <rng model='virtio'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='rng0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <label>system_u:system_r:svirt_t:s0:c452,c767</label>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c452,c767</imagelabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <label>+107:+107</label>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.802 221324 DEBUG nova.virt.libvirt.guest [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.806 221324 DEBUG nova.virt.libvirt.guest [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:33:cd:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape23aacbf-a7"/></interface> not found in domain: <domain type='kvm' id='25'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <name>instance-0000003d</name>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <uuid>8653e14e-eb16-4c07-bc3c-ff1133b80425</uuid>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:05:19</nova:creationTime>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <memory unit='KiB'>131072</memory>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <resource>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <partition>/machine</partition>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </resource>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <sysinfo type='smbios'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='serial'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='uuid'>8653e14e-eb16-4c07-bc3c-ff1133b80425</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <boot dev='hd'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <smbios mode='sysinfo'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <vmcoreinfo state='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <feature policy='require' name='x2apic'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <feature policy='require' name='vme'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <clock offset='utc'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <timer name='hpet' present='no'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <on_reboot>restart</on_reboot>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <on_crash>destroy</on_crash>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <disk type='network' device='disk'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk' index='2'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target dev='vda' bus='virtio'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='virtio-disk0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <disk type='network' device='cdrom'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/8653e14e-eb16-4c07-bc3c-ff1133b80425_disk.config' index='1'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target dev='sda' bus='sata'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <readonly/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='sata0-0-0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pcie.0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='1' port='0x10'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='2' port='0x11'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='3' port='0x12'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='4' port='0x13'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='5' port='0x14'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='6' port='0x15'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='7' port='0x16'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='8' port='0x17'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.8'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='9' port='0x18'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.9'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='10' port='0x19'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.10'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='11' port='0x1a'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.11'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='12' port='0x1b'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.12'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='13' port='0x1c'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.13'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='14' port='0x1d'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.14'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='15' port='0x1e'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.15'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='16' port='0x1f'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.16'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='17' port='0x20'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.17'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='18' port='0x21'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.18'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='19' port='0x22'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.19'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='20' port='0x23'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.20'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='21' port='0x24'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.21'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='22' port='0x25'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.22'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='23' port='0x26'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.23'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='24' port='0x27'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.24'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target chassis='25' port='0x28'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.25'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model name='pcie-pci-bridge'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='pci.26'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='usb'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <controller type='sata' index='0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='ide'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:ad:7a:b9'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target dev='tapa9638137-8c'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='net0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <serial type='pty'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target type='isa-serial' port='0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:        <model name='isa-serial'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      </target>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425/console.log' append='off'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <target type='serial' port='0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </console>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <input type='tablet' bus='usb'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='input0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <input type='mouse' bus='ps2'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='input1'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <input type='keyboard' bus='ps2'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='input2'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <listen type='address' address='::0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <audio id='1' type='none'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='video0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <watchdog model='itco' action='reset'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='watchdog0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </watchdog>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <memballoon model='virtio'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <stats period='10'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='balloon0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <rng model='virtio'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <alias name='rng0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <label>system_u:system_r:svirt_t:s0:c452,c767</label>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c452,c767</imagelabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <label>+107:+107</label>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.806 221324 WARNING nova.virt.libvirt.driver [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Detaching interface fa:16:3e:33:cd:72 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tape23aacbf-a7' not found.#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.807 221324 DEBUG nova.virt.libvirt.vif [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.808 221324 DEBUG nova.network.os_vif_util [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converting VIF {"id": "e23aacbf-a775-4d28-941e-3359aa4807a3", "address": "fa:16:3e:33:cd:72", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape23aacbf-a7", "ovs_interfaceid": "e23aacbf-a775-4d28-941e-3359aa4807a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.808 221324 DEBUG nova.network.os_vif_util [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.809 221324 DEBUG os_vif [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.811 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.811 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape23aacbf-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.811 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.814 221324 INFO os_vif [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:cd:72,bridge_name='br-int',has_traffic_filtering=True,id=e23aacbf-a775-4d28-941e-3359aa4807a3,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape23aacbf-a7')#033[00m
Jan 31 03:05:21 np0005603622 nova_compute[221301]: 2026-01-31 08:05:21.815 221324 DEBUG nova.virt.libvirt.guest [req-262a4a17-3a11-4ab1-a32f-17fad660ae89 req-4301984e-2085-418a-be4a-c8122cc9dcb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:name>tempest-AttachInterfacesTestJSON-server-859264707</nova:name>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:05:21</nova:creationTime>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    <nova:port uuid="a9638137-8ca3-40db-8068-46e2bd256380">
Jan 31 03:05:21 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:05:21 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:05:21 np0005603622 nova_compute[221301]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:05:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:22.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.303 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.303 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.303 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.303 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.304 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.305 221324 INFO nova.compute.manager [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Terminating instance#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.305 221324 DEBUG nova.compute.manager [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:05:22 np0005603622 kernel: tapa9638137-8c (unregistering): left promiscuous mode
Jan 31 03:05:22 np0005603622 NetworkManager[49080]: <info>  [1769846722.3660] device (tapa9638137-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:05:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:22Z|00195|binding|INFO|Releasing lport a9638137-8ca3-40db-8068-46e2bd256380 from this chassis (sb_readonly=0)
Jan 31 03:05:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:22Z|00196|binding|INFO|Setting lport a9638137-8ca3-40db-8068-46e2bd256380 down in Southbound
Jan 31 03:05:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:22Z|00197|binding|INFO|Removing iface tapa9638137-8c ovn-installed in OVS
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.420 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Jan 31 03:05:22 np0005603622 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003d.scope: Consumed 15.506s CPU time.
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.456 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:7a:b9 10.100.0.10'], port_security=['fa:16:3e:ad:7a:b9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8653e14e-eb16-4c07-bc3c-ff1133b80425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '68e717e2-a49f-445d-8e1a-284e91505fef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=a9638137-8ca3-40db-8068-46e2bd256380) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:05:22 np0005603622 systemd-machined[190406]: Machine qemu-25-instance-0000003d terminated.
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.458 139793 INFO neutron.agent.ovn.metadata.agent [-] Port a9638137-8ca3-40db-8068-46e2bd256380 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.461 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455fab34-b015-4d97-a96d-f7ebd7f7555f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.462 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[97c051e2-4772-462f-a508-4dc2ef76c8a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.463 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace which is not needed anymore#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.540 221324 INFO nova.virt.libvirt.driver [-] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Instance destroyed successfully.#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.541 221324 DEBUG nova.objects.instance [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'resources' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.565 221324 DEBUG nova.virt.libvirt.vif [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:04:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-859264707',display_name='tempest-AttachInterfacesTestJSON-server-859264707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-859264707',id=61,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNwaidda/kIMdfmj3ClJdKVNSaX8NiSXLyTF3bZESgxLGEnWbCwUHHLCO81KDKzFUuwfz3xMcIaGhc0g84sO5H2E7f2H/NV/gV+olSevjLt9DAODwWz62kmDzVvkZGYuFQ==',key_name='tempest-keypair-1293449320',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-0asmw4mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:04:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=8653e14e-eb16-4c07-bc3c-ff1133b80425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.566 221324 DEBUG nova.network.os_vif_util [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.566 221324 DEBUG nova.network.os_vif_util [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:7a:b9,bridge_name='br-int',has_traffic_filtering=True,id=a9638137-8ca3-40db-8068-46e2bd256380,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9638137-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.566 221324 DEBUG os_vif [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:7a:b9,bridge_name='br-int',has_traffic_filtering=True,id=a9638137-8ca3-40db-8068-46e2bd256380,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9638137-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.568 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9638137-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.571 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.573 221324 INFO os_vif [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:7a:b9,bridge_name='br-int',has_traffic_filtering=True,id=a9638137-8ca3-40db-8068-46e2bd256380,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9638137-8c')#033[00m
Jan 31 03:05:22 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[243283]: [NOTICE]   (243287) : haproxy version is 2.8.14-c23fe91
Jan 31 03:05:22 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[243283]: [NOTICE]   (243287) : path to executable is /usr/sbin/haproxy
Jan 31 03:05:22 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[243283]: [WARNING]  (243287) : Exiting Master process...
Jan 31 03:05:22 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[243283]: [ALERT]    (243287) : Current worker (243289) exited with code 143 (Terminated)
Jan 31 03:05:22 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[243283]: [WARNING]  (243287) : All workers exited. Exiting... (0)
Jan 31 03:05:22 np0005603622 systemd[1]: libpod-31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5.scope: Deactivated successfully.
Jan 31 03:05:22 np0005603622 podman[243637]: 2026-01-31 08:05:22.624210467 +0000 UTC m=+0.059263790 container died 31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:05:22 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5-userdata-shm.mount: Deactivated successfully.
Jan 31 03:05:22 np0005603622 systemd[1]: var-lib-containers-storage-overlay-af1f110a156dbffb72e14c8ff53936f91595b6e44bf50f737a0aeb140d8db8db-merged.mount: Deactivated successfully.
Jan 31 03:05:22 np0005603622 podman[243637]: 2026-01-31 08:05:22.682160082 +0000 UTC m=+0.117213435 container cleanup 31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:05:22 np0005603622 systemd[1]: libpod-conmon-31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5.scope: Deactivated successfully.
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.711 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 podman[243686]: 2026-01-31 08:05:22.75255924 +0000 UTC m=+0.049631802 container remove 31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.757 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[03542250-3271-4405-8d78-dae847e7e884]: (4, ('Sat Jan 31 08:05:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5)\n31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5\nSat Jan 31 08:05:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5)\n31ebf4307d7451346914b9f8e4de37210cbd1ab014cbae992bf2566a376411f5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.760 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[337e38a0-474e-4670-a460-b76663f24c53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.761 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.764 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 kernel: tap455fab34-b0: left promiscuous mode
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.766 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 nova_compute[221301]: 2026-01-31 08:05:22.772 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.773 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0849131c-dda2-40b0-b692-045f5cfaf32c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.796 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9fb31b-ef02-4c96-84a6-d139561eada0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.797 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee81147-496b-4517-bb01-2581c379902f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.812 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[53387aca-7551-4e36-a598-6a6b8236519f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593043, 'reachable_time': 30816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243700, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.815 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:05:22 np0005603622 systemd[1]: run-netns-ovnmeta\x2d455fab34\x2db015\x2d4d97\x2da96d\x2df7ebd7f7555f.mount: Deactivated successfully.
Jan 31 03:05:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:22.816 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[3f1465b5-4c27-4144-9340-f5bddfe4af6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:05:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:23.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.627 221324 INFO nova.virt.libvirt.driver [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Deleting instance files /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425_del#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.628 221324 INFO nova.virt.libvirt.driver [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Deletion of /var/lib/nova/instances/8653e14e-eb16-4c07-bc3c-ff1133b80425_del complete#033[00m
Jan 31 03:05:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e238 e238: 3 total, 3 up, 3 in
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.724 221324 INFO nova.compute.manager [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Took 1.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.725 221324 DEBUG oslo.service.loopingcall [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.725 221324 DEBUG nova.compute.manager [-] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.725 221324 DEBUG nova.network.neutron [-] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.742 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.935 221324 DEBUG nova.compute.manager [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-unplugged-a9638137-8ca3-40db-8068-46e2bd256380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.935 221324 DEBUG oslo_concurrency.lockutils [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.936 221324 DEBUG oslo_concurrency.lockutils [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.936 221324 DEBUG oslo_concurrency.lockutils [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.936 221324 DEBUG nova.compute.manager [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] No waiting events found dispatching network-vif-unplugged-a9638137-8ca3-40db-8068-46e2bd256380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.936 221324 DEBUG nova.compute.manager [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-unplugged-a9638137-8ca3-40db-8068-46e2bd256380 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.937 221324 DEBUG nova.compute.manager [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.937 221324 DEBUG oslo_concurrency.lockutils [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.937 221324 DEBUG oslo_concurrency.lockutils [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.937 221324 DEBUG oslo_concurrency.lockutils [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.938 221324 DEBUG nova.compute.manager [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] No waiting events found dispatching network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:05:23 np0005603622 nova_compute[221301]: 2026-01-31 08:05:23.938 221324 WARNING nova.compute.manager [req-1dd704d4-d200-4f50-8b5e-238a24802308 req-9f931568-f1a3-45c9-823a-3ea45911c503 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received unexpected event network-vif-plugged-a9638137-8ca3-40db-8068-46e2bd256380 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:05:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:05:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:24.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:05:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:25.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:25 np0005603622 nova_compute[221301]: 2026-01-31 08:05:25.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:26.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:26 np0005603622 nova_compute[221301]: 2026-01-31 08:05:26.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:26 np0005603622 nova_compute[221301]: 2026-01-31 08:05:26.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:26 np0005603622 nova_compute[221301]: 2026-01-31 08:05:26.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:26 np0005603622 nova_compute[221301]: 2026-01-31 08:05:26.745 221324 INFO nova.network.neutron [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Port e23aacbf-a775-4d28-941e-3359aa4807a3 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 31 03:05:26 np0005603622 nova_compute[221301]: 2026-01-31 08:05:26.746 221324 DEBUG nova.network.neutron [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [{"id": "a9638137-8ca3-40db-8068-46e2bd256380", "address": "fa:16:3e:ad:7a:b9", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9638137-8c", "ovs_interfaceid": "a9638137-8ca3-40db-8068-46e2bd256380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:26 np0005603622 nova_compute[221301]: 2026-01-31 08:05:26.776 221324 DEBUG oslo_concurrency.lockutils [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:05:26 np0005603622 nova_compute[221301]: 2026-01-31 08:05:26.819 221324 DEBUG oslo_concurrency.lockutils [None req-ccc52bb1-fd60-414d-8d4e-9103fb7fc619 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-8653e14e-eb16-4c07-bc3c-ff1133b80425-e23aacbf-a775-4d28-941e-3359aa4807a3" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 7.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:27.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e239 e239: 3 total, 3 up, 3 in
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.399 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.571 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.706 221324 DEBUG nova.network.neutron [-] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.742 221324 INFO nova.compute.manager [-] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Took 4.02 seconds to deallocate network for instance.#033[00m
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.909 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.909 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:27 np0005603622 nova_compute[221301]: 2026-01-31 08:05:27.978 221324 DEBUG nova.scheduler.client.report [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.007 221324 DEBUG nova.compute.manager [req-a008258b-f489-447a-9022-ba081306f298 req-2b641f08-4cc6-4d9e-bfb0-bdcf8333f19a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Received event network-vif-deleted-a9638137-8ca3-40db-8068-46e2bd256380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.029 221324 DEBUG nova.scheduler.client.report [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.029 221324 DEBUG nova.compute.provider_tree [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:05:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:28.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.230 221324 DEBUG nova.scheduler.client.report [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:05:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e240 e240: 3 total, 3 up, 3 in
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.260 221324 DEBUG nova.scheduler.client.report [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.324 221324 DEBUG oslo_concurrency.processutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.574 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2898377566' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.796 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.814 221324 DEBUG oslo_concurrency.processutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:28 np0005603622 nova_compute[221301]: 2026-01-31 08:05:28.823 221324 DEBUG nova.compute.provider_tree [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:05:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:29.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e241 e241: 3 total, 3 up, 3 in
Jan 31 03:05:29 np0005603622 nova_compute[221301]: 2026-01-31 08:05:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:29 np0005603622 nova_compute[221301]: 2026-01-31 08:05:29.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:05:29 np0005603622 nova_compute[221301]: 2026-01-31 08:05:29.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.015 221324 DEBUG nova.scheduler.client.report [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.076 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.140 221324 INFO nova.scheduler.client.report [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Deleted allocations for instance 8653e14e-eb16-4c07-bc3c-ff1133b80425#033[00m
Jan 31 03:05:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:30.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.336 221324 DEBUG oslo_concurrency.lockutils [None req-82ad22c1-17f9-4319-8ffa-e4774ffbd45f 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "8653e14e-eb16-4c07-bc3c-ff1133b80425" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.512 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.512 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.513 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.513 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8653e14e-eb16-4c07-bc3c-ff1133b80425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.597 221324 DEBUG nova.compute.utils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010#033[00m
Jan 31 03:05:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:30.859 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:30.860 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:30.860 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:30 np0005603622 nova_compute[221301]: 2026-01-31 08:05:30.966 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:05:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:31.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:32.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:32 np0005603622 nova_compute[221301]: 2026-01-31 08:05:32.574 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.015 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.117 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-8653e14e-eb16-4c07-bc3c-ff1133b80425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.118 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.118 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.162 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.164 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.165 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.165 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.166 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:33.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2667582567' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.609 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.798 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.814 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.815 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4675MB free_disk=20.942794799804688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.816 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.816 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.846 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.846 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:33 np0005603622 nova_compute[221301]: 2026-01-31 08:05:33.890 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.014 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance e631901a-0850-43e0-baa6-ad6051bab98f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.015 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.015 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.078 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.096 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:34.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2176737050' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.544 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.551 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.611 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.734 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.735 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.919s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.736 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.773 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:05:34 np0005603622 nova_compute[221301]: 2026-01-31 08:05:34.774 221324 INFO nova.compute.claims [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:05:35 np0005603622 nova_compute[221301]: 2026-01-31 08:05:35.118 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:35.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2612909929' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:35 np0005603622 nova_compute[221301]: 2026-01-31 08:05:35.582 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:35 np0005603622 nova_compute[221301]: 2026-01-31 08:05:35.591 221324 DEBUG nova.compute.provider_tree [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:05:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:36.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.201 221324 DEBUG nova.scheduler.client.report [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.278 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.280 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.378 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.379 221324 DEBUG nova.network.neutron [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.425 221324 INFO nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.468 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.619 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.621 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.622 221324 INFO nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Creating image(s)#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.665 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.706 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.739 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.744 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.776 221324 DEBUG nova.policy [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16d731f5875748ca9b8036b2ba061042', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3469c253459e40e39dcf5bcb6a32008f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.803 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.804 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.805 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.805 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.834 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:36 np0005603622 nova_compute[221301]: 2026-01-31 08:05:36.839 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e631901a-0850-43e0-baa6-ad6051bab98f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.154 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e631901a-0850-43e0-baa6-ad6051bab98f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.316s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:37.211 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:05:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:37.212 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:05:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.258 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] resizing rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.435 221324 DEBUG nova.objects.instance [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'migration_context' on Instance uuid e631901a-0850-43e0-baa6-ad6051bab98f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.458 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.458 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Ensure instance console log exists: /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.459 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.459 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.460 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.537 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846722.536439, 8653e14e-eb16-4c07-bc3c-ff1133b80425 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.538 221324 INFO nova.compute.manager [-] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.577 221324 DEBUG nova.compute.manager [None req-7a6f948e-1415-4059-8e88-a09faeba75a1 - - - - - -] [instance: 8653e14e-eb16-4c07-bc3c-ff1133b80425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:05:37 np0005603622 nova_compute[221301]: 2026-01-31 08:05:37.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e242 e242: 3 total, 3 up, 3 in
Jan 31 03:05:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:38.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:38.215 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:38 np0005603622 nova_compute[221301]: 2026-01-31 08:05:38.331 221324 DEBUG nova.network.neutron [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Successfully created port: 971ab75d-0059-498c-acdc-abc169e8d990 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:05:38 np0005603622 nova_compute[221301]: 2026-01-31 08:05:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:05:38 np0005603622 nova_compute[221301]: 2026-01-31 08:05:38.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:05:38 np0005603622 nova_compute[221301]: 2026-01-31 08:05:38.447 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:05:38 np0005603622 nova_compute[221301]: 2026-01-31 08:05:38.801 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:39.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:40.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:40 np0005603622 nova_compute[221301]: 2026-01-31 08:05:40.475 221324 DEBUG nova.network.neutron [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Successfully updated port: 971ab75d-0059-498c-acdc-abc169e8d990 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:05:40 np0005603622 nova_compute[221301]: 2026-01-31 08:05:40.539 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "refresh_cache-e631901a-0850-43e0-baa6-ad6051bab98f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:05:40 np0005603622 nova_compute[221301]: 2026-01-31 08:05:40.539 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquired lock "refresh_cache-e631901a-0850-43e0-baa6-ad6051bab98f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:05:40 np0005603622 nova_compute[221301]: 2026-01-31 08:05:40.540 221324 DEBUG nova.network.neutron [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:05:40 np0005603622 nova_compute[221301]: 2026-01-31 08:05:40.695 221324 DEBUG nova.compute.manager [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received event network-changed-971ab75d-0059-498c-acdc-abc169e8d990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:40 np0005603622 nova_compute[221301]: 2026-01-31 08:05:40.696 221324 DEBUG nova.compute.manager [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Refreshing instance network info cache due to event network-changed-971ab75d-0059-498c-acdc-abc169e8d990. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:05:40 np0005603622 nova_compute[221301]: 2026-01-31 08:05:40.696 221324 DEBUG oslo_concurrency.lockutils [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e631901a-0850-43e0-baa6-ad6051bab98f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:05:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:41.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:42.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:42 np0005603622 nova_compute[221301]: 2026-01-31 08:05:42.582 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:43.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:43 np0005603622 nova_compute[221301]: 2026-01-31 08:05:43.734 221324 DEBUG nova.network.neutron [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:05:43 np0005603622 nova_compute[221301]: 2026-01-31 08:05:43.804 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:44.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:45.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:46.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:46 np0005603622 podman[243957]: 2026-01-31 08:05:46.209196117 +0000 UTC m=+0.133112661 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.505 221324 DEBUG nova.network.neutron [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Updating instance_info_cache with network_info: [{"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.560 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Releasing lock "refresh_cache-e631901a-0850-43e0-baa6-ad6051bab98f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.561 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Instance network_info: |[{"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.562 221324 DEBUG oslo_concurrency.lockutils [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e631901a-0850-43e0-baa6-ad6051bab98f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.562 221324 DEBUG nova.network.neutron [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Refreshing network info cache for port 971ab75d-0059-498c-acdc-abc169e8d990 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.566 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Start _get_guest_xml network_info=[{"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.575 221324 WARNING nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.587 221324 DEBUG nova.virt.libvirt.host [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.588 221324 DEBUG nova.virt.libvirt.host [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.593 221324 DEBUG nova.virt.libvirt.host [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.594 221324 DEBUG nova.virt.libvirt.host [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.596 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.597 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.598 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.598 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.599 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.599 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.600 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.600 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.601 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.602 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.602 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.603 221324 DEBUG nova.virt.hardware [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:05:46 np0005603622 nova_compute[221301]: 2026-01-31 08:05:46.608 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:05:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3830499818' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.071 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.114 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.119 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:47.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:05:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2209280229' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.554 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.557 221324 DEBUG nova.virt.libvirt.vif [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:05:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-849803866',display_name='tempest-DeleteServersTestJSON-server-849803866',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-849803866',id=63,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-x5hn0mye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:05:36Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e631901a-0850-43e0-baa6-ad6051bab98f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.558 221324 DEBUG nova.network.os_vif_util [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.559 221324 DEBUG nova.network.os_vif_util [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:cb:f8,bridge_name='br-int',has_traffic_filtering=True,id=971ab75d-0059-498c-acdc-abc169e8d990,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap971ab75d-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.562 221324 DEBUG nova.objects.instance [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'pci_devices' on Instance uuid e631901a-0850-43e0-baa6-ad6051bab98f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:47 np0005603622 nova_compute[221301]: 2026-01-31 08:05:47.584 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.136 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <uuid>e631901a-0850-43e0-baa6-ad6051bab98f</uuid>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <name>instance-0000003f</name>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <nova:name>tempest-DeleteServersTestJSON-server-849803866</nova:name>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:05:46</nova:creationTime>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:user uuid="16d731f5875748ca9b8036b2ba061042">tempest-DeleteServersTestJSON-808715310-project-member</nova:user>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:project uuid="3469c253459e40e39dcf5bcb6a32008f">tempest-DeleteServersTestJSON-808715310</nova:project>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <nova:port uuid="971ab75d-0059-498c-acdc-abc169e8d990">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <entry name="serial">e631901a-0850-43e0-baa6-ad6051bab98f</entry>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <entry name="uuid">e631901a-0850-43e0-baa6-ad6051bab98f</entry>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e631901a-0850-43e0-baa6-ad6051bab98f_disk">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e631901a-0850-43e0-baa6-ad6051bab98f_disk.config">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:fa:cb:f8"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <target dev="tap971ab75d-00"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/console.log" append="off"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:05:48 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:05:48 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:05:48 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:05:48 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.139 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Preparing to wait for external event network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.140 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.140 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.140 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.141 221324 DEBUG nova.virt.libvirt.vif [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:05:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-849803866',display_name='tempest-DeleteServersTestJSON-server-849803866',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-849803866',id=63,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-x5hn0mye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServers
TestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:05:36Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e631901a-0850-43e0-baa6-ad6051bab98f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.142 221324 DEBUG nova.network.os_vif_util [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.143 221324 DEBUG nova.network.os_vif_util [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:cb:f8,bridge_name='br-int',has_traffic_filtering=True,id=971ab75d-0059-498c-acdc-abc169e8d990,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap971ab75d-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.143 221324 DEBUG os_vif [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:cb:f8,bridge_name='br-int',has_traffic_filtering=True,id=971ab75d-0059-498c-acdc-abc169e8d990,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap971ab75d-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.144 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.145 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.145 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.152 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.152 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap971ab75d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.153 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap971ab75d-00, col_values=(('external_ids', {'iface-id': '971ab75d-0059-498c-acdc-abc169e8d990', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:cb:f8', 'vm-uuid': 'e631901a-0850-43e0-baa6-ad6051bab98f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:48 np0005603622 NetworkManager[49080]: <info>  [1769846748.1567] manager: (tap971ab75d-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.159 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.162 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.165 221324 INFO os_vif [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:cb:f8,bridge_name='br-int',has_traffic_filtering=True,id=971ab75d-0059-498c-acdc-abc169e8d990,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap971ab75d-00')#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.182 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:48.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.295 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.296 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.296 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No VIF found with MAC fa:16:3e:fa:cb:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.297 221324 INFO nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Using config drive#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.340 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:48 np0005603622 nova_compute[221301]: 2026-01-31 08:05:48.808 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:49.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.238 221324 INFO nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Creating config drive at /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/disk.config#033[00m
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.244 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpg5zc_xx6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.382 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpg5zc_xx6" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.417 221324 DEBUG nova.storage.rbd_utils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e631901a-0850-43e0-baa6-ad6051bab98f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.424 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/disk.config e631901a-0850-43e0-baa6-ad6051bab98f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.746 221324 DEBUG oslo_concurrency.processutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/disk.config e631901a-0850-43e0-baa6-ad6051bab98f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.322s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.747 221324 INFO nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Deleting local config drive /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f/disk.config because it was imported into RBD.#033[00m
Jan 31 03:05:49 np0005603622 kernel: tap971ab75d-00: entered promiscuous mode
Jan 31 03:05:49 np0005603622 NetworkManager[49080]: <info>  [1769846749.8105] manager: (tap971ab75d-00): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Jan 31 03:05:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:49Z|00198|binding|INFO|Claiming lport 971ab75d-0059-498c-acdc-abc169e8d990 for this chassis.
Jan 31 03:05:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:49Z|00199|binding|INFO|971ab75d-0059-498c-acdc-abc169e8d990: Claiming fa:16:3e:fa:cb:f8 10.100.0.4
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:49Z|00200|binding|INFO|Setting lport 971ab75d-0059-498c-acdc-abc169e8d990 ovn-installed in OVS
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.859 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:49 np0005603622 nova_compute[221301]: 2026-01-31 08:05:49.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:49Z|00201|binding|INFO|Setting lport 971ab75d-0059-498c-acdc-abc169e8d990 up in Southbound
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.881 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:cb:f8 10.100.0.4'], port_security=['fa:16:3e:fa:cb:f8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e631901a-0850-43e0-baa6-ad6051bab98f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=971ab75d-0059-498c-acdc-abc169e8d990) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.884 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 971ab75d-0059-498c-acdc-abc169e8d990 in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 bound to our chassis#033[00m
Jan 31 03:05:49 np0005603622 systemd-udevd[244124]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.887 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6#033[00m
Jan 31 03:05:49 np0005603622 systemd-machined[190406]: New machine qemu-26-instance-0000003f.
Jan 31 03:05:49 np0005603622 systemd[1]: Started Virtual Machine qemu-26-instance-0000003f.
Jan 31 03:05:49 np0005603622 NetworkManager[49080]: <info>  [1769846749.9031] device (tap971ab75d-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.902 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9308538a-2188-41b5-aee9-71bf2b867f63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:49 np0005603622 NetworkManager[49080]: <info>  [1769846749.9040] device (tap971ab75d-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.905 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1c6810e-e1 in ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.908 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1c6810e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.908 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[86d237b0-1f0c-4e55-aae6-83658100011a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.909 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[68ed8699-91d2-42a1-92ea-60b03cae5bd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.923 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ad0391-63f7-4cf5-ba48-8ee64ba528fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.940 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[94b01562-eb6a-4603-9134-0f9acfdc1228]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:49 np0005603622 podman[244116]: 2026-01-31 08:05:49.976747104 +0000 UTC m=+0.108344904 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.978 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[94660ff5-bc20-4ffe-8d6e-59b70345be10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:49 np0005603622 NetworkManager[49080]: <info>  [1769846749.9867] manager: (tapc1c6810e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Jan 31 03:05:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:49.987 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc37032-234f-43bd-821a-0d9aaf84ee2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.019 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3adbea-ca8c-4fce-8352-d330f4a84c5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.022 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[abd335c2-66e2-414f-9a2f-667a49bf8ab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 NetworkManager[49080]: <info>  [1769846750.0423] device (tapc1c6810e-e0): carrier: link connected
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.047 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fea378f5-912a-45b8-b229-7d581bcf5c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.063 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d555af39-0f18-448a-973f-38ff1cf82aaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600138, 'reachable_time': 39757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244214, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.081 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c76edc43-8f49-40e9-a505-8131e7da6864]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:9781'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600138, 'tstamp': 600138}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244220, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.096 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[21b81515-5d08-4bf4-82d6-3f325afd4232]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600138, 'reachable_time': 39757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244223, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.127 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cf039429-b2c5-4c71-88c1-9e733761ebc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.173 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[11562970-2232-446a-8b4e-bdf1094e0cfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.174 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.174 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.175 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1c6810e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:50 np0005603622 kernel: tapc1c6810e-e0: entered promiscuous mode
Jan 31 03:05:50 np0005603622 NetworkManager[49080]: <info>  [1769846750.1775] manager: (tapc1c6810e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.176 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.179 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1c6810e-e0, col_values=(('external_ids', {'iface-id': '937542c1-ab1e-4312-ab3a-ee4483fcdf7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:50Z|00202|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.180 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.182 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.184 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3458c8-cd91-4497-bfd1-8210763b796a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.186 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:50.187 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'env', 'PROCESS_TAG=haproxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:05:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:50.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:50 np0005603622 podman[244320]: 2026-01-31 08:05:50.544579465 +0000 UTC m=+0.055688578 container create c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:05:50 np0005603622 systemd[1]: Started libpod-conmon-c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e.scope.
Jan 31 03:05:50 np0005603622 podman[244320]: 2026-01-31 08:05:50.511532787 +0000 UTC m=+0.022641960 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.604 221324 DEBUG nova.compute.manager [req-d9b5f482-f3a9-4133-a158-f24a4117639a req-9650ba47-98e1-456b-8421-805d4cbdd88f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received event network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.605 221324 DEBUG oslo_concurrency.lockutils [req-d9b5f482-f3a9-4133-a158-f24a4117639a req-9650ba47-98e1-456b-8421-805d4cbdd88f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.605 221324 DEBUG oslo_concurrency.lockutils [req-d9b5f482-f3a9-4133-a158-f24a4117639a req-9650ba47-98e1-456b-8421-805d4cbdd88f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.606 221324 DEBUG oslo_concurrency.lockutils [req-d9b5f482-f3a9-4133-a158-f24a4117639a req-9650ba47-98e1-456b-8421-805d4cbdd88f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.606 221324 DEBUG nova.compute.manager [req-d9b5f482-f3a9-4133-a158-f24a4117639a req-9650ba47-98e1-456b-8421-805d4cbdd88f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Processing event network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:05:50 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:05:50 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cb1529091bb05c52669a4aeaf8bcd474412db2de60a7b119603b1a65f5da513/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:05:50 np0005603622 podman[244320]: 2026-01-31 08:05:50.642825567 +0000 UTC m=+0.153934700 container init c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:05:50 np0005603622 podman[244320]: 2026-01-31 08:05:50.648751137 +0000 UTC m=+0.159860230 container start c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:05:50 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [NOTICE]   (244355) : New worker (244357) forked
Jan 31 03:05:50 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [NOTICE]   (244355) : Loading success.
Jan 31 03:05:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.985 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.987 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846750.984171, e631901a-0850-43e0-baa6-ad6051bab98f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.987 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] VM Started (Lifecycle Event)#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.993 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.997 221324 INFO nova.virt.libvirt.driver [-] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Instance spawned successfully.#033[00m
Jan 31 03:05:50 np0005603622 nova_compute[221301]: 2026-01-31 08:05:50.997 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.001 221324 DEBUG nova.network.neutron [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Updated VIF entry in instance network info cache for port 971ab75d-0059-498c-acdc-abc169e8d990. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.002 221324 DEBUG nova.network.neutron [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Updating instance_info_cache with network_info: [{"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.031 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.035 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.054 221324 DEBUG oslo_concurrency.lockutils [req-5130dd10-2a28-49ef-891e-2062f3c76566 req-4b99a86e-1d72-4c4f-9c2a-ed9cb67cb4db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e631901a-0850-43e0-baa6-ad6051bab98f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.060 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.061 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.062 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.063 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.064 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.065 221324 DEBUG nova.virt.libvirt.driver [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.072 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.073 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846750.9895175, e631901a-0850-43e0-baa6-ad6051bab98f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.074 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.103 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.109 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846750.992314, e631901a-0850-43e0-baa6-ad6051bab98f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.109 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.141 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.145 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.157 221324 INFO nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Took 14.54 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.158 221324 DEBUG nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.191 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:05:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:51.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.266 221324 INFO nova.compute.manager [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Took 17.25 seconds to build instance.#033[00m
Jan 31 03:05:51 np0005603622 nova_compute[221301]: 2026-01-31 08:05:51.319 221324 DEBUG oslo_concurrency.lockutils [None req-32f97132-1b02-4a70-b626-0b74ffa26304 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:05:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:05:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:05:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:52.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:52 np0005603622 nova_compute[221301]: 2026-01-31 08:05:52.857 221324 DEBUG nova.compute.manager [req-7fe291c7-a013-4d25-963f-a57b4da14832 req-73126ba8-6010-4d1c-8918-8d36eb0ecb90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received event network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:52 np0005603622 nova_compute[221301]: 2026-01-31 08:05:52.857 221324 DEBUG oslo_concurrency.lockutils [req-7fe291c7-a013-4d25-963f-a57b4da14832 req-73126ba8-6010-4d1c-8918-8d36eb0ecb90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:52 np0005603622 nova_compute[221301]: 2026-01-31 08:05:52.857 221324 DEBUG oslo_concurrency.lockutils [req-7fe291c7-a013-4d25-963f-a57b4da14832 req-73126ba8-6010-4d1c-8918-8d36eb0ecb90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:52 np0005603622 nova_compute[221301]: 2026-01-31 08:05:52.857 221324 DEBUG oslo_concurrency.lockutils [req-7fe291c7-a013-4d25-963f-a57b4da14832 req-73126ba8-6010-4d1c-8918-8d36eb0ecb90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:52 np0005603622 nova_compute[221301]: 2026-01-31 08:05:52.858 221324 DEBUG nova.compute.manager [req-7fe291c7-a013-4d25-963f-a57b4da14832 req-73126ba8-6010-4d1c-8918-8d36eb0ecb90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] No waiting events found dispatching network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:05:52 np0005603622 nova_compute[221301]: 2026-01-31 08:05:52.858 221324 WARNING nova.compute.manager [req-7fe291c7-a013-4d25-963f-a57b4da14832 req-73126ba8-6010-4d1c-8918-8d36eb0ecb90 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received unexpected event network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.155 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.177 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.178 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.179 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.179 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.179 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.181 221324 INFO nova.compute.manager [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Terminating instance#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.182 221324 DEBUG nova.compute.manager [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:05:53 np0005603622 kernel: tap971ab75d-00 (unregistering): left promiscuous mode
Jan 31 03:05:53 np0005603622 NetworkManager[49080]: <info>  [1769846753.2204] device (tap971ab75d-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:05:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:53Z|00203|binding|INFO|Releasing lport 971ab75d-0059-498c-acdc-abc169e8d990 from this chassis (sb_readonly=0)
Jan 31 03:05:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:53Z|00204|binding|INFO|Setting lport 971ab75d-0059-498c-acdc-abc169e8d990 down in Southbound
Jan 31 03:05:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:05:53Z|00205|binding|INFO|Removing iface tap971ab75d-00 ovn-installed in OVS
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.232 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:53.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.241 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:cb:f8 10.100.0.4'], port_security=['fa:16:3e:fa:cb:f8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e631901a-0850-43e0-baa6-ad6051bab98f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=971ab75d-0059-498c-acdc-abc169e8d990) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.244 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 971ab75d-0059-498c-acdc-abc169e8d990 in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 unbound from our chassis#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.246 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.248 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.248 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca7d8a94-6147-48d8-801d-6ec067fd2e80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.249 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace which is not needed anymore#033[00m
Jan 31 03:05:53 np0005603622 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Jan 31 03:05:53 np0005603622 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Consumed 3.304s CPU time.
Jan 31 03:05:53 np0005603622 systemd-machined[190406]: Machine qemu-26-instance-0000003f terminated.
Jan 31 03:05:53 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [NOTICE]   (244355) : haproxy version is 2.8.14-c23fe91
Jan 31 03:05:53 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [NOTICE]   (244355) : path to executable is /usr/sbin/haproxy
Jan 31 03:05:53 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [WARNING]  (244355) : Exiting Master process...
Jan 31 03:05:53 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [WARNING]  (244355) : Exiting Master process...
Jan 31 03:05:53 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [ALERT]    (244355) : Current worker (244357) exited with code 143 (Terminated)
Jan 31 03:05:53 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244351]: [WARNING]  (244355) : All workers exited. Exiting... (0)
Jan 31 03:05:53 np0005603622 systemd[1]: libpod-c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e.scope: Deactivated successfully.
Jan 31 03:05:53 np0005603622 podman[244432]: 2026-01-31 08:05:53.375254933 +0000 UTC m=+0.042307359 container died c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:05:53 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e-userdata-shm.mount: Deactivated successfully.
Jan 31 03:05:53 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0cb1529091bb05c52669a4aeaf8bcd474412db2de60a7b119603b1a65f5da513-merged.mount: Deactivated successfully.
Jan 31 03:05:53 np0005603622 podman[244432]: 2026-01-31 08:05:53.420686855 +0000 UTC m=+0.087739251 container cleanup c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.419 221324 INFO nova.virt.libvirt.driver [-] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Instance destroyed successfully.#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.422 221324 DEBUG nova.objects.instance [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'resources' on Instance uuid e631901a-0850-43e0-baa6-ad6051bab98f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:05:53 np0005603622 systemd[1]: libpod-conmon-c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e.scope: Deactivated successfully.
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.450 221324 DEBUG nova.virt.libvirt.vif [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:05:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-849803866',display_name='tempest-DeleteServersTestJSON-server-849803866',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-849803866',id=63,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:05:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-x5hn0mye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:05:51Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e631901a-0850-43e0-baa6-ad6051bab98f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.452 221324 DEBUG nova.network.os_vif_util [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "971ab75d-0059-498c-acdc-abc169e8d990", "address": "fa:16:3e:fa:cb:f8", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap971ab75d-00", "ovs_interfaceid": "971ab75d-0059-498c-acdc-abc169e8d990", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.453 221324 DEBUG nova.network.os_vif_util [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:cb:f8,bridge_name='br-int',has_traffic_filtering=True,id=971ab75d-0059-498c-acdc-abc169e8d990,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap971ab75d-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.453 221324 DEBUG os_vif [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:cb:f8,bridge_name='br-int',has_traffic_filtering=True,id=971ab75d-0059-498c-acdc-abc169e8d990,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap971ab75d-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.458 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.458 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap971ab75d-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.465 221324 INFO os_vif [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:cb:f8,bridge_name='br-int',has_traffic_filtering=True,id=971ab75d-0059-498c-acdc-abc169e8d990,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap971ab75d-00')#033[00m
Jan 31 03:05:53 np0005603622 podman[244471]: 2026-01-31 08:05:53.509158744 +0000 UTC m=+0.052175294 container remove c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.515 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[456d2241-6357-474b-90e3-a7a6752b464a]: (4, ('Sat Jan 31 08:05:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e)\nc1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e\nSat Jan 31 08:05:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (c1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e)\nc1429ca8c6aed698149cd42ed448f4d5b163573a85f2b8ce02cb4b830d1f610e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.516 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d08dbc01-e36a-460f-9f17-45dda234da90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.517 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 kernel: tapc1c6810e-e0: left promiscuous mode
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.524 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.528 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e707907b-5946-4839-acf7-c79598877e77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.549 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fc984f2d-88f6-47cc-8ea9-d96366915510]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.551 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d948d8f4-d6b2-46d6-a348-f4c74d65eb7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.563 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bc15febc-2597-48ed-acb4-6bb222670ec5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600131, 'reachable_time': 15335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244504, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.567 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:05:53 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc1c6810e\x2dec8f\x2d43f3\x2da3c6\x2d22606d9416b6.mount: Deactivated successfully.
Jan 31 03:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:05:53.567 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[0571515e-c67e-4885-b81b-8113e48bfbf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:05:53 np0005603622 nova_compute[221301]: 2026-01-31 08:05:53.839 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:54 np0005603622 nova_compute[221301]: 2026-01-31 08:05:54.004 221324 INFO nova.virt.libvirt.driver [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Deleting instance files /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f_del#033[00m
Jan 31 03:05:54 np0005603622 nova_compute[221301]: 2026-01-31 08:05:54.005 221324 INFO nova.virt.libvirt.driver [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Deletion of /var/lib/nova/instances/e631901a-0850-43e0-baa6-ad6051bab98f_del complete#033[00m
Jan 31 03:05:54 np0005603622 nova_compute[221301]: 2026-01-31 08:05:54.079 221324 INFO nova.compute.manager [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Took 0.90 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:05:54 np0005603622 nova_compute[221301]: 2026-01-31 08:05:54.080 221324 DEBUG oslo.service.loopingcall [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:05:54 np0005603622 nova_compute[221301]: 2026-01-31 08:05:54.080 221324 DEBUG nova.compute.manager [-] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:05:54 np0005603622 nova_compute[221301]: 2026-01-31 08:05:54.081 221324 DEBUG nova.network.neutron [-] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:05:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:54.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.059 221324 DEBUG nova.compute.manager [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received event network-vif-unplugged-971ab75d-0059-498c-acdc-abc169e8d990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.059 221324 DEBUG oslo_concurrency.lockutils [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.060 221324 DEBUG oslo_concurrency.lockutils [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.060 221324 DEBUG oslo_concurrency.lockutils [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.060 221324 DEBUG nova.compute.manager [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] No waiting events found dispatching network-vif-unplugged-971ab75d-0059-498c-acdc-abc169e8d990 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.060 221324 DEBUG nova.compute.manager [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received event network-vif-unplugged-971ab75d-0059-498c-acdc-abc169e8d990 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.060 221324 DEBUG nova.compute.manager [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received event network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.060 221324 DEBUG oslo_concurrency.lockutils [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.061 221324 DEBUG oslo_concurrency.lockutils [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.061 221324 DEBUG oslo_concurrency.lockutils [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.061 221324 DEBUG nova.compute.manager [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] No waiting events found dispatching network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:05:55 np0005603622 nova_compute[221301]: 2026-01-31 08:05:55.061 221324 WARNING nova.compute.manager [req-2b12d51c-861d-4cf2-b63b-4e39bb20dfb6 req-08f91b35-14fe-44ee-b2d5-fb937ffcc9ca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received unexpected event network-vif-plugged-971ab75d-0059-498c-acdc-abc169e8d990 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:05:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.118 221324 DEBUG nova.network.neutron [-] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.151 221324 INFO nova.compute.manager [-] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Took 2.07 seconds to deallocate network for instance.#033[00m
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.217 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.218 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:05:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:05:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:56.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.316 221324 DEBUG oslo_concurrency.processutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:05:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:05:56 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3409571477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.823 221324 DEBUG oslo_concurrency.processutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.830 221324 DEBUG nova.compute.provider_tree [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.869 221324 DEBUG nova.scheduler.client.report [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:05:56 np0005603622 nova_compute[221301]: 2026-01-31 08:05:56.949 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:57 np0005603622 nova_compute[221301]: 2026-01-31 08:05:57.135 221324 INFO nova.scheduler.client.report [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Deleted allocations for instance e631901a-0850-43e0-baa6-ad6051bab98f#033[00m
Jan 31 03:05:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:05:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:05:57 np0005603622 nova_compute[221301]: 2026-01-31 08:05:57.234 221324 DEBUG nova.compute.manager [req-13f49315-f374-4b89-9932-0ca9592e6adb req-7015aecd-2521-4f22-9f95-c89d4c9e277a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Received event network-vif-deleted-971ab75d-0059-498c-acdc-abc169e8d990 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:05:57 np0005603622 nova_compute[221301]: 2026-01-31 08:05:57.245 221324 DEBUG oslo_concurrency.lockutils [None req-76077520-05f7-44b9-ade4-7959cacdff48 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e631901a-0850-43e0-baa6-ad6051bab98f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:05:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:57.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:05:58.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:05:58 np0005603622 nova_compute[221301]: 2026-01-31 08:05:58.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:58 np0005603622 nova_compute[221301]: 2026-01-31 08:05:58.842 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:05:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:05:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:05:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:05:59.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:00.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:01.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.007 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.008 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.043 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.147 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.148 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.162 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.162 221324 INFO nova.compute.claims [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:06:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:02.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.319 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2158537027' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.794 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.800 221324 DEBUG nova.compute.provider_tree [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.838 221324 DEBUG nova.scheduler.client.report [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.866 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.867 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.928 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.929 221324 DEBUG nova.network.neutron [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:06:02 np0005603622 nova_compute[221301]: 2026-01-31 08:06:02.999 221324 INFO nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.023 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.147 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.149 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.150 221324 INFO nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Creating image(s)#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.187 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.227 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:03.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.267 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.274 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.310 221324 DEBUG nova.policy [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16d731f5875748ca9b8036b2ba061042', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3469c253459e40e39dcf5bcb6a32008f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.347 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.348 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.349 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.350 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.379 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.384 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2d00dd71-0bdd-41cb-b673-26806329b43a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.498 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:03 np0005603622 nova_compute[221301]: 2026-01-31 08:06:03.843 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.051 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 2d00dd71-0bdd-41cb-b673-26806329b43a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.668s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.131 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] resizing rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:06:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:04.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.269 221324 DEBUG nova.objects.instance [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'migration_context' on Instance uuid 2d00dd71-0bdd-41cb-b673-26806329b43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.297 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.297 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Ensure instance console log exists: /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.298 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.298 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.299 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:04 np0005603622 nova_compute[221301]: 2026-01-31 08:06:04.781 221324 DEBUG nova.network.neutron [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Successfully created port: 31ef21c6-de51-41a3-a1b3-186778c38da2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:06:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:05.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:06.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:06 np0005603622 nova_compute[221301]: 2026-01-31 08:06:06.814 221324 DEBUG nova.network.neutron [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Successfully updated port: 31ef21c6-de51-41a3-a1b3-186778c38da2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:06:06 np0005603622 nova_compute[221301]: 2026-01-31 08:06:06.853 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "refresh_cache-2d00dd71-0bdd-41cb-b673-26806329b43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:06 np0005603622 nova_compute[221301]: 2026-01-31 08:06:06.853 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquired lock "refresh_cache-2d00dd71-0bdd-41cb-b673-26806329b43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:06 np0005603622 nova_compute[221301]: 2026-01-31 08:06:06.853 221324 DEBUG nova.network.neutron [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:06:06 np0005603622 nova_compute[221301]: 2026-01-31 08:06:06.967 221324 DEBUG nova.compute.manager [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received event network-changed-31ef21c6-de51-41a3-a1b3-186778c38da2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:06 np0005603622 nova_compute[221301]: 2026-01-31 08:06:06.967 221324 DEBUG nova.compute.manager [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Refreshing instance network info cache due to event network-changed-31ef21c6-de51-41a3-a1b3-186778c38da2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:06:06 np0005603622 nova_compute[221301]: 2026-01-31 08:06:06.967 221324 DEBUG oslo_concurrency.lockutils [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-2d00dd71-0bdd-41cb-b673-26806329b43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:07 np0005603622 nova_compute[221301]: 2026-01-31 08:06:07.134 221324 DEBUG nova.network.neutron [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:06:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:06:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:07.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:06:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:08.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:08 np0005603622 nova_compute[221301]: 2026-01-31 08:06:08.417 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846753.4167504, e631901a-0850-43e0-baa6-ad6051bab98f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:08 np0005603622 nova_compute[221301]: 2026-01-31 08:06:08.418 221324 INFO nova.compute.manager [-] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:06:08 np0005603622 nova_compute[221301]: 2026-01-31 08:06:08.515 221324 DEBUG nova.compute.manager [None req-c7a1f882-478b-4e37-b1c3-ae5bcfc6cb6b - - - - - -] [instance: e631901a-0850-43e0-baa6-ad6051bab98f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:08 np0005603622 nova_compute[221301]: 2026-01-31 08:06:08.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e243 e243: 3 total, 3 up, 3 in
Jan 31 03:06:08 np0005603622 nova_compute[221301]: 2026-01-31 08:06:08.846 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:08 np0005603622 nova_compute[221301]: 2026-01-31 08:06:08.990 221324 DEBUG nova.network.neutron [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Updating instance_info_cache with network_info: [{"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.013 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Releasing lock "refresh_cache-2d00dd71-0bdd-41cb-b673-26806329b43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.014 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Instance network_info: |[{"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.015 221324 DEBUG oslo_concurrency.lockutils [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-2d00dd71-0bdd-41cb-b673-26806329b43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.016 221324 DEBUG nova.network.neutron [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Refreshing network info cache for port 31ef21c6-de51-41a3-a1b3-186778c38da2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.021 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Start _get_guest_xml network_info=[{"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.029 221324 WARNING nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.036 221324 DEBUG nova.virt.libvirt.host [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.037 221324 DEBUG nova.virt.libvirt.host [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.051 221324 DEBUG nova.virt.libvirt.host [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.052 221324 DEBUG nova.virt.libvirt.host [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.054 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.054 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.055 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.056 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.056 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.056 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.057 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.057 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.058 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.058 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.058 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.059 221324 DEBUG nova.virt.hardware [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.064 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:09.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:06:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/743642764' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.494 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.529 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.534 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e244 e244: 3 total, 3 up, 3 in
Jan 31 03:06:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:06:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/254966819' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.975 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.978 221324 DEBUG nova.virt.libvirt.vif [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-418575325',display_name='tempest-DeleteServersTestJSON-server-418575325',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-418575325',id=66,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-0pkr9n57',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:06:03Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=2d00dd71-0bdd-41cb-b673-26806329b43a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.980 221324 DEBUG nova.network.os_vif_util [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.982 221324 DEBUG nova.network.os_vif_util [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:40:16,bridge_name='br-int',has_traffic_filtering=True,id=31ef21c6-de51-41a3-a1b3-186778c38da2,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ef21c6-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:09 np0005603622 nova_compute[221301]: 2026-01-31 08:06:09.984 221324 DEBUG nova.objects.instance [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'pci_devices' on Instance uuid 2d00dd71-0bdd-41cb-b673-26806329b43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.009 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <uuid>2d00dd71-0bdd-41cb-b673-26806329b43a</uuid>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <name>instance-00000042</name>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <nova:name>tempest-DeleteServersTestJSON-server-418575325</nova:name>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:06:09</nova:creationTime>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:user uuid="16d731f5875748ca9b8036b2ba061042">tempest-DeleteServersTestJSON-808715310-project-member</nova:user>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:project uuid="3469c253459e40e39dcf5bcb6a32008f">tempest-DeleteServersTestJSON-808715310</nova:project>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <nova:port uuid="31ef21c6-de51-41a3-a1b3-186778c38da2">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <entry name="serial">2d00dd71-0bdd-41cb-b673-26806329b43a</entry>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <entry name="uuid">2d00dd71-0bdd-41cb-b673-26806329b43a</entry>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/2d00dd71-0bdd-41cb-b673-26806329b43a_disk">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/2d00dd71-0bdd-41cb-b673-26806329b43a_disk.config">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:97:40:16"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <target dev="tap31ef21c6-de"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/console.log" append="off"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:06:10 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:06:10 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:06:10 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:06:10 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.011 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Preparing to wait for external event network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.012 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.012 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.012 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.013 221324 DEBUG nova.virt.libvirt.vif [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-418575325',display_name='tempest-DeleteServersTestJSON-server-418575325',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-418575325',id=66,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-0pkr9n57',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:06:03Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=2d00dd71-0bdd-41cb-b673-26806329b43a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.013 221324 DEBUG nova.network.os_vif_util [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.014 221324 DEBUG nova.network.os_vif_util [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:40:16,bridge_name='br-int',has_traffic_filtering=True,id=31ef21c6-de51-41a3-a1b3-186778c38da2,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ef21c6-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.014 221324 DEBUG os_vif [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:40:16,bridge_name='br-int',has_traffic_filtering=True,id=31ef21c6-de51-41a3-a1b3-186778c38da2,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ef21c6-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.015 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.015 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.016 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.018 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31ef21c6-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.019 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31ef21c6-de, col_values=(('external_ids', {'iface-id': '31ef21c6-de51-41a3-a1b3-186778c38da2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:40:16', 'vm-uuid': '2d00dd71-0bdd-41cb-b673-26806329b43a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:10 np0005603622 NetworkManager[49080]: <info>  [1769846770.0213] manager: (tap31ef21c6-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.027 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.030 221324 INFO os_vif [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:40:16,bridge_name='br-int',has_traffic_filtering=True,id=31ef21c6-de51-41a3-a1b3-186778c38da2,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ef21c6-de')#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.119 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.119 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.120 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No VIF found with MAC fa:16:3e:97:40:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.120 221324 INFO nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Using config drive#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.147 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:10.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.766 221324 INFO nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Creating config drive at /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/disk.config#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.772 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpj1a29wgx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.906 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpj1a29wgx" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.937 221324 DEBUG nova.storage.rbd_utils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 2d00dd71-0bdd-41cb-b673-26806329b43a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:10 np0005603622 nova_compute[221301]: 2026-01-31 08:06:10.940 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/disk.config 2d00dd71-0bdd-41cb-b673-26806329b43a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e245 e245: 3 total, 3 up, 3 in
Jan 31 03:06:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:11.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:11 np0005603622 nova_compute[221301]: 2026-01-31 08:06:11.483 221324 DEBUG oslo_concurrency.processutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/disk.config 2d00dd71-0bdd-41cb-b673-26806329b43a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:11 np0005603622 nova_compute[221301]: 2026-01-31 08:06:11.485 221324 INFO nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Deleting local config drive /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a/disk.config because it was imported into RBD.#033[00m
Jan 31 03:06:11 np0005603622 kernel: tap31ef21c6-de: entered promiscuous mode
Jan 31 03:06:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:11Z|00206|binding|INFO|Claiming lport 31ef21c6-de51-41a3-a1b3-186778c38da2 for this chassis.
Jan 31 03:06:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:11Z|00207|binding|INFO|31ef21c6-de51-41a3-a1b3-186778c38da2: Claiming fa:16:3e:97:40:16 10.100.0.9
Jan 31 03:06:11 np0005603622 NetworkManager[49080]: <info>  [1769846771.5488] manager: (tap31ef21c6-de): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 31 03:06:11 np0005603622 nova_compute[221301]: 2026-01-31 08:06:11.549 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:11Z|00208|binding|INFO|Setting lport 31ef21c6-de51-41a3-a1b3-186778c38da2 ovn-installed in OVS
Jan 31 03:06:11 np0005603622 nova_compute[221301]: 2026-01-31 08:06:11.561 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:11Z|00209|binding|INFO|Setting lport 31ef21c6-de51-41a3-a1b3-186778c38da2 up in Southbound
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.564 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:40:16 10.100.0.9'], port_security=['fa:16:3e:97:40:16 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2d00dd71-0bdd-41cb-b673-26806329b43a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=31ef21c6-de51-41a3-a1b3-186778c38da2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.566 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 31ef21c6-de51-41a3-a1b3-186778c38da2 in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 bound to our chassis#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.569 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6#033[00m
Jan 31 03:06:11 np0005603622 systemd-udevd[244905]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:11 np0005603622 systemd-machined[190406]: New machine qemu-27-instance-00000042.
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.585 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f3fe5f2f-c885-4960-952b-9a305381801e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.585 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1c6810e-e1 in ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.588 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1c6810e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.589 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[18e70697-248b-4218-bfe4-58a514750432]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.589 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ee18bfee-306f-4b23-ae43-c9d4ac500f93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 systemd[1]: Started Virtual Machine qemu-27-instance-00000042.
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.600 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c36b2fd5-127e-4684-9e46-bec130c051e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 NetworkManager[49080]: <info>  [1769846771.6019] device (tap31ef21c6-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:06:11 np0005603622 NetworkManager[49080]: <info>  [1769846771.6025] device (tap31ef21c6-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.615 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a08c01a1-11ac-4ac1-b752-bb92394ed244]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.643 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea92efd-b98d-4ed4-a0a0-fb9865b65cf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 NetworkManager[49080]: <info>  [1769846771.6501] manager: (tapc1c6810e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Jan 31 03:06:11 np0005603622 systemd-udevd[244908]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.648 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0ddef1-ee2a-4747-a246-ce882b326bc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.687 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9580334c-963b-41f6-b142-79ecae700b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.692 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5c3a54b4-c3e6-426e-949b-d23adc681415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 NetworkManager[49080]: <info>  [1769846771.7165] device (tapc1c6810e-e0): carrier: link connected
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.723 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a86c25-b2d4-4a28-94cd-0e536222d880]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.740 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0aecfd1d-6730-451c-8d17-e9af9b516672]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602305, 'reachable_time': 39203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244937, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.759 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca5b8d0-f55f-4387-b219-9ff9ad3f37b8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:9781'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602305, 'tstamp': 602305}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244938, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.773 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3530285d-1365-47dd-851f-5280229f07cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602305, 'reachable_time': 39203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244939, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.798 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cd21916d-feea-4512-8257-db7734367b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.845 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1326bd95-33fc-4317-83e7-1688e3bfe35b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.847 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.847 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.848 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1c6810e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:11 np0005603622 NetworkManager[49080]: <info>  [1769846771.8523] manager: (tapc1c6810e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Jan 31 03:06:11 np0005603622 nova_compute[221301]: 2026-01-31 08:06:11.851 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:11 np0005603622 kernel: tapc1c6810e-e0: entered promiscuous mode
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.861 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1c6810e-e0, col_values=(('external_ids', {'iface-id': '937542c1-ab1e-4312-ab3a-ee4483fcdf7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:11Z|00210|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:06:11 np0005603622 nova_compute[221301]: 2026-01-31 08:06:11.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.866 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:06:11 np0005603622 nova_compute[221301]: 2026-01-31 08:06:11.868 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.867 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ba4a37-d7ac-4f09-84f3-79b385700df6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.869 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:06:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:11.870 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'env', 'PROCESS_TAG=haproxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:06:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e246 e246: 3 total, 3 up, 3 in
Jan 31 03:06:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:12.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:12 np0005603622 podman[244971]: 2026-01-31 08:06:12.250320281 +0000 UTC m=+0.067525207 container create 150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:06:12 np0005603622 systemd[1]: Started libpod-conmon-150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530.scope.
Jan 31 03:06:12 np0005603622 podman[244971]: 2026-01-31 08:06:12.220696554 +0000 UTC m=+0.037901500 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:06:12 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:06:12 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c730c5f2ea43097e89ae68266816c7e9966df08184c5375d54ce59a95ca4e165/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:06:12 np0005603622 podman[244971]: 2026-01-31 08:06:12.351566263 +0000 UTC m=+0.168771229 container init 150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:06:12 np0005603622 podman[244971]: 2026-01-31 08:06:12.363180566 +0000 UTC m=+0.180385502 container start 150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 03:06:12 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244987]: [NOTICE]   (244991) : New worker (244993) forked
Jan 31 03:06:12 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244987]: [NOTICE]   (244991) : Loading success.
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.050 221324 DEBUG nova.compute.manager [req-48061903-45d3-4705-a13a-171801fbefe1 req-70cc2e0f-cbab-4f4f-b290-7a4ed698b75a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received event network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.052 221324 DEBUG oslo_concurrency.lockutils [req-48061903-45d3-4705-a13a-171801fbefe1 req-70cc2e0f-cbab-4f4f-b290-7a4ed698b75a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.052 221324 DEBUG oslo_concurrency.lockutils [req-48061903-45d3-4705-a13a-171801fbefe1 req-70cc2e0f-cbab-4f4f-b290-7a4ed698b75a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.053 221324 DEBUG oslo_concurrency.lockutils [req-48061903-45d3-4705-a13a-171801fbefe1 req-70cc2e0f-cbab-4f4f-b290-7a4ed698b75a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.053 221324 DEBUG nova.compute.manager [req-48061903-45d3-4705-a13a-171801fbefe1 req-70cc2e0f-cbab-4f4f-b290-7a4ed698b75a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Processing event network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.054 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846773.0468612, 2d00dd71-0bdd-41cb-b673-26806329b43a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.055 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] VM Started (Lifecycle Event)#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.057 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.061 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.066 221324 INFO nova.virt.libvirt.driver [-] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Instance spawned successfully.#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.066 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.109 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.118 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.123 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.123 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.124 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.125 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.125 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.126 221324 DEBUG nova.virt.libvirt.driver [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.155 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.156 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846773.0516303, 2d00dd71-0bdd-41cb-b673-26806329b43a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.156 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.205 221324 INFO nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Took 10.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.206 221324 DEBUG nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.217 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.224 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846773.0599527, 2d00dd71-0bdd-41cb-b673-26806329b43a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.225 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.257 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.260 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:06:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:13.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.282 221324 INFO nova.compute.manager [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Took 11.17 seconds to build instance.#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.302 221324 DEBUG oslo_concurrency.lockutils [None req-c84d3167-38cd-4232-b6cf-48b9779894cf 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.318 221324 DEBUG nova.network.neutron [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Updated VIF entry in instance network info cache for port 31ef21c6-de51-41a3-a1b3-186778c38da2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.319 221324 DEBUG nova.network.neutron [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Updating instance_info_cache with network_info: [{"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.337 221324 DEBUG oslo_concurrency.lockutils [req-e7890dd3-73b8-46bf-99c5-7976d4c88b15 req-eaf64839-3bf3-4303-92be-1e2c7bd5d6e1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-2d00dd71-0bdd-41cb-b673-26806329b43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:13 np0005603622 nova_compute[221301]: 2026-01-31 08:06:13.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:14.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:15 np0005603622 nova_compute[221301]: 2026-01-31 08:06:15.021 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:15 np0005603622 nova_compute[221301]: 2026-01-31 08:06:15.254 221324 DEBUG nova.compute.manager [req-dd4eb973-648d-4dbf-8551-83d1906fc4c3 req-6b2751df-b282-4191-94af-84aaade4ac60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received event network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:15 np0005603622 nova_compute[221301]: 2026-01-31 08:06:15.254 221324 DEBUG oslo_concurrency.lockutils [req-dd4eb973-648d-4dbf-8551-83d1906fc4c3 req-6b2751df-b282-4191-94af-84aaade4ac60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:15 np0005603622 nova_compute[221301]: 2026-01-31 08:06:15.255 221324 DEBUG oslo_concurrency.lockutils [req-dd4eb973-648d-4dbf-8551-83d1906fc4c3 req-6b2751df-b282-4191-94af-84aaade4ac60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:15 np0005603622 nova_compute[221301]: 2026-01-31 08:06:15.255 221324 DEBUG oslo_concurrency.lockutils [req-dd4eb973-648d-4dbf-8551-83d1906fc4c3 req-6b2751df-b282-4191-94af-84aaade4ac60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:15 np0005603622 nova_compute[221301]: 2026-01-31 08:06:15.255 221324 DEBUG nova.compute.manager [req-dd4eb973-648d-4dbf-8551-83d1906fc4c3 req-6b2751df-b282-4191-94af-84aaade4ac60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] No waiting events found dispatching network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:15 np0005603622 nova_compute[221301]: 2026-01-31 08:06:15.255 221324 WARNING nova.compute.manager [req-dd4eb973-648d-4dbf-8551-83d1906fc4c3 req-6b2751df-b282-4191-94af-84aaade4ac60 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received unexpected event network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:06:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:15.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:17 np0005603622 podman[245044]: 2026-01-31 08:06:17.159746793 +0000 UTC m=+0.086675552 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 31 03:06:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:17.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 e247: 3 total, 3 up, 3 in
Jan 31 03:06:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:18.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:18 np0005603622 nova_compute[221301]: 2026-01-31 08:06:18.447 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:18 np0005603622 nova_compute[221301]: 2026-01-31 08:06:18.447 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:06:18 np0005603622 nova_compute[221301]: 2026-01-31 08:06:18.609 221324 DEBUG oslo_concurrency.lockutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:18 np0005603622 nova_compute[221301]: 2026-01-31 08:06:18.609 221324 DEBUG oslo_concurrency.lockutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:18 np0005603622 nova_compute[221301]: 2026-01-31 08:06:18.625 221324 DEBUG nova.objects.instance [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'flavor' on Instance uuid 2d00dd71-0bdd-41cb-b673-26806329b43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:18 np0005603622 nova_compute[221301]: 2026-01-31 08:06:18.678 221324 DEBUG oslo_concurrency.lockutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:18 np0005603622 nova_compute[221301]: 2026-01-31 08:06:18.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.173 221324 DEBUG oslo_concurrency.lockutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.175 221324 DEBUG oslo_concurrency.lockutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.175 221324 INFO nova.compute.manager [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Attaching volume cfc0e54a-67e2-4096-bf5e-8287b1e5add7 to /dev/vdb#033[00m
Jan 31 03:06:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:19.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.422 221324 DEBUG os_brick.utils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.424 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.437 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.438 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[fa729857-967b-4225-9977-73bb7f09eb78]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.439 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.448 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.448 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[ade8fac9-5380-41d4-a6fe-36ef0c512289]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.451 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.462 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.462 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[871d0a95-c2b3-43c4-b1fa-0e82e5d25e99]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.464 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[9894cc26-6e14-449b-b657-a847a3783e48]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.465 221324 DEBUG oslo_concurrency.processutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.489 221324 DEBUG oslo_concurrency.processutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.491 221324 DEBUG os_brick.initiator.connectors.lightos [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.492 221324 DEBUG os_brick.initiator.connectors.lightos [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.492 221324 DEBUG os_brick.initiator.connectors.lightos [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.493 221324 DEBUG os_brick.utils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] <== get_connector_properties: return (69ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:06:19 np0005603622 nova_compute[221301]: 2026-01-31 08:06:19.493 221324 DEBUG nova.virt.block_device [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Updating existing volume attachment record: 106b3c9a-2ab8-4dc4-9402-36555746d006 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.024 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:20 np0005603622 podman[245079]: 2026-01-31 08:06:20.186999297 +0000 UTC m=+0.105013415 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:06:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:20.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.655 221324 DEBUG nova.objects.instance [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'flavor' on Instance uuid 2d00dd71-0bdd-41cb-b673-26806329b43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.683 221324 DEBUG nova.virt.libvirt.driver [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Attempting to attach volume cfc0e54a-67e2-4096-bf5e-8287b1e5add7 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.686 221324 DEBUG nova.virt.libvirt.guest [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:06:20 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-cfc0e54a-67e2-4096-bf5e-8287b1e5add7">
Jan 31 03:06:20 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:06:20 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:06:20 np0005603622 nova_compute[221301]:  <serial>cfc0e54a-67e2-4096-bf5e-8287b1e5add7</serial>
Jan 31 03:06:20 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:06:20 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:06:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.854 221324 DEBUG nova.virt.libvirt.driver [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.855 221324 DEBUG nova.virt.libvirt.driver [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.855 221324 DEBUG nova.virt.libvirt.driver [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:20 np0005603622 nova_compute[221301]: 2026-01-31 08:06:20.856 221324 DEBUG nova.virt.libvirt.driver [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No VIF found with MAC fa:16:3e:97:40:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:21 np0005603622 nova_compute[221301]: 2026-01-31 08:06:21.270 221324 DEBUG oslo_concurrency.lockutils [None req-010cebc9-06b0-4be2-9486-254867062b8b 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:21.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:22.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.523 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.524 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.525 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.526 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.526 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.528 221324 INFO nova.compute.manager [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Terminating instance#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.530 221324 DEBUG nova.compute.manager [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:06:22 np0005603622 kernel: tap31ef21c6-de (unregistering): left promiscuous mode
Jan 31 03:06:22 np0005603622 NetworkManager[49080]: <info>  [1769846782.6342] device (tap31ef21c6-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:06:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:22Z|00211|binding|INFO|Releasing lport 31ef21c6-de51-41a3-a1b3-186778c38da2 from this chassis (sb_readonly=0)
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.634 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:22Z|00212|binding|INFO|Setting lport 31ef21c6-de51-41a3-a1b3-186778c38da2 down in Southbound
Jan 31 03:06:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:22Z|00213|binding|INFO|Removing iface tap31ef21c6-de ovn-installed in OVS
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.638 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.642 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.661 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:40:16 10.100.0.9'], port_security=['fa:16:3e:97:40:16 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2d00dd71-0bdd-41cb-b673-26806329b43a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=31ef21c6-de51-41a3-a1b3-186778c38da2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.663 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 31ef21c6-de51-41a3-a1b3-186778c38da2 in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 unbound from our chassis#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.666 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.667 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a55ab444-0984-42a0-8c57-8cd19f6ec0d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.668 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace which is not needed anymore#033[00m
Jan 31 03:06:22 np0005603622 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000042.scope: Deactivated successfully.
Jan 31 03:06:22 np0005603622 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000042.scope: Consumed 11.127s CPU time.
Jan 31 03:06:22 np0005603622 systemd-machined[190406]: Machine qemu-27-instance-00000042 terminated.
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.771 221324 INFO nova.virt.libvirt.driver [-] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Instance destroyed successfully.#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.772 221324 DEBUG nova.objects.instance [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'resources' on Instance uuid 2d00dd71-0bdd-41cb-b673-26806329b43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:22 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244987]: [NOTICE]   (244991) : haproxy version is 2.8.14-c23fe91
Jan 31 03:06:22 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244987]: [NOTICE]   (244991) : path to executable is /usr/sbin/haproxy
Jan 31 03:06:22 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244987]: [WARNING]  (244991) : Exiting Master process...
Jan 31 03:06:22 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244987]: [ALERT]    (244991) : Current worker (244993) exited with code 143 (Terminated)
Jan 31 03:06:22 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[244987]: [WARNING]  (244991) : All workers exited. Exiting... (0)
Jan 31 03:06:22 np0005603622 systemd[1]: libpod-150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530.scope: Deactivated successfully.
Jan 31 03:06:22 np0005603622 podman[245144]: 2026-01-31 08:06:22.809522616 +0000 UTC m=+0.049587285 container died 150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.811 221324 DEBUG nova.virt.libvirt.vif [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:06:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-418575325',display_name='tempest-DeleteServersTestJSON-server-418575325',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-418575325',id=66,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-0pkr9n57',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:13Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=2d00dd71-0bdd-41cb-b673-26806329b43a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.812 221324 DEBUG nova.network.os_vif_util [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "31ef21c6-de51-41a3-a1b3-186778c38da2", "address": "fa:16:3e:97:40:16", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ef21c6-de", "ovs_interfaceid": "31ef21c6-de51-41a3-a1b3-186778c38da2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.813 221324 DEBUG nova.network.os_vif_util [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:40:16,bridge_name='br-int',has_traffic_filtering=True,id=31ef21c6-de51-41a3-a1b3-186778c38da2,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ef21c6-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.814 221324 DEBUG os_vif [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:40:16,bridge_name='br-int',has_traffic_filtering=True,id=31ef21c6-de51-41a3-a1b3-186778c38da2,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ef21c6-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.816 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.817 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31ef21c6-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.819 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.821 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.822 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.825 221324 INFO os_vif [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:40:16,bridge_name='br-int',has_traffic_filtering=True,id=31ef21c6-de51-41a3-a1b3-186778c38da2,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ef21c6-de')#033[00m
Jan 31 03:06:22 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530-userdata-shm.mount: Deactivated successfully.
Jan 31 03:06:22 np0005603622 systemd[1]: var-lib-containers-storage-overlay-c730c5f2ea43097e89ae68266816c7e9966df08184c5375d54ce59a95ca4e165-merged.mount: Deactivated successfully.
Jan 31 03:06:22 np0005603622 podman[245144]: 2026-01-31 08:06:22.851190907 +0000 UTC m=+0.091255596 container cleanup 150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:06:22 np0005603622 systemd[1]: libpod-conmon-150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530.scope: Deactivated successfully.
Jan 31 03:06:22 np0005603622 podman[245195]: 2026-01-31 08:06:22.926730828 +0000 UTC m=+0.056004677 container remove 150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.934 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0cbf2650-c657-4b35-a709-029eb3d7ea56]: (4, ('Sat Jan 31 08:06:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530)\n150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530\nSat Jan 31 08:06:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530)\n150094d1750104d4aa2dc6069e83c5f15e38744d8e63d476728e628bee83f530\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.936 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[225b0f2f-0ee0-4b34-84ac-67a69d24df5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.937 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.939 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 kernel: tapc1c6810e-e0: left promiscuous mode
Jan 31 03:06:22 np0005603622 nova_compute[221301]: 2026-01-31 08:06:22.945 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.949 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[30f09687-a615-4f1a-9747-fc6a0d6e02d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.964 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9c918111-93bc-4f77-b05c-2725321ead11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.966 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd945bf-ce07-4a34-8e9e-eab6f9623fab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.984 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a35125a3-8241-43f0-b965-8c90a64e0e72]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602297, 'reachable_time': 18497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245213, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:22 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc1c6810e\x2dec8f\x2d43f3\x2da3c6\x2d22606d9416b6.mount: Deactivated successfully.
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.988 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:06:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:22.989 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d50038-e0a8-4e58-9f8d-c65ac99bbe06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.182 221324 DEBUG nova.compute.manager [req-c1fe5904-3719-421e-8356-1f4d82820760 req-14cead53-8b0a-485d-be68-499fa6ba3421 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received event network-vif-unplugged-31ef21c6-de51-41a3-a1b3-186778c38da2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.183 221324 DEBUG oslo_concurrency.lockutils [req-c1fe5904-3719-421e-8356-1f4d82820760 req-14cead53-8b0a-485d-be68-499fa6ba3421 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.184 221324 DEBUG oslo_concurrency.lockutils [req-c1fe5904-3719-421e-8356-1f4d82820760 req-14cead53-8b0a-485d-be68-499fa6ba3421 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.184 221324 DEBUG oslo_concurrency.lockutils [req-c1fe5904-3719-421e-8356-1f4d82820760 req-14cead53-8b0a-485d-be68-499fa6ba3421 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.184 221324 DEBUG nova.compute.manager [req-c1fe5904-3719-421e-8356-1f4d82820760 req-14cead53-8b0a-485d-be68-499fa6ba3421 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] No waiting events found dispatching network-vif-unplugged-31ef21c6-de51-41a3-a1b3-186778c38da2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.185 221324 DEBUG nova.compute.manager [req-c1fe5904-3719-421e-8356-1f4d82820760 req-14cead53-8b0a-485d-be68-499fa6ba3421 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received event network-vif-unplugged-31ef21c6-de51-41a3-a1b3-186778c38da2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:06:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:23.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.297 221324 INFO nova.virt.libvirt.driver [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Deleting instance files /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a_del#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.298 221324 INFO nova.virt.libvirt.driver [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Deletion of /var/lib/nova/instances/2d00dd71-0bdd-41cb-b673-26806329b43a_del complete#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.449 221324 INFO nova.compute.manager [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Took 0.92 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.450 221324 DEBUG oslo.service.loopingcall [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.450 221324 DEBUG nova.compute.manager [-] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.450 221324 DEBUG nova.network.neutron [-] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.834 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:23 np0005603622 nova_compute[221301]: 2026-01-31 08:06:23.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:24.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:24 np0005603622 nova_compute[221301]: 2026-01-31 08:06:24.855 221324 DEBUG nova.network.neutron [-] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:24 np0005603622 nova_compute[221301]: 2026-01-31 08:06:24.923 221324 INFO nova.compute.manager [-] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Took 1.47 seconds to deallocate network for instance.#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.041 221324 DEBUG nova.compute.manager [req-92c00dd3-eef4-4752-add9-2d98a9460cfb req-5cfe015d-2eb4-4de2-950a-0b7003f60199 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received event network-vif-deleted-31ef21c6-de51-41a3-a1b3-186778c38da2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.189 221324 INFO nova.compute.manager [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Took 0.27 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:06:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:25.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.296 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.297 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.389 221324 DEBUG nova.compute.manager [req-2827ac30-8c3b-467e-9260-aeddc46eeabc req-b9615a06-7d5b-4028-b0b8-249dc9689ca2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received event network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.390 221324 DEBUG oslo_concurrency.lockutils [req-2827ac30-8c3b-467e-9260-aeddc46eeabc req-b9615a06-7d5b-4028-b0b8-249dc9689ca2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.391 221324 DEBUG oslo_concurrency.lockutils [req-2827ac30-8c3b-467e-9260-aeddc46eeabc req-b9615a06-7d5b-4028-b0b8-249dc9689ca2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.391 221324 DEBUG oslo_concurrency.lockutils [req-2827ac30-8c3b-467e-9260-aeddc46eeabc req-b9615a06-7d5b-4028-b0b8-249dc9689ca2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.392 221324 DEBUG nova.compute.manager [req-2827ac30-8c3b-467e-9260-aeddc46eeabc req-b9615a06-7d5b-4028-b0b8-249dc9689ca2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] No waiting events found dispatching network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.393 221324 WARNING nova.compute.manager [req-2827ac30-8c3b-467e-9260-aeddc46eeabc req-b9615a06-7d5b-4028-b0b8-249dc9689ca2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Received unexpected event network-vif-plugged-31ef21c6-de51-41a3-a1b3-186778c38da2 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.394 221324 DEBUG oslo_concurrency.processutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2229221420' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.846 221324 DEBUG oslo_concurrency.processutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.853 221324 DEBUG nova.compute.provider_tree [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.877 221324 DEBUG nova.scheduler.client.report [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.908 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:25 np0005603622 nova_compute[221301]: 2026-01-31 08:06:25.981 221324 INFO nova.scheduler.client.report [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Deleted allocations for instance 2d00dd71-0bdd-41cb-b673-26806329b43a#033[00m
Jan 31 03:06:26 np0005603622 nova_compute[221301]: 2026-01-31 08:06:26.097 221324 DEBUG oslo_concurrency.lockutils [None req-b5c419b1-7229-42e8-83c4-84a53be484f0 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "2d00dd71-0bdd-41cb-b673-26806329b43a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:26.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:27.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:27 np0005603622 nova_compute[221301]: 2026-01-31 08:06:27.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:27 np0005603622 nova_compute[221301]: 2026-01-31 08:06:27.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:27 np0005603622 nova_compute[221301]: 2026-01-31 08:06:27.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:27 np0005603622 nova_compute[221301]: 2026-01-31 08:06:27.820 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:28.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:28 np0005603622 nova_compute[221301]: 2026-01-31 08:06:28.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:28 np0005603622 nova_compute[221301]: 2026-01-31 08:06:28.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:28 np0005603622 nova_compute[221301]: 2026-01-31 08:06:28.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:29.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:29 np0005603622 nova_compute[221301]: 2026-01-31 08:06:29.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:29 np0005603622 nova_compute[221301]: 2026-01-31 08:06:29.421 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:29 np0005603622 nova_compute[221301]: 2026-01-31 08:06:29.422 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:29 np0005603622 nova_compute[221301]: 2026-01-31 08:06:29.422 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:29 np0005603622 nova_compute[221301]: 2026-01-31 08:06:29.422 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:06:29 np0005603622 nova_compute[221301]: 2026-01-31 08:06:29.423 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1373946310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:29 np0005603622 nova_compute[221301]: 2026-01-31 08:06:29.921 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.125 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.127 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4675MB free_disk=20.94271469116211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.128 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.128 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.259 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.260 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:06:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:30.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.301 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1437474936' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.782 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.790 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.815 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.855 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:06:30 np0005603622 nova_compute[221301]: 2026-01-31 08:06:30.856 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:30.860 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:30.861 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:30.861 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:31.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:32.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:32 np0005603622 nova_compute[221301]: 2026-01-31 08:06:32.823 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:32 np0005603622 nova_compute[221301]: 2026-01-31 08:06:32.857 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:32 np0005603622 nova_compute[221301]: 2026-01-31 08:06:32.857 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:06:32 np0005603622 nova_compute[221301]: 2026-01-31 08:06:32.857 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:06:32 np0005603622 nova_compute[221301]: 2026-01-31 08:06:32.874 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:06:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:33.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:33 np0005603622 nova_compute[221301]: 2026-01-31 08:06:33.942 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:34.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:34 np0005603622 nova_compute[221301]: 2026-01-31 08:06:34.409 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:06:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:35.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:36.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.026 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.027 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.047 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.144 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.145 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.152 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.153 221324 INFO nova.compute.claims [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.287 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:37.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:06:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3789667094' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.769 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846782.7676084, 2d00dd71-0bdd-41cb-b673-26806329b43a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.770 221324 INFO nova.compute.manager [-] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.773 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.782 221324 DEBUG nova.compute.provider_tree [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.815 221324 DEBUG nova.compute.manager [None req-f2abec00-7351-49ae-916b-d3076fcb2f9f - - - - - -] [instance: 2d00dd71-0bdd-41cb-b673-26806329b43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.816 221324 DEBUG nova.scheduler.client.report [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.825 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.894 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.896 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.945 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.946 221324 DEBUG nova.network.neutron [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.964 221324 INFO nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:06:37 np0005603622 nova_compute[221301]: 2026-01-31 08:06:37.984 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.120 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.122 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.122 221324 INFO nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Creating image(s)#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.162 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.201 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.238 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.244 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:38.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.318 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.320 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.320 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.321 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.349 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.354 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e7dde2ad-e161-41d0-b160-733cde37c6a4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.943 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:38 np0005603622 nova_compute[221301]: 2026-01-31 08:06:38.954 221324 DEBUG nova.policy [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16d731f5875748ca9b8036b2ba061042', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3469c253459e40e39dcf5bcb6a32008f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:06:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:39.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:39 np0005603622 nova_compute[221301]: 2026-01-31 08:06:39.528 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:39.528 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:39.529 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:06:39 np0005603622 nova_compute[221301]: 2026-01-31 08:06:39.933 221324 DEBUG nova.network.neutron [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Successfully created port: 21d4cfd3-5ed0-45c1-9805-ac82d14d631f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:06:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:40.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:40 np0005603622 nova_compute[221301]: 2026-01-31 08:06:40.467 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e7dde2ad-e161-41d0-b160-733cde37c6a4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:40 np0005603622 nova_compute[221301]: 2026-01-31 08:06:40.543 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] resizing rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:06:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:41.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:41 np0005603622 nova_compute[221301]: 2026-01-31 08:06:41.497 221324 DEBUG nova.objects.instance [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'migration_context' on Instance uuid e7dde2ad-e161-41d0-b160-733cde37c6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:41 np0005603622 nova_compute[221301]: 2026-01-31 08:06:41.554 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:06:41 np0005603622 nova_compute[221301]: 2026-01-31 08:06:41.556 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Ensure instance console log exists: /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:06:41 np0005603622 nova_compute[221301]: 2026-01-31 08:06:41.556 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:41 np0005603622 nova_compute[221301]: 2026-01-31 08:06:41.557 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:41 np0005603622 nova_compute[221301]: 2026-01-31 08:06:41.557 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:42.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:42 np0005603622 nova_compute[221301]: 2026-01-31 08:06:42.829 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:42 np0005603622 nova_compute[221301]: 2026-01-31 08:06:42.901 221324 DEBUG nova.network.neutron [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Successfully updated port: 21d4cfd3-5ed0-45c1-9805-ac82d14d631f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:06:42 np0005603622 nova_compute[221301]: 2026-01-31 08:06:42.933 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "refresh_cache-e7dde2ad-e161-41d0-b160-733cde37c6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:42 np0005603622 nova_compute[221301]: 2026-01-31 08:06:42.933 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquired lock "refresh_cache-e7dde2ad-e161-41d0-b160-733cde37c6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:42 np0005603622 nova_compute[221301]: 2026-01-31 08:06:42.933 221324 DEBUG nova.network.neutron [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:06:43 np0005603622 nova_compute[221301]: 2026-01-31 08:06:43.110 221324 DEBUG nova.compute.manager [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received event network-changed-21d4cfd3-5ed0-45c1-9805-ac82d14d631f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:43 np0005603622 nova_compute[221301]: 2026-01-31 08:06:43.110 221324 DEBUG nova.compute.manager [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Refreshing instance network info cache due to event network-changed-21d4cfd3-5ed0-45c1-9805-ac82d14d631f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:06:43 np0005603622 nova_compute[221301]: 2026-01-31 08:06:43.111 221324 DEBUG oslo_concurrency.lockutils [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e7dde2ad-e161-41d0-b160-733cde37c6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:06:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:06:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:43.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:06:43 np0005603622 nova_compute[221301]: 2026-01-31 08:06:43.922 221324 DEBUG nova.network.neutron [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:06:43 np0005603622 nova_compute[221301]: 2026-01-31 08:06:43.945 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:44.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:45.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:46.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.784 221324 DEBUG nova.network.neutron [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Updating instance_info_cache with network_info: [{"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.811 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Releasing lock "refresh_cache-e7dde2ad-e161-41d0-b160-733cde37c6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.812 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Instance network_info: |[{"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.812 221324 DEBUG oslo_concurrency.lockutils [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e7dde2ad-e161-41d0-b160-733cde37c6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.813 221324 DEBUG nova.network.neutron [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Refreshing network info cache for port 21d4cfd3-5ed0-45c1-9805-ac82d14d631f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.818 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Start _get_guest_xml network_info=[{"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.824 221324 WARNING nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.832 221324 DEBUG nova.virt.libvirt.host [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.833 221324 DEBUG nova.virt.libvirt.host [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.850 221324 DEBUG nova.virt.libvirt.host [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.851 221324 DEBUG nova.virt.libvirt.host [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.853 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.853 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.854 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.854 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.855 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.855 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.856 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.856 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.857 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.857 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.857 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.858 221324 DEBUG nova.virt.hardware [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:06:46 np0005603622 nova_compute[221301]: 2026-01-31 08:06:46.864 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:06:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/566028856' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.279 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.311 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.315 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:47.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:47.533 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:06:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3049122430' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.829 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.831 221324 DEBUG nova.virt.libvirt.vif [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:06:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-569916801',display_name='tempest-DeleteServersTestJSON-server-569916801',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-569916801',id=68,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-p2522h91',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-8
08715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:06:38Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e7dde2ad-e161-41d0-b160-733cde37c6a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.831 221324 DEBUG nova.network.os_vif_util [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.832 221324 DEBUG nova.network.os_vif_util [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:79:af,bridge_name='br-int',has_traffic_filtering=True,id=21d4cfd3-5ed0-45c1-9805-ac82d14d631f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21d4cfd3-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.833 221324 DEBUG nova.objects.instance [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'pci_devices' on Instance uuid e7dde2ad-e161-41d0-b160-733cde37c6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.835 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.859 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <uuid>e7dde2ad-e161-41d0-b160-733cde37c6a4</uuid>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <name>instance-00000044</name>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <nova:name>tempest-DeleteServersTestJSON-server-569916801</nova:name>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:06:46</nova:creationTime>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:user uuid="16d731f5875748ca9b8036b2ba061042">tempest-DeleteServersTestJSON-808715310-project-member</nova:user>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:project uuid="3469c253459e40e39dcf5bcb6a32008f">tempest-DeleteServersTestJSON-808715310</nova:project>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <nova:port uuid="21d4cfd3-5ed0-45c1-9805-ac82d14d631f">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <entry name="serial">e7dde2ad-e161-41d0-b160-733cde37c6a4</entry>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <entry name="uuid">e7dde2ad-e161-41d0-b160-733cde37c6a4</entry>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e7dde2ad-e161-41d0-b160-733cde37c6a4_disk">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e7dde2ad-e161-41d0-b160-733cde37c6a4_disk.config">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:d3:79:af"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <target dev="tap21d4cfd3-5e"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/console.log" append="off"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:06:47 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:06:47 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:06:47 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:06:47 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.860 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Preparing to wait for external event network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.860 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.860 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.861 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.861 221324 DEBUG nova.virt.libvirt.vif [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:06:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-569916801',display_name='tempest-DeleteServersTestJSON-server-569916801',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-569916801',id=68,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-p2522h91',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServers
TestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:06:38Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e7dde2ad-e161-41d0-b160-733cde37c6a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.861 221324 DEBUG nova.network.os_vif_util [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.862 221324 DEBUG nova.network.os_vif_util [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:79:af,bridge_name='br-int',has_traffic_filtering=True,id=21d4cfd3-5ed0-45c1-9805-ac82d14d631f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21d4cfd3-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.863 221324 DEBUG os_vif [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:79:af,bridge_name='br-int',has_traffic_filtering=True,id=21d4cfd3-5ed0-45c1-9805-ac82d14d631f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21d4cfd3-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.863 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.864 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.867 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.867 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21d4cfd3-5e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.868 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap21d4cfd3-5e, col_values=(('external_ids', {'iface-id': '21d4cfd3-5ed0-45c1-9805-ac82d14d631f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:79:af', 'vm-uuid': 'e7dde2ad-e161-41d0-b160-733cde37c6a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.870 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.871 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:06:47 np0005603622 NetworkManager[49080]: <info>  [1769846807.8737] manager: (tap21d4cfd3-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.879 221324 INFO os_vif [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:79:af,bridge_name='br-int',has_traffic_filtering=True,id=21d4cfd3-5ed0-45c1-9805-ac82d14d631f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21d4cfd3-5e')#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.946 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.947 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.947 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No VIF found with MAC fa:16:3e:d3:79:af, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.948 221324 INFO nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Using config drive#033[00m
Jan 31 03:06:47 np0005603622 nova_compute[221301]: 2026-01-31 08:06:47.978 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:48 np0005603622 podman[245552]: 2026-01-31 08:06:48.195407744 +0000 UTC m=+0.109671230 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:06:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:48.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:48 np0005603622 nova_compute[221301]: 2026-01-31 08:06:48.936 221324 INFO nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Creating config drive at /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/disk.config#033[00m
Jan 31 03:06:48 np0005603622 nova_compute[221301]: 2026-01-31 08:06:48.942 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxj90s6zw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:48 np0005603622 nova_compute[221301]: 2026-01-31 08:06:48.971 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:49 np0005603622 nova_compute[221301]: 2026-01-31 08:06:49.082 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxj90s6zw" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:49 np0005603622 nova_compute[221301]: 2026-01-31 08:06:49.123 221324 DEBUG nova.storage.rbd_utils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image e7dde2ad-e161-41d0-b160-733cde37c6a4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:06:49 np0005603622 nova_compute[221301]: 2026-01-31 08:06:49.129 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/disk.config e7dde2ad-e161-41d0-b160-733cde37c6a4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:06:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:49.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:49 np0005603622 nova_compute[221301]: 2026-01-31 08:06:49.979 221324 DEBUG nova.network.neutron [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Updated VIF entry in instance network info cache for port 21d4cfd3-5ed0-45c1-9805-ac82d14d631f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:06:49 np0005603622 nova_compute[221301]: 2026-01-31 08:06:49.980 221324 DEBUG nova.network.neutron [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Updating instance_info_cache with network_info: [{"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:06:50 np0005603622 nova_compute[221301]: 2026-01-31 08:06:50.007 221324 DEBUG oslo_concurrency.lockutils [req-132298fd-b8b3-493a-86fa-46cd07c921a3 req-921a6cec-f57e-4c85-92d0-2e8177cbf2c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e7dde2ad-e161-41d0-b160-733cde37c6a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:06:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:50.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:51 np0005603622 podman[245618]: 2026-01-31 08:06:51.128026203 +0000 UTC m=+0.055859774 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.318 221324 DEBUG oslo_concurrency.processutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/disk.config e7dde2ad-e161-41d0-b160-733cde37c6a4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.190s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.319 221324 INFO nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Deleting local config drive /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4/disk.config because it was imported into RBD.#033[00m
Jan 31 03:06:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:51.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:51 np0005603622 kernel: tap21d4cfd3-5e: entered promiscuous mode
Jan 31 03:06:51 np0005603622 NetworkManager[49080]: <info>  [1769846811.3862] manager: (tap21d4cfd3-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Jan 31 03:06:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:51Z|00214|binding|INFO|Claiming lport 21d4cfd3-5ed0-45c1-9805-ac82d14d631f for this chassis.
Jan 31 03:06:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:51Z|00215|binding|INFO|21d4cfd3-5ed0-45c1-9805-ac82d14d631f: Claiming fa:16:3e:d3:79:af 10.100.0.13
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.389 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.394 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.401 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:79:af 10.100.0.13'], port_security=['fa:16:3e:d3:79:af 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e7dde2ad-e161-41d0-b160-733cde37c6a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=21d4cfd3-5ed0-45c1-9805-ac82d14d631f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.403 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 21d4cfd3-5ed0-45c1-9805-ac82d14d631f in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 bound to our chassis#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.406 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6#033[00m
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.415 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:51Z|00216|binding|INFO|Setting lport 21d4cfd3-5ed0-45c1-9805-ac82d14d631f ovn-installed in OVS
Jan 31 03:06:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:51Z|00217|binding|INFO|Setting lport 21d4cfd3-5ed0-45c1-9805-ac82d14d631f up in Southbound
Jan 31 03:06:51 np0005603622 systemd-machined[190406]: New machine qemu-28-instance-00000044.
Jan 31 03:06:51 np0005603622 systemd-udevd[245651]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.419 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.418 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d43fee78-5f4a-4001-8a82-76989c1c499e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.421 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1c6810e-e1 in ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.425 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1c6810e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.425 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb91058-ea27-498c-8bd9-32be17663b99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.426 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[562fe72c-6d9a-491d-b770-0638f6cf47fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 systemd[1]: Started Virtual Machine qemu-28-instance-00000044.
Jan 31 03:06:51 np0005603622 NetworkManager[49080]: <info>  [1769846811.4351] device (tap21d4cfd3-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:06:51 np0005603622 NetworkManager[49080]: <info>  [1769846811.4360] device (tap21d4cfd3-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.445 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[2582e946-d841-4d53-926f-112c8195ca14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.459 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ecfad7e0-505f-454b-80b9-5b27bacdc626]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.489 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6e75ba-2ddd-4c28-94a4-68e9a21ff289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.495 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8a199b65-fa64-4218-adcb-a5931cb331ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 NetworkManager[49080]: <info>  [1769846811.4966] manager: (tapc1c6810e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Jan 31 03:06:51 np0005603622 systemd-udevd[245654]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.528 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccedf88-0c16-4bde-8ba6-0ee9682268ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.533 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[30252ae0-b888-4536-995b-e2227e12b7ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 NetworkManager[49080]: <info>  [1769846811.5590] device (tapc1c6810e-e0): carrier: link connected
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.561 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[93d90d1e-4158-43d6-bd1e-8a5e77d9bfcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.575 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[16b3c06e-eea3-430b-aa2e-777364bbf125]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606289, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245683, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.585 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b45a45a6-f8b3-49db-b7c6-2d521a25eab8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:9781'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 606289, 'tstamp': 606289}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245684, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.598 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6c0a37-54db-439e-a9a5-12ea0879349d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606289, 'reachable_time': 23333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245685, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.624 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[464280f3-3bc3-44dc-97d3-63e6b901320e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.676 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b000c5-1313-43e6-97a0-8c7d0096120b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.679 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.679 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.680 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1c6810e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:51 np0005603622 NetworkManager[49080]: <info>  [1769846811.6825] manager: (tapc1c6810e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 31 03:06:51 np0005603622 kernel: tapc1c6810e-e0: entered promiscuous mode
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.684 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.684 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1c6810e-e0, col_values=(('external_ids', {'iface-id': '937542c1-ab1e-4312-ab3a-ee4483fcdf7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:51Z|00218|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.687 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.688 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[30cc05f5-5537-49d5-a08f-a0a755b775b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.689 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:06:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:51.689 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'env', 'PROCESS_TAG=haproxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:06:51 np0005603622 nova_compute[221301]: 2026-01-31 08:06:51.691 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:52 np0005603622 podman[245732]: 2026-01-31 08:06:52.035862078 +0000 UTC m=+0.062050820 container create 231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:06:52 np0005603622 systemd[1]: Started libpod-conmon-231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551.scope.
Jan 31 03:06:52 np0005603622 podman[245732]: 2026-01-31 08:06:52.004928396 +0000 UTC m=+0.031117198 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:06:52 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:06:52 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a159083fec42647f9ba248da5afedb5e195afaf6a65c709d2497eaf5d53d278/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:06:52 np0005603622 podman[245732]: 2026-01-31 08:06:52.122507538 +0000 UTC m=+0.148696290 container init 231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:06:52 np0005603622 podman[245732]: 2026-01-31 08:06:52.126954038 +0000 UTC m=+0.153142740 container start 231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:06:52 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [NOTICE]   (245772) : New worker (245774) forked
Jan 31 03:06:52 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [NOTICE]   (245772) : Loading success.
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.253 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846812.252551, e7dde2ad-e161-41d0-b160-733cde37c6a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.254 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] VM Started (Lifecycle Event)#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.275 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.280 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846812.2531226, e7dde2ad-e161-41d0-b160-733cde37c6a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.280 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.297 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.301 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:06:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:52.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.319 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.525 221324 DEBUG nova.compute.manager [req-a64643a0-f132-43d0-8b70-07b008e0a75d req-5787a6f0-dfec-49ff-ac76-02793905251c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received event network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.526 221324 DEBUG oslo_concurrency.lockutils [req-a64643a0-f132-43d0-8b70-07b008e0a75d req-5787a6f0-dfec-49ff-ac76-02793905251c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.526 221324 DEBUG oslo_concurrency.lockutils [req-a64643a0-f132-43d0-8b70-07b008e0a75d req-5787a6f0-dfec-49ff-ac76-02793905251c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.527 221324 DEBUG oslo_concurrency.lockutils [req-a64643a0-f132-43d0-8b70-07b008e0a75d req-5787a6f0-dfec-49ff-ac76-02793905251c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.527 221324 DEBUG nova.compute.manager [req-a64643a0-f132-43d0-8b70-07b008e0a75d req-5787a6f0-dfec-49ff-ac76-02793905251c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Processing event network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.528 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.533 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846812.5329118, e7dde2ad-e161-41d0-b160-733cde37c6a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.534 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.537 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.542 221324 INFO nova.virt.libvirt.driver [-] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Instance spawned successfully.#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.542 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.559 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.566 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.569 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.569 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.570 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.570 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.570 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.571 221324 DEBUG nova.virt.libvirt.driver [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.597 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.632 221324 INFO nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Took 14.51 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.633 221324 DEBUG nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.740 221324 INFO nova.compute.manager [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Took 15.63 seconds to build instance.#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.761 221324 DEBUG oslo_concurrency.lockutils [None req-afb6b0a9-ae8b-4900-947e-5a0517aab1f5 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:52 np0005603622 nova_compute[221301]: 2026-01-31 08:06:52.871 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:53.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:53 np0005603622 nova_compute[221301]: 2026-01-31 08:06:53.950 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:54.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.570 221324 INFO nova.compute.manager [None req-98699058-3a9b-4a81-b23a-fdabc2310b4f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Pausing#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.572 221324 DEBUG nova.objects.instance [None req-98699058-3a9b-4a81-b23a-fdabc2310b4f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'flavor' on Instance uuid e7dde2ad-e161-41d0-b160-733cde37c6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.610 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846814.6103065, e7dde2ad-e161-41d0-b160-733cde37c6a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.611 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.613 221324 DEBUG nova.compute.manager [None req-98699058-3a9b-4a81-b23a-fdabc2310b4f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.643 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.647 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.663 221324 DEBUG nova.compute.manager [req-2363705b-9327-43e7-8e36-4cd92afde760 req-25085ab2-6f76-4506-b134-926ad183d2a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received event network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.664 221324 DEBUG oslo_concurrency.lockutils [req-2363705b-9327-43e7-8e36-4cd92afde760 req-25085ab2-6f76-4506-b134-926ad183d2a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.664 221324 DEBUG oslo_concurrency.lockutils [req-2363705b-9327-43e7-8e36-4cd92afde760 req-25085ab2-6f76-4506-b134-926ad183d2a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.665 221324 DEBUG oslo_concurrency.lockutils [req-2363705b-9327-43e7-8e36-4cd92afde760 req-25085ab2-6f76-4506-b134-926ad183d2a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.665 221324 DEBUG nova.compute.manager [req-2363705b-9327-43e7-8e36-4cd92afde760 req-25085ab2-6f76-4506-b134-926ad183d2a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] No waiting events found dispatching network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:54 np0005603622 nova_compute[221301]: 2026-01-31 08:06:54.665 221324 WARNING nova.compute.manager [req-2363705b-9327-43e7-8e36-4cd92afde760 req-25085ab2-6f76-4506-b134-926ad183d2a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received unexpected event network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f for instance with vm_state active and task_state pausing.#033[00m
Jan 31 03:06:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:06:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:55.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.543 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.544 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.545 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.545 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.546 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.547 221324 INFO nova.compute.manager [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Terminating instance#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.548 221324 DEBUG nova.compute.manager [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:06:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:06:55 np0005603622 kernel: tap21d4cfd3-5e (unregistering): left promiscuous mode
Jan 31 03:06:55 np0005603622 NetworkManager[49080]: <info>  [1769846815.8779] device (tap21d4cfd3-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:06:55 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:55Z|00219|binding|INFO|Releasing lport 21d4cfd3-5ed0-45c1-9805-ac82d14d631f from this chassis (sb_readonly=0)
Jan 31 03:06:55 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:55Z|00220|binding|INFO|Setting lport 21d4cfd3-5ed0-45c1-9805-ac82d14d631f down in Southbound
Jan 31 03:06:55 np0005603622 ovn_controller[130109]: 2026-01-31T08:06:55Z|00221|binding|INFO|Removing iface tap21d4cfd3-5e ovn-installed in OVS
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:55.886 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:79:af 10.100.0.13'], port_security=['fa:16:3e:d3:79:af 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e7dde2ad-e161-41d0-b160-733cde37c6a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=21d4cfd3-5ed0-45c1-9805-ac82d14d631f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:06:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:55.888 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 21d4cfd3-5ed0-45c1-9805-ac82d14d631f in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 unbound from our chassis#033[00m
Jan 31 03:06:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:55.890 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.892 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:55.892 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9a5a5f01-c889-4964-a20f-76ceebcce8d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:55.893 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace which is not needed anymore#033[00m
Jan 31 03:06:55 np0005603622 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000044.scope: Deactivated successfully.
Jan 31 03:06:55 np0005603622 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000044.scope: Consumed 2.888s CPU time.
Jan 31 03:06:55 np0005603622 systemd-machined[190406]: Machine qemu-28-instance-00000044 terminated.
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.986 221324 INFO nova.virt.libvirt.driver [-] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Instance destroyed successfully.#033[00m
Jan 31 03:06:55 np0005603622 nova_compute[221301]: 2026-01-31 08:06:55.987 221324 DEBUG nova.objects.instance [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'resources' on Instance uuid e7dde2ad-e161-41d0-b160-733cde37c6a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.009 221324 DEBUG nova.virt.libvirt.vif [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:06:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-569916801',display_name='tempest-DeleteServersTestJSON-server-569916801',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-569916801',id=68,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:06:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-p2522h91',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:06:54Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e7dde2ad-e161-41d0-b160-733cde37c6a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.010 221324 DEBUG nova.network.os_vif_util [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "address": "fa:16:3e:d3:79:af", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21d4cfd3-5e", "ovs_interfaceid": "21d4cfd3-5ed0-45c1-9805-ac82d14d631f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.011 221324 DEBUG nova.network.os_vif_util [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:79:af,bridge_name='br-int',has_traffic_filtering=True,id=21d4cfd3-5ed0-45c1-9805-ac82d14d631f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21d4cfd3-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.012 221324 DEBUG os_vif [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:79:af,bridge_name='br-int',has_traffic_filtering=True,id=21d4cfd3-5ed0-45c1-9805-ac82d14d631f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21d4cfd3-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.014 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.014 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21d4cfd3-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.016 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.021 221324 INFO os_vif [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:79:af,bridge_name='br-int',has_traffic_filtering=True,id=21d4cfd3-5ed0-45c1-9805-ac82d14d631f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21d4cfd3-5e')#033[00m
Jan 31 03:06:56 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [NOTICE]   (245772) : haproxy version is 2.8.14-c23fe91
Jan 31 03:06:56 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [NOTICE]   (245772) : path to executable is /usr/sbin/haproxy
Jan 31 03:06:56 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [WARNING]  (245772) : Exiting Master process...
Jan 31 03:06:56 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [WARNING]  (245772) : Exiting Master process...
Jan 31 03:06:56 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [ALERT]    (245772) : Current worker (245774) exited with code 143 (Terminated)
Jan 31 03:06:56 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[245768]: [WARNING]  (245772) : All workers exited. Exiting... (0)
Jan 31 03:06:56 np0005603622 systemd[1]: libpod-231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551.scope: Deactivated successfully.
Jan 31 03:06:56 np0005603622 podman[245821]: 2026-01-31 08:06:56.038271237 +0000 UTC m=+0.044576440 container died 231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:06:56 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551-userdata-shm.mount: Deactivated successfully.
Jan 31 03:06:56 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0a159083fec42647f9ba248da5afedb5e195afaf6a65c709d2497eaf5d53d278-merged.mount: Deactivated successfully.
Jan 31 03:06:56 np0005603622 podman[245821]: 2026-01-31 08:06:56.077636876 +0000 UTC m=+0.083942079 container cleanup 231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:06:56 np0005603622 systemd[1]: libpod-conmon-231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551.scope: Deactivated successfully.
Jan 31 03:06:56 np0005603622 podman[245871]: 2026-01-31 08:06:56.13615418 +0000 UTC m=+0.041406815 container remove 231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.140 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c36a16-82c1-4b89-bbf7-b38dc3731131]: (4, ('Sat Jan 31 08:06:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551)\n231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551\nSat Jan 31 08:06:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551)\n231f1597d3d0ee9fcac4597bee67fdf90ea9ca7081a8b4139e14a7e23d4cb551\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.142 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[93b3fdef-4946-4c35-b151-ea3caf49095d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.143 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.146 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:56 np0005603622 kernel: tapc1c6810e-e0: left promiscuous mode
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.151 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.151 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.155 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a4497e37-e9df-4e29-ad45-ac5d8ddbe090]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.174 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[01cba621-7b81-4dc2-a5b0-1d0bf9abc0b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.176 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[72598a2d-e751-432e-b9d1-1c30fd44ec20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.189 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf2087c-04bd-486f-9572-2f005b2212c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606282, 'reachable_time': 25435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245887, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.192 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:06:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:06:56.192 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[e09a54dc-74e4-4649-9acf-65746af175e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:06:56 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc1c6810e\x2dec8f\x2d43f3\x2da3c6\x2d22606d9416b6.mount: Deactivated successfully.
Jan 31 03:06:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:56.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.799 221324 DEBUG nova.compute.manager [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received event network-vif-unplugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.799 221324 DEBUG oslo_concurrency.lockutils [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.799 221324 DEBUG oslo_concurrency.lockutils [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.800 221324 DEBUG oslo_concurrency.lockutils [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.800 221324 DEBUG nova.compute.manager [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] No waiting events found dispatching network-vif-unplugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.800 221324 DEBUG nova.compute.manager [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received event network-vif-unplugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.800 221324 DEBUG nova.compute.manager [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received event network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.800 221324 DEBUG oslo_concurrency.lockutils [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.800 221324 DEBUG oslo_concurrency.lockutils [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.801 221324 DEBUG oslo_concurrency.lockutils [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.801 221324 DEBUG nova.compute.manager [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] No waiting events found dispatching network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:06:56 np0005603622 nova_compute[221301]: 2026-01-31 08:06:56.801 221324 WARNING nova.compute.manager [req-6abe0d5d-551e-4dea-b165-2e039a3d99bd req-113a5c1e-103a-4068-a150-d7482c5b812e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received unexpected event network-vif-plugged-21d4cfd3-5ed0-45c1-9805-ac82d14d631f for instance with vm_state paused and task_state deleting.#033[00m
Jan 31 03:06:57 np0005603622 podman[246060]: 2026-01-31 08:06:57.333346907 +0000 UTC m=+0.084118754 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 03:06:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:57.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:57 np0005603622 podman[246060]: 2026-01-31 08:06:57.456765576 +0000 UTC m=+0.207537403 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 03:06:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:06:58.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:06:59 np0005603622 nova_compute[221301]: 2026-01-31 08:06:59.002 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:06:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:06:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:06:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:06:59.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:00.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:01 np0005603622 nova_compute[221301]: 2026-01-31 08:07:01.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:01.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:03.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:03.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:07:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:07:04 np0005603622 nova_compute[221301]: 2026-01-31 08:07:04.005 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:05.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:05 np0005603622 nova_compute[221301]: 2026-01-31 08:07:05.778 221324 INFO nova.virt.libvirt.driver [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Deleting instance files /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4_del
Jan 31 03:07:05 np0005603622 nova_compute[221301]: 2026-01-31 08:07:05.779 221324 INFO nova.virt.libvirt.driver [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Deletion of /var/lib/nova/instances/e7dde2ad-e161-41d0-b160-733cde37c6a4_del complete
Jan 31 03:07:06 np0005603622 nova_compute[221301]: 2026-01-31 08:07:06.053 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:06 np0005603622 nova_compute[221301]: 2026-01-31 08:07:06.117 221324 INFO nova.compute.manager [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Took 10.57 seconds to destroy the instance on the hypervisor.
Jan 31 03:07:06 np0005603622 nova_compute[221301]: 2026-01-31 08:07:06.118 221324 DEBUG oslo.service.loopingcall [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 31 03:07:06 np0005603622 nova_compute[221301]: 2026-01-31 08:07:06.118 221324 DEBUG nova.compute.manager [-] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 31 03:07:06 np0005603622 nova_compute[221301]: 2026-01-31 08:07:06.119 221324 DEBUG nova.network.neutron [-] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.138 221324 DEBUG nova.network.neutron [-] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:07:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:07.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.174 221324 INFO nova.compute.manager [-] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Took 1.05 seconds to deallocate network for instance.
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.242 221324 DEBUG nova.compute.manager [req-380c1468-1b05-42a5-b959-62e58d5fa13e req-c8998f3b-d476-48c8-a92b-fe09ed003070 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Received event network-vif-deleted-21d4cfd3-5ed0-45c1-9805-ac82d14d631f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.315 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.316 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:07:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:07.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.381 221324 DEBUG oslo_concurrency.processutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:07:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:07 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3080978043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.816 221324 DEBUG oslo_concurrency.processutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.823 221324 DEBUG nova.compute.provider_tree [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.854 221324 DEBUG nova.scheduler.client.report [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.881 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:07:07 np0005603622 nova_compute[221301]: 2026-01-31 08:07:07.937 221324 INFO nova.scheduler.client.report [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Deleted allocations for instance e7dde2ad-e161-41d0-b160-733cde37c6a4
Jan 31 03:07:08 np0005603622 nova_compute[221301]: 2026-01-31 08:07:08.032 221324 DEBUG oslo_concurrency.lockutils [None req-d782c1af-85cc-4b76-a3fb-3b2c571ae09f 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e7dde2ad-e161-41d0-b160-733cde37c6a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.487s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:07:09 np0005603622 nova_compute[221301]: 2026-01-31 08:07:09.007 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:09.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:09.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:10 np0005603622 nova_compute[221301]: 2026-01-31 08:07:10.985 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846815.9834194, e7dde2ad-e161-41d0-b160-733cde37c6a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:07:10 np0005603622 nova_compute[221301]: 2026-01-31 08:07:10.986 221324 INFO nova.compute.manager [-] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] VM Stopped (Lifecycle Event)
Jan 31 03:07:11 np0005603622 nova_compute[221301]: 2026-01-31 08:07:11.010 221324 DEBUG nova.compute.manager [None req-c16dfe6a-6c27-441f-808d-e92498d4f916 - - - - - -] [instance: e7dde2ad-e161-41d0-b160-733cde37c6a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:07:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:11.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:11 np0005603622 nova_compute[221301]: 2026-01-31 08:07:11.192 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:11.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:13.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:13.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:13 np0005603622 nova_compute[221301]: 2026-01-31 08:07:13.466 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "23ffafae-0fb8-48ec-9843-b0127f61e340" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:07:13 np0005603622 nova_compute[221301]: 2026-01-31 08:07:13.467 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:07:13 np0005603622 nova_compute[221301]: 2026-01-31 08:07:13.581 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:07:13 np0005603622 nova_compute[221301]: 2026-01-31 08:07:13.771 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:07:13 np0005603622 nova_compute[221301]: 2026-01-31 08:07:13.772 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:07:13 np0005603622 nova_compute[221301]: 2026-01-31 08:07:13.778 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:07:13 np0005603622 nova_compute[221301]: 2026-01-31 08:07:13.779 221324 INFO nova.compute.claims [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:07:14 np0005603622 nova_compute[221301]: 2026-01-31 08:07:14.046 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:07:14 np0005603622 nova_compute[221301]: 2026-01-31 08:07:14.802 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:07:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:15.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:15 np0005603622 nova_compute[221301]: 2026-01-31 08:07:15.338 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:07:15 np0005603622 nova_compute[221301]: 2026-01-31 08:07:15.344 221324 DEBUG nova.compute.provider_tree [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:07:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:15.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:15 np0005603622 nova_compute[221301]: 2026-01-31 08:07:15.476 221324 DEBUG nova.scheduler.client.report [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:07:15 np0005603622 nova_compute[221301]: 2026-01-31 08:07:15.762 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:07:15 np0005603622 nova_compute[221301]: 2026-01-31 08:07:15.763 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:07:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:15 np0005603622 nova_compute[221301]: 2026-01-31 08:07:15.961 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:07:15 np0005603622 nova_compute[221301]: 2026-01-31 08:07:15.962 221324 DEBUG nova.network.neutron [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.021 221324 INFO nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.362 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.365 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.548 221324 DEBUG nova.policy [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16d731f5875748ca9b8036b2ba061042', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3469c253459e40e39dcf5bcb6a32008f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.606 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.608 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.608 221324 INFO nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Creating image(s)
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.641 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.672 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.701 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.704 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.749 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.751 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.752 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.753 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.783 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:07:16 np0005603622 nova_compute[221301]: 2026-01-31 08:07:16.787 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 23ffafae-0fb8-48ec-9843-b0127f61e340_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.100 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 23ffafae-0fb8-48ec-9843-b0127f61e340_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.312s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:07:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:17.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.181 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] resizing rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.230 221324 DEBUG nova.network.neutron [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Successfully created port: 6e573731-6974-4502-bf29-a75fac4ae37b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.331 221324 DEBUG nova.objects.instance [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'migration_context' on Instance uuid 23ffafae-0fb8-48ec-9843-b0127f61e340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:07:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:07:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:17.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.496 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.497 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Ensure instance console log exists: /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.497 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.498 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:07:17 np0005603622 nova_compute[221301]: 2026-01-31 08:07:17.498 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:07:18 np0005603622 nova_compute[221301]: 2026-01-31 08:07:18.819 221324 DEBUG nova.network.neutron [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Successfully updated port: 6e573731-6974-4502-bf29-a75fac4ae37b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.015 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.015 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquired lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.016 221324 DEBUG nova.network.neutron [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:19.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.187 221324 DEBUG nova.compute.manager [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received event network-changed-6e573731-6974-4502-bf29-a75fac4ae37b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.187 221324 DEBUG nova.compute.manager [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Refreshing instance network info cache due to event network-changed-6e573731-6974-4502-bf29-a75fac4ae37b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:07:19 np0005603622 podman[246575]: 2026-01-31 08:07:19.188474976 +0000 UTC m=+0.095478649 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.188 221324 DEBUG oslo_concurrency.lockutils [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:19.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:19 np0005603622 nova_compute[221301]: 2026-01-31 08:07:19.421 221324 DEBUG nova.network.neutron [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.500 221324 DEBUG nova.network.neutron [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updating instance_info_cache with network_info: [{"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.755 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Releasing lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.756 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance network_info: |[{"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.758 221324 DEBUG oslo_concurrency.lockutils [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.758 221324 DEBUG nova.network.neutron [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Refreshing network info cache for port 6e573731-6974-4502-bf29-a75fac4ae37b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.763 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Start _get_guest_xml network_info=[{"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:07:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.771 221324 WARNING nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.779 221324 DEBUG nova.virt.libvirt.host [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.780 221324 DEBUG nova.virt.libvirt.host [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.785 221324 DEBUG nova.virt.libvirt.host [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.786 221324 DEBUG nova.virt.libvirt.host [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.788 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.789 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.790 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.790 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.790 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.791 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.791 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.792 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.793 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.793 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.794 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.794 221324 DEBUG nova.virt.hardware [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:07:20 np0005603622 nova_compute[221301]: 2026-01-31 08:07:20.799 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:21.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/35694754' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.227 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.261 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.266 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:21.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2108471743' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.673 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.676 221324 DEBUG nova.virt.libvirt.vif [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-5961696',display_name='tempest-DeleteServersTestJSON-server-5961696',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-5961696',id=69,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-e3ymdfq3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:16Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=23ffafae-0fb8-48ec-9843-b0127f61e340,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.676 221324 DEBUG nova.network.os_vif_util [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.678 221324 DEBUG nova.network.os_vif_util [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:b7:59,bridge_name='br-int',has_traffic_filtering=True,id=6e573731-6974-4502-bf29-a75fac4ae37b,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e573731-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.680 221324 DEBUG nova.objects.instance [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'pci_devices' on Instance uuid 23ffafae-0fb8-48ec-9843-b0127f61e340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.901 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <uuid>23ffafae-0fb8-48ec-9843-b0127f61e340</uuid>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <name>instance-00000045</name>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <nova:name>tempest-DeleteServersTestJSON-server-5961696</nova:name>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:07:20</nova:creationTime>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:user uuid="16d731f5875748ca9b8036b2ba061042">tempest-DeleteServersTestJSON-808715310-project-member</nova:user>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:project uuid="3469c253459e40e39dcf5bcb6a32008f">tempest-DeleteServersTestJSON-808715310</nova:project>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <nova:port uuid="6e573731-6974-4502-bf29-a75fac4ae37b">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <entry name="serial">23ffafae-0fb8-48ec-9843-b0127f61e340</entry>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <entry name="uuid">23ffafae-0fb8-48ec-9843-b0127f61e340</entry>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/23ffafae-0fb8-48ec-9843-b0127f61e340_disk">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/23ffafae-0fb8-48ec-9843-b0127f61e340_disk.config">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:05:b7:59"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <target dev="tap6e573731-69"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/console.log" append="off"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:07:21 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:07:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:07:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:07:21 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.902 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Preparing to wait for external event network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.903 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.903 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.904 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.905 221324 DEBUG nova.virt.libvirt.vif [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-5961696',display_name='tempest-DeleteServersTestJSON-server-5961696',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-5961696',id=69,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-e3ymdfq3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:16Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=23ffafae-0fb8-48ec-9843-b0127f61e340,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.906 221324 DEBUG nova.network.os_vif_util [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.907 221324 DEBUG nova.network.os_vif_util [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:b7:59,bridge_name='br-int',has_traffic_filtering=True,id=6e573731-6974-4502-bf29-a75fac4ae37b,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e573731-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.907 221324 DEBUG os_vif [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:b7:59,bridge_name='br-int',has_traffic_filtering=True,id=6e573731-6974-4502-bf29-a75fac4ae37b,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e573731-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.908 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.909 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.910 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.915 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.915 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e573731-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.916 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e573731-69, col_values=(('external_ids', {'iface-id': '6e573731-6974-4502-bf29-a75fac4ae37b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:b7:59', 'vm-uuid': '23ffafae-0fb8-48ec-9843-b0127f61e340'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.918 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:21 np0005603622 NetworkManager[49080]: <info>  [1769846841.9198] manager: (tap6e573731-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.921 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.926 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:21 np0005603622 nova_compute[221301]: 2026-01-31 08:07:21.927 221324 INFO os_vif [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:b7:59,bridge_name='br-int',has_traffic_filtering=True,id=6e573731-6974-4502-bf29-a75fac4ae37b,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e573731-69')#033[00m
Jan 31 03:07:22 np0005603622 podman[246667]: 2026-01-31 08:07:22.060449293 +0000 UTC m=+0.089903329 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:07:22 np0005603622 nova_compute[221301]: 2026-01-31 08:07:22.214 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:22 np0005603622 nova_compute[221301]: 2026-01-31 08:07:22.215 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:22 np0005603622 nova_compute[221301]: 2026-01-31 08:07:22.215 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No VIF found with MAC fa:16:3e:05:b7:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:07:22 np0005603622 nova_compute[221301]: 2026-01-31 08:07:22.215 221324 INFO nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Using config drive#033[00m
Jan 31 03:07:22 np0005603622 nova_compute[221301]: 2026-01-31 08:07:22.235 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:23.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.216 221324 INFO nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Creating config drive at /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/disk.config#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.221 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpd087tao4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.351 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpd087tao4" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.390 221324 DEBUG nova.storage.rbd_utils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] rbd image 23ffafae-0fb8-48ec-9843-b0127f61e340_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.395 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/disk.config 23ffafae-0fb8-48ec-9843-b0127f61e340_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:23.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.423 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.447 221324 DEBUG nova.network.neutron [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updated VIF entry in instance network info cache for port 6e573731-6974-4502-bf29-a75fac4ae37b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.448 221324 DEBUG nova.network.neutron [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updating instance_info_cache with network_info: [{"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.532 221324 DEBUG oslo_concurrency.lockutils [req-a9556d14-f580-45a2-9b0a-fe6b114b5643 req-805f935c-bd17-45fb-bee3-74af8481ca89 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.571 221324 DEBUG oslo_concurrency.processutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/disk.config 23ffafae-0fb8-48ec-9843-b0127f61e340_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.177s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.572 221324 INFO nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Deleting local config drive /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340/disk.config because it was imported into RBD.#033[00m
Jan 31 03:07:23 np0005603622 kernel: tap6e573731-69: entered promiscuous mode
Jan 31 03:07:23 np0005603622 NetworkManager[49080]: <info>  [1769846843.6226] manager: (tap6e573731-69): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:23Z|00222|binding|INFO|Claiming lport 6e573731-6974-4502-bf29-a75fac4ae37b for this chassis.
Jan 31 03:07:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:23Z|00223|binding|INFO|6e573731-6974-4502-bf29-a75fac4ae37b: Claiming fa:16:3e:05:b7:59 10.100.0.6
Jan 31 03:07:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:23Z|00224|binding|INFO|Setting lport 6e573731-6974-4502-bf29-a75fac4ae37b ovn-installed in OVS
Jan 31 03:07:23 np0005603622 nova_compute[221301]: 2026-01-31 08:07:23.632 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:23 np0005603622 systemd-machined[190406]: New machine qemu-29-instance-00000045.
Jan 31 03:07:23 np0005603622 systemd[1]: Started Virtual Machine qemu-29-instance-00000045.
Jan 31 03:07:23 np0005603622 systemd-udevd[246757]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:07:23 np0005603622 NetworkManager[49080]: <info>  [1769846843.6851] device (tap6e573731-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:07:23 np0005603622 NetworkManager[49080]: <info>  [1769846843.6870] device (tap6e573731-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.771 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:b7:59 10.100.0.6'], port_security=['fa:16:3e:05:b7:59 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '23ffafae-0fb8-48ec-9843-b0127f61e340', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e573731-6974-4502-bf29-a75fac4ae37b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:23Z|00225|binding|INFO|Setting lport 6e573731-6974-4502-bf29-a75fac4ae37b up in Southbound
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.774 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e573731-6974-4502-bf29-a75fac4ae37b in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 bound to our chassis#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.775 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.786 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2d257c-bd10-4408-8c08-ef43be9a7692]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.786 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1c6810e-e1 in ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.790 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1c6810e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.790 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aabc9bde-9ce2-454e-98cc-bed64af1f445]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.791 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[55787a3e-6a36-4b95-aba9-26c89734fa38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.804 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c4bd783d-f3c1-4579-94dc-26052e774c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.831 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74eaeedc-1b74-46ca-8ac3-905c4c13ead0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.860 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e52da044-09ca-4c98-ab7b-c5c89b7d3063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.865 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[923d5ef9-e24d-4caa-97dd-644c24d818d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 NetworkManager[49080]: <info>  [1769846843.8673] manager: (tapc1c6810e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Jan 31 03:07:23 np0005603622 systemd-udevd[246760]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.900 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8c175424-40e7-4ae0-914e-3976e10fd9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.904 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd497a4-d4b8-4d2b-8ff7-29a77f7f0238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 NetworkManager[49080]: <info>  [1769846843.9253] device (tapc1c6810e-e0): carrier: link connected
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.931 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[46a1f0c6-dc79-45e1-80bf-096c181ea85f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.949 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fb210eb7-7510-4d74-a9c7-a71130cdb8cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609526, 'reachable_time': 31787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246790, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.963 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[34d80f65-631e-4440-badf-0cce77403ab9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:9781'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609526, 'tstamp': 609526}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246791, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.977 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[802b1e68-c188-4287-a24d-c3a2ea3de600]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609526, 'reachable_time': 31787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246792, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:23.998 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cd72da08-85fd-426b-97dc-1e104688847d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.042 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9b94b8aa-27d9-4926-a94b-01bf289307f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.043 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.044 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.044 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1c6810e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:24 np0005603622 kernel: tapc1c6810e-e0: entered promiscuous mode
Jan 31 03:07:24 np0005603622 NetworkManager[49080]: <info>  [1769846844.0876] manager: (tapc1c6810e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.086 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.089 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.090 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1c6810e-e0, col_values=(('external_ids', {'iface-id': '937542c1-ab1e-4312-ab3a-ee4483fcdf7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:24Z|00226|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.101 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.102 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.104 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4a005f-8fe9-420e-b7e3-297c76ed843c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.105 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:07:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:24.106 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'env', 'PROCESS_TAG=haproxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:07:24 np0005603622 podman[246839]: 2026-01-31 08:07:24.432876376 +0000 UTC m=+0.052416331 container create a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:07:24 np0005603622 systemd[1]: Started libpod-conmon-a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2.scope.
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.497 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846844.4974442, 23ffafae-0fb8-48ec-9843-b0127f61e340 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.498 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] VM Started (Lifecycle Event)#033[00m
Jan 31 03:07:24 np0005603622 podman[246839]: 2026-01-31 08:07:24.404572505 +0000 UTC m=+0.024112450 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:07:24 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:07:24 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34696c7e45fc2649be2d6155f182543cb76b292f58daf4db9b64233ecbcc6690/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:07:24 np0005603622 podman[246839]: 2026-01-31 08:07:24.526330089 +0000 UTC m=+0.145870054 container init a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:07:24 np0005603622 podman[246839]: 2026-01-31 08:07:24.531973871 +0000 UTC m=+0.151513806 container start a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:07:24 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[246879]: [NOTICE]   (246883) : New worker (246885) forked
Jan 31 03:07:24 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[246879]: [NOTICE]   (246883) : Loading success.
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.571 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.575 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846844.500815, 23ffafae-0fb8-48ec-9843-b0127f61e340 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.575 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.602 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.605 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:07:24 np0005603622 nova_compute[221301]: 2026-01-31 08:07:24.629 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:07:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:25.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.491 221324 DEBUG nova.compute.manager [req-0b607210-dffc-4eb1-9f40-24f28ea5f9d3 req-748acce9-daf1-4866-8454-36f95cbf9d02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received event network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.492 221324 DEBUG oslo_concurrency.lockutils [req-0b607210-dffc-4eb1-9f40-24f28ea5f9d3 req-748acce9-daf1-4866-8454-36f95cbf9d02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.492 221324 DEBUG oslo_concurrency.lockutils [req-0b607210-dffc-4eb1-9f40-24f28ea5f9d3 req-748acce9-daf1-4866-8454-36f95cbf9d02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.492 221324 DEBUG oslo_concurrency.lockutils [req-0b607210-dffc-4eb1-9f40-24f28ea5f9d3 req-748acce9-daf1-4866-8454-36f95cbf9d02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.493 221324 DEBUG nova.compute.manager [req-0b607210-dffc-4eb1-9f40-24f28ea5f9d3 req-748acce9-daf1-4866-8454-36f95cbf9d02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Processing event network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.494 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.498 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846846.4986987, 23ffafae-0fb8-48ec-9843-b0127f61e340 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.499 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.502 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.506 221324 INFO nova.virt.libvirt.driver [-] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance spawned successfully.#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.506 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.526 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.531 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.539 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.539 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.539 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.540 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.540 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.541 221324 DEBUG nova.virt.libvirt.driver [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.577 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.645 221324 INFO nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Took 10.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.646 221324 DEBUG nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.753 221324 INFO nova.compute.manager [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Took 13.01 seconds to build instance.#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.772 221324 DEBUG oslo_concurrency.lockutils [None req-e55a25f9-be17-483c-9dac-9ba15315bf69 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:26 np0005603622 nova_compute[221301]: 2026-01-31 08:07:26.919 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:27.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:27.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.615 221324 DEBUG nova.compute.manager [req-5122a7ea-323b-4f59-8943-9e76d4e8dcca req-950e8531-2ed2-4033-8ab9-8ec5810c7e1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received event network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.615 221324 DEBUG oslo_concurrency.lockutils [req-5122a7ea-323b-4f59-8943-9e76d4e8dcca req-950e8531-2ed2-4033-8ab9-8ec5810c7e1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.615 221324 DEBUG oslo_concurrency.lockutils [req-5122a7ea-323b-4f59-8943-9e76d4e8dcca req-950e8531-2ed2-4033-8ab9-8ec5810c7e1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.616 221324 DEBUG oslo_concurrency.lockutils [req-5122a7ea-323b-4f59-8943-9e76d4e8dcca req-950e8531-2ed2-4033-8ab9-8ec5810c7e1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.616 221324 DEBUG nova.compute.manager [req-5122a7ea-323b-4f59-8943-9e76d4e8dcca req-950e8531-2ed2-4033-8ab9-8ec5810c7e1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] No waiting events found dispatching network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.616 221324 WARNING nova.compute.manager [req-5122a7ea-323b-4f59-8943-9e76d4e8dcca req-950e8531-2ed2-4033-8ab9-8ec5810c7e1a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received unexpected event network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b for instance with vm_state active and task_state shelving.#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.742 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "23ffafae-0fb8-48ec-9843-b0127f61e340" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.743 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.743 221324 INFO nova.compute.manager [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Shelving#033[00m
Jan 31 03:07:28 np0005603622 nova_compute[221301]: 2026-01-31 08:07:28.816 221324 DEBUG nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.136 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:29.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:29.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.443 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.444 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.444 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.444 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.445 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/264318912' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:29 np0005603622 nova_compute[221301]: 2026-01-31 08:07:29.883 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.077 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000045 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.077 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000045 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.228 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.229 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4487MB free_disk=20.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.230 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.230 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.312 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 23ffafae-0fb8-48ec-9843-b0127f61e340 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.313 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.313 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.354 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/69967653' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.816 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.823 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:07:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:30.861 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:30.863 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:30.863 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.869 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.934 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:07:30 np0005603622 nova_compute[221301]: 2026-01-31 08:07:30.935 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:31.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:31.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:31 np0005603622 nova_compute[221301]: 2026-01-31 08:07:31.921 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:31 np0005603622 nova_compute[221301]: 2026-01-31 08:07:31.936 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:33.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:33 np0005603622 nova_compute[221301]: 2026-01-31 08:07:33.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:07:33 np0005603622 nova_compute[221301]: 2026-01-31 08:07:33.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:07:33 np0005603622 nova_compute[221301]: 2026-01-31 08:07:33.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:07:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:33.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:33 np0005603622 nova_compute[221301]: 2026-01-31 08:07:33.474 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:33 np0005603622 nova_compute[221301]: 2026-01-31 08:07:33.474 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:33 np0005603622 nova_compute[221301]: 2026-01-31 08:07:33.474 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:07:33 np0005603622 nova_compute[221301]: 2026-01-31 08:07:33.475 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23ffafae-0fb8-48ec-9843-b0127f61e340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:34 np0005603622 nova_compute[221301]: 2026-01-31 08:07:34.191 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:35.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:35 np0005603622 nova_compute[221301]: 2026-01-31 08:07:35.576 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updating instance_info_cache with network_info: [{"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:35 np0005603622 nova_compute[221301]: 2026-01-31 08:07:35.593 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:35 np0005603622 nova_compute[221301]: 2026-01-31 08:07:35.594 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:07:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:36 np0005603622 nova_compute[221301]: 2026-01-31 08:07:36.923 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:37.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:37.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:38 np0005603622 nova_compute[221301]: 2026-01-31 08:07:38.876 221324 DEBUG nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:07:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:39.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:39 np0005603622 nova_compute[221301]: 2026-01-31 08:07:39.251 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:39.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:40.201 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:40 np0005603622 nova_compute[221301]: 2026-01-31 08:07:40.202 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:40.204 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:07:40 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:40Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:b7:59 10.100.0.6
Jan 31 03:07:40 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:40Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:b7:59 10.100.0.6
Jan 31 03:07:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:41.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:41.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:41 np0005603622 nova_compute[221301]: 2026-01-31 08:07:41.926 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:42.207 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:43.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:43.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.449630) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863449694, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2150, "num_deletes": 265, "total_data_size": 4650182, "memory_usage": 4746384, "flush_reason": "Manual Compaction"}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863468943, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3051600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36695, "largest_seqno": 38840, "table_properties": {"data_size": 3042963, "index_size": 5259, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 18871, "raw_average_key_size": 20, "raw_value_size": 3025228, "raw_average_value_size": 3302, "num_data_blocks": 228, "num_entries": 916, "num_filter_entries": 916, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846700, "oldest_key_time": 1769846700, "file_creation_time": 1769846863, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 19400 microseconds, and 9688 cpu microseconds.
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.469027) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3051600 bytes OK
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.469055) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.470602) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.470621) EVENT_LOG_v1 {"time_micros": 1769846863470615, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.470646) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 4640579, prev total WAL file size 4640579, number of live WAL files 2.
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.471699) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303034' seq:72057594037927935, type:22 .. '6C6F676D0031323536' seq:0, type:0; will stop at (end)
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(2980KB)], [69(8654KB)]
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863471817, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 11913690, "oldest_snapshot_seqno": -1}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6438 keys, 11756709 bytes, temperature: kUnknown
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863532945, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11756709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11710876, "index_size": 28635, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16133, "raw_key_size": 164809, "raw_average_key_size": 25, "raw_value_size": 11592812, "raw_average_value_size": 1800, "num_data_blocks": 1152, "num_entries": 6438, "num_filter_entries": 6438, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846863, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.533384) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11756709 bytes
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.534960) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.2 rd, 191.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 8.5 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(7.8) write-amplify(3.9) OK, records in: 6982, records dropped: 544 output_compression: NoCompression
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.534976) EVENT_LOG_v1 {"time_micros": 1769846863534968, "job": 42, "event": "compaction_finished", "compaction_time_micros": 61348, "compaction_time_cpu_micros": 28527, "output_level": 6, "num_output_files": 1, "total_output_size": 11756709, "num_input_records": 6982, "num_output_records": 6438, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863535377, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846863536228, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.471527) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.536558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.536576) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.536579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.536583) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:07:43.536586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:07:44 np0005603622 nova_compute[221301]: 2026-01-31 08:07:44.254 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:45.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:45.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:46 np0005603622 nova_compute[221301]: 2026-01-31 08:07:46.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:47.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:47.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:48 np0005603622 nova_compute[221301]: 2026-01-31 08:07:48.988 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:48 np0005603622 nova_compute[221301]: 2026-01-31 08:07:48.988 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.011 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.019 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.019 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.070 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.112 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.113 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.121 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.122 221324 INFO nova.compute.claims [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.175 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:49.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.257 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.275 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:49.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/90591540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.748 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.754 221324 DEBUG nova.compute.provider_tree [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.779 221324 DEBUG nova.scheduler.client.report [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.816 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.817 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.819 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.830 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.830 221324 INFO nova.compute.claims [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.931 221324 DEBUG nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.968 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.969 221324 DEBUG nova.network.neutron [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:07:49 np0005603622 nova_compute[221301]: 2026-01-31 08:07:49.993 221324 INFO nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.031 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.070 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.168 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.170 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.170 221324 INFO nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Creating image(s)#033[00m
Jan 31 03:07:50 np0005603622 podman[246961]: 2026-01-31 08:07:50.174316434 +0000 UTC m=+0.086368363 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.204 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.244 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.275 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.280 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.305 221324 DEBUG nova.policy [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.355 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.356 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.357 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.357 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.386 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.391 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:07:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/93348330' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.545 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.550 221324 DEBUG nova.compute.provider_tree [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.568 221324 DEBUG nova.scheduler.client.report [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.602 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.604 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.665 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.666 221324 DEBUG nova.network.neutron [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.690 221324 INFO nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.738 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:07:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.915 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.917 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.918 221324 INFO nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Creating image(s)#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.951 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:50 np0005603622 nova_compute[221301]: 2026-01-31 08:07:50.984 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.023 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.027 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.053 221324 DEBUG nova.policy [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31a158d811524ad292199b931be12bc7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '64a52f6edb6c4843a8decc313f8c10e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.101 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.102 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.103 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.104 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.150 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.156 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:51.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.350 221324 DEBUG nova.network.neutron [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Successfully created port: 1a414f6e-3462-4511-a204-1326ac6d6fcd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:07:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:07:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:51.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.479 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.578 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] resizing rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.931 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:51 np0005603622 nova_compute[221301]: 2026-01-31 08:07:51.963 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.807s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.112 221324 DEBUG nova.objects.instance [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 9f1f7c19-2aa2-4663-81ca-a26eaa106441 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.121 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] resizing rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.164 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.165 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Ensure instance console log exists: /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.166 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.167 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.167 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.262 221324 DEBUG nova.objects.instance [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lazy-loading 'migration_context' on Instance uuid c131ca84-4259-49e0-a3d5-a4a5ca629d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.281 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.281 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Ensure instance console log exists: /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.282 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.282 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.283 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:52 np0005603622 kernel: tap6e573731-69 (unregistering): left promiscuous mode
Jan 31 03:07:52 np0005603622 NetworkManager[49080]: <info>  [1769846872.3481] device (tap6e573731-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:07:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:52Z|00227|binding|INFO|Releasing lport 6e573731-6974-4502-bf29-a75fac4ae37b from this chassis (sb_readonly=0)
Jan 31 03:07:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:52Z|00228|binding|INFO|Setting lport 6e573731-6974-4502-bf29-a75fac4ae37b down in Southbound
Jan 31 03:07:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:52Z|00229|binding|INFO|Removing iface tap6e573731-69 ovn-installed in OVS
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.353 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.362 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:b7:59 10.100.0.6'], port_security=['fa:16:3e:05:b7:59 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '23ffafae-0fb8-48ec-9843-b0127f61e340', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e573731-6974-4502-bf29-a75fac4ae37b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.364 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e573731-6974-4502-bf29-a75fac4ae37b in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 unbound from our chassis#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.366 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.367 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.368 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2c6787-4225-4dd2-b2cf-9a356cc8bc43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.370 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace which is not needed anymore#033[00m
Jan 31 03:07:52 np0005603622 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000045.scope: Deactivated successfully.
Jan 31 03:07:52 np0005603622 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000045.scope: Consumed 14.812s CPU time.
Jan 31 03:07:52 np0005603622 systemd-machined[190406]: Machine qemu-29-instance-00000045 terminated.
Jan 31 03:07:52 np0005603622 podman[247343]: 2026-01-31 08:07:52.429180315 +0000 UTC m=+0.056236923 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:07:52 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[246879]: [NOTICE]   (246883) : haproxy version is 2.8.14-c23fe91
Jan 31 03:07:52 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[246879]: [NOTICE]   (246883) : path to executable is /usr/sbin/haproxy
Jan 31 03:07:52 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[246879]: [WARNING]  (246883) : Exiting Master process...
Jan 31 03:07:52 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[246879]: [ALERT]    (246883) : Current worker (246885) exited with code 143 (Terminated)
Jan 31 03:07:52 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[246879]: [WARNING]  (246883) : All workers exited. Exiting... (0)
Jan 31 03:07:52 np0005603622 systemd[1]: libpod-a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2.scope: Deactivated successfully.
Jan 31 03:07:52 np0005603622 podman[247384]: 2026-01-31 08:07:52.488176642 +0000 UTC m=+0.046020979 container died a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 03:07:52 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2-userdata-shm.mount: Deactivated successfully.
Jan 31 03:07:52 np0005603622 systemd[1]: var-lib-containers-storage-overlay-34696c7e45fc2649be2d6155f182543cb76b292f58daf4db9b64233ecbcc6690-merged.mount: Deactivated successfully.
Jan 31 03:07:52 np0005603622 podman[247384]: 2026-01-31 08:07:52.528422054 +0000 UTC m=+0.086266361 container cleanup a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:07:52 np0005603622 systemd[1]: libpod-conmon-a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2.scope: Deactivated successfully.
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.577 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.581 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:52 np0005603622 podman[247416]: 2026-01-31 08:07:52.600803461 +0000 UTC m=+0.054054215 container remove a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.606 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[69671549-ec8c-4a1c-adae-64457f511239]: (4, ('Sat Jan 31 08:07:52 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2)\na1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2\nSat Jan 31 08:07:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (a1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2)\na1724024f714ad2eb5af43698a3b04789b216eead578c5d3152e07978911cdc2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.608 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[52a9b135-4fc1-4e4c-82e2-e9235e9485da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.609 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.611 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:52 np0005603622 kernel: tapc1c6810e-e0: left promiscuous mode
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.622 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.625 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[86b3c850-977d-4ecd-9a68-45fdad47f068]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.637 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[330ec96d-1c7a-471f-a576-eb20e447f9ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.638 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5c4b11-1da3-442a-acc2-8002e633821a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.652 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4a192610-fd8e-44ee-bdc3-2cf182802e0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609519, 'reachable_time': 31248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247445, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc1c6810e\x2dec8f\x2d43f3\x2da3c6\x2d22606d9416b6.mount: Deactivated successfully.
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.656 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:07:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:52.656 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[702b0885-e329-43df-ba65-c4e0b36beb82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.659 221324 DEBUG nova.compute.manager [req-b82e4895-2eff-4f76-b35b-d2f603986c60 req-d350ba91-e43e-4e32-b091-12afeb147a57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received event network-vif-unplugged-6e573731-6974-4502-bf29-a75fac4ae37b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.660 221324 DEBUG oslo_concurrency.lockutils [req-b82e4895-2eff-4f76-b35b-d2f603986c60 req-d350ba91-e43e-4e32-b091-12afeb147a57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.660 221324 DEBUG oslo_concurrency.lockutils [req-b82e4895-2eff-4f76-b35b-d2f603986c60 req-d350ba91-e43e-4e32-b091-12afeb147a57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.660 221324 DEBUG oslo_concurrency.lockutils [req-b82e4895-2eff-4f76-b35b-d2f603986c60 req-d350ba91-e43e-4e32-b091-12afeb147a57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.660 221324 DEBUG nova.compute.manager [req-b82e4895-2eff-4f76-b35b-d2f603986c60 req-d350ba91-e43e-4e32-b091-12afeb147a57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] No waiting events found dispatching network-vif-unplugged-6e573731-6974-4502-bf29-a75fac4ae37b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.660 221324 WARNING nova.compute.manager [req-b82e4895-2eff-4f76-b35b-d2f603986c60 req-d350ba91-e43e-4e32-b091-12afeb147a57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received unexpected event network-vif-unplugged-6e573731-6974-4502-bf29-a75fac4ae37b for instance with vm_state active and task_state shelving.#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.788 221324 DEBUG nova.network.neutron [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Successfully created port: bc46ca84-347f-4b48-a09d-4742bd167dc7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.957 221324 DEBUG nova.network.neutron [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Successfully updated port: 1a414f6e-3462-4511-a204-1326ac6d6fcd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.984 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.984 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:52 np0005603622 nova_compute[221301]: 2026-01-31 08:07:52.984 221324 DEBUG nova.network.neutron [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.171 221324 INFO nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance shutdown successfully after 24 seconds.#033[00m
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.176 221324 INFO nova.virt.libvirt.driver [-] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance destroyed successfully.#033[00m
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.177 221324 DEBUG nova.objects.instance [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'numa_topology' on Instance uuid 23ffafae-0fb8-48ec-9843-b0127f61e340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.194 221324 DEBUG nova.network.neutron [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:07:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:53.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.423 221324 DEBUG nova.compute.manager [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.424 221324 DEBUG nova.compute.manager [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing instance network info cache due to event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.424 221324 DEBUG oslo_concurrency.lockutils [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:53.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.784 221324 INFO nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Beginning cold snapshot process#033[00m
Jan 31 03:07:53 np0005603622 nova_compute[221301]: 2026-01-31 08:07:53.986 221324 DEBUG nova.virt.libvirt.imagebackend [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.263 221324 DEBUG nova.storage.rbd_utils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] creating snapshot(389890404ae2421ca12f5240a7ab47fe) on rbd image(23ffafae-0fb8-48ec-9843-b0127f61e340_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.304 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.343 221324 DEBUG nova.network.neutron [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Successfully updated port: bc46ca84-347f-4b48-a09d-4742bd167dc7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.406 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.407 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquired lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.407 221324 DEBUG nova.network.neutron [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.768 221324 DEBUG nova.compute.manager [req-8629b415-cd6b-4433-946d-9900e292518e req-08f0371c-87fa-4544-93f8-92b51dccab47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received event network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.768 221324 DEBUG oslo_concurrency.lockutils [req-8629b415-cd6b-4433-946d-9900e292518e req-08f0371c-87fa-4544-93f8-92b51dccab47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.768 221324 DEBUG oslo_concurrency.lockutils [req-8629b415-cd6b-4433-946d-9900e292518e req-08f0371c-87fa-4544-93f8-92b51dccab47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.768 221324 DEBUG oslo_concurrency.lockutils [req-8629b415-cd6b-4433-946d-9900e292518e req-08f0371c-87fa-4544-93f8-92b51dccab47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.769 221324 DEBUG nova.compute.manager [req-8629b415-cd6b-4433-946d-9900e292518e req-08f0371c-87fa-4544-93f8-92b51dccab47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] No waiting events found dispatching network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.769 221324 WARNING nova.compute.manager [req-8629b415-cd6b-4433-946d-9900e292518e req-08f0371c-87fa-4544-93f8-92b51dccab47 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received unexpected event network-vif-plugged-6e573731-6974-4502-bf29-a75fac4ae37b for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 31 03:07:54 np0005603622 nova_compute[221301]: 2026-01-31 08:07:54.829 221324 DEBUG nova.network.neutron [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:07:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e248 e248: 3 total, 3 up, 3 in
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.053 221324 DEBUG nova.storage.rbd_utils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] cloning vms/23ffafae-0fb8-48ec-9843-b0127f61e340_disk@389890404ae2421ca12f5240a7ab47fe to images/7b781d58-222d-48fb-b75e-dda36a2cb764 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.096 221324 DEBUG nova.network.neutron [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.134 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.135 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Instance network_info: |[{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.135 221324 DEBUG oslo_concurrency.lockutils [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.136 221324 DEBUG nova.network.neutron [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.139 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Start _get_guest_xml network_info=[{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.146 221324 WARNING nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.152 221324 DEBUG nova.virt.libvirt.host [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.153 221324 DEBUG nova.virt.libvirt.host [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.157 221324 DEBUG nova.virt.libvirt.host [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.157 221324 DEBUG nova.virt.libvirt.host [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.159 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.159 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.160 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.160 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.160 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.160 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.161 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.161 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.161 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.161 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.162 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.162 221324 DEBUG nova.virt.hardware [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.165 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:55.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.234 221324 DEBUG nova.storage.rbd_utils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] flattening images/7b781d58-222d-48fb-b75e-dda36a2cb764 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:07:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:07:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:55.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.517 221324 DEBUG nova.compute.manager [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.519 221324 DEBUG nova.compute.manager [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing instance network info cache due to event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.520 221324 DEBUG oslo_concurrency.lockutils [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:07:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4273847047' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.635 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.675 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.681 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:07:55 np0005603622 nova_compute[221301]: 2026-01-31 08:07:55.806 221324 DEBUG nova.storage.rbd_utils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] removing snapshot(389890404ae2421ca12f5240a7ab47fe) on rbd image(23ffafae-0fb8-48ec-9843-b0127f61e340_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:07:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e249 e249: 3 total, 3 up, 3 in
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.080 221324 DEBUG nova.network.neutron [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.115 221324 DEBUG nova.storage.rbd_utils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] creating snapshot(snap) on rbd image(7b781d58-222d-48fb-b75e-dda36a2cb764) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.164 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Releasing lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.165 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Instance network_info: |[{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.166 221324 DEBUG oslo_concurrency.lockutils [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.166 221324 DEBUG nova.network.neutron [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.170 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Start _get_guest_xml network_info=[{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.176 221324 WARNING nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.182 221324 DEBUG nova.virt.libvirt.host [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.183 221324 DEBUG nova.virt.libvirt.host [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.186 221324 DEBUG nova.virt.libvirt.host [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.186 221324 DEBUG nova.virt.libvirt.host [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.187 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.187 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.188 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.188 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.188 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.189 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.189 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.189 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.190 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.190 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.190 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:07:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.190 221324 DEBUG nova.virt.hardware [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:07:56 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2144928470' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.193 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.219 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.221 221324 DEBUG nova.virt.libvirt.vif [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.221 221324 DEBUG nova.network.os_vif_util [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.222 221324 DEBUG nova.network.os_vif_util [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=1a414f6e-3462-4511-a204-1326ac6d6fcd,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a414f6e-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.223 221324 DEBUG nova.objects.instance [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9f1f7c19-2aa2-4663-81ca-a26eaa106441 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.241 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <uuid>9f1f7c19-2aa2-4663-81ca-a26eaa106441</uuid>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <name>instance-00000048</name>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <nova:name>tempest-tempest.common.compute-instance-576814234</nova:name>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:07:55</nova:creationTime>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <nova:port uuid="1a414f6e-3462-4511-a204-1326ac6d6fcd">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <entry name="serial">9f1f7c19-2aa2-4663-81ca-a26eaa106441</entry>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <entry name="uuid">9f1f7c19-2aa2-4663-81ca-a26eaa106441</entry>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:83:fa:32"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <target dev="tap1a414f6e-34"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/console.log" append="off"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:07:56 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:07:56 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:07:56 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:07:56 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.242 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Preparing to wait for external event network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.242 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.243 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.243 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.244 221324 DEBUG nova.virt.libvirt.vif [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.244 221324 DEBUG nova.network.os_vif_util [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.244 221324 DEBUG nova.network.os_vif_util [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=1a414f6e-3462-4511-a204-1326ac6d6fcd,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a414f6e-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.245 221324 DEBUG os_vif [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=1a414f6e-3462-4511-a204-1326ac6d6fcd,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a414f6e-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.245 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.246 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.246 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.251 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a414f6e-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.252 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1a414f6e-34, col_values=(('external_ids', {'iface-id': '1a414f6e-3462-4511-a204-1326ac6d6fcd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:fa:32', 'vm-uuid': '9f1f7c19-2aa2-4663-81ca-a26eaa106441'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:56 np0005603622 NetworkManager[49080]: <info>  [1769846876.2959] manager: (tap1a414f6e-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.301 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.304 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.305 221324 INFO os_vif [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=1a414f6e-3462-4511-a204-1326ac6d6fcd,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a414f6e-34')#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.397 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.397 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.398 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:83:fa:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.398 221324 INFO nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Using config drive#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.429 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:56 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4034060347' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.674 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.711 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.718 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.979 221324 INFO nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Creating config drive at /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/disk.config#033[00m
Jan 31 03:07:56 np0005603622 nova_compute[221301]: 2026-01-31 08:07:56.986 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzbzy4fop execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.029 221324 DEBUG nova.network.neutron [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updated VIF entry in instance network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.031 221324 DEBUG nova.network.neutron [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.063 221324 DEBUG oslo_concurrency.lockutils [req-ce090ad4-795e-4d55-be33-d931f346e778 req-2ecf8149-0550-4932-8ba4-12e1a5e9dc98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.120 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzbzy4fop" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.158 221324 DEBUG nova.storage.rbd_utils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] rbd image 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.163 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/disk.config 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:07:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1526605689' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:07:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:57.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e250 e250: 3 total, 3 up, 3 in
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.274 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.277 221324 DEBUG nova.virt.libvirt.vif [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-239311118',display_name='tempest-AttachInterfacesUnderV243Test-server-239311118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-239311118',id=73,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKuImM2D3fNGJMsdqkErwFBaEJ04l9kGVA/cSUA+JbgAY3g1q1pguY3j8BOfvSSgfzBZsiL0lJL3gsjWJYvs7KdxSwVg5XZVPK7b50FIczK25ThQp4VMPwH5/FpuX6vz9A==',key_name='tempest-keypair-621715770',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64a52f6edb6c4843a8decc313f8c10e3',ramdisk_id='',reservation_id='r-92w1prj7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-902546950',owner_user_name='tempest-AttachInterfacesUnderV243Test-902546950-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='31a158d811524ad292199b931be12bc7',uuid=c131ca84-4259-49e0-a3d5-a4a5ca629d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.278 221324 DEBUG nova.network.os_vif_util [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Converting VIF {"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.280 221324 DEBUG nova.network.os_vif_util [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:49:ed,bridge_name='br-int',has_traffic_filtering=True,id=bc46ca84-347f-4b48-a09d-4742bd167dc7,network=Network(09a26665-5963-47b2-a3e4-106214bb83c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc46ca84-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.282 221324 DEBUG nova.objects.instance [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid c131ca84-4259-49e0-a3d5-a4a5ca629d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.317 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <uuid>c131ca84-4259-49e0-a3d5-a4a5ca629d31</uuid>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <name>instance-00000049</name>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachInterfacesUnderV243Test-server-239311118</nova:name>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:07:56</nova:creationTime>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:user uuid="31a158d811524ad292199b931be12bc7">tempest-AttachInterfacesUnderV243Test-902546950-project-member</nova:user>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:project uuid="64a52f6edb6c4843a8decc313f8c10e3">tempest-AttachInterfacesUnderV243Test-902546950</nova:project>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <nova:port uuid="bc46ca84-347f-4b48-a09d-4742bd167dc7">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <entry name="serial">c131ca84-4259-49e0-a3d5-a4a5ca629d31</entry>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <entry name="uuid">c131ca84-4259-49e0-a3d5-a4a5ca629d31</entry>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk.config">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:91:49:ed"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <target dev="tapbc46ca84-34"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/console.log" append="off"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:07:57 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:07:57 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:07:57 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:07:57 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.318 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Preparing to wait for external event network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.318 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.318 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.318 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.319 221324 DEBUG nova.virt.libvirt.vif [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-239311118',display_name='tempest-AttachInterfacesUnderV243Test-server-239311118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-239311118',id=73,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKuImM2D3fNGJMsdqkErwFBaEJ04l9kGVA/cSUA+JbgAY3g1q1pguY3j8BOfvSSgfzBZsiL0lJL3gsjWJYvs7KdxSwVg5XZVPK7b50FIczK25ThQp4VMPwH5/FpuX6vz9A==',key_name='tempest-keypair-621715770',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64a52f6edb6c4843a8decc313f8c10e3',ramdisk_id='',reservation_id='r-92w1prj7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-902546950',owner_user_name='tempest-AttachInterfacesUnderV243Test-902546950-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:07:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='31a158d811524ad292199b931be12bc7',uuid=c131ca84-4259-49e0-a3d5-a4a5ca629d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.319 221324 DEBUG nova.network.os_vif_util [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Converting VIF {"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.320 221324 DEBUG nova.network.os_vif_util [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:49:ed,bridge_name='br-int',has_traffic_filtering=True,id=bc46ca84-347f-4b48-a09d-4742bd167dc7,network=Network(09a26665-5963-47b2-a3e4-106214bb83c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc46ca84-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.320 221324 DEBUG os_vif [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:49:ed,bridge_name='br-int',has_traffic_filtering=True,id=bc46ca84-347f-4b48-a09d-4742bd167dc7,network=Network(09a26665-5963-47b2-a3e4-106214bb83c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc46ca84-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.321 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.321 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.321 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.324 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc46ca84-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.324 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc46ca84-34, col_values=(('external_ids', {'iface-id': 'bc46ca84-347f-4b48-a09d-4742bd167dc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:49:ed', 'vm-uuid': 'c131ca84-4259-49e0-a3d5-a4a5ca629d31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.369 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:57 np0005603622 NetworkManager[49080]: <info>  [1769846877.3710] manager: (tapbc46ca84-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.372 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.381 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.381 221324 INFO os_vif [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:49:ed,bridge_name='br-int',has_traffic_filtering=True,id=bc46ca84-347f-4b48-a09d-4742bd167dc7,network=Network(09a26665-5963-47b2-a3e4-106214bb83c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc46ca84-34')#033[00m
Jan 31 03:07:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:57.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.467 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.467 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.467 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] No VIF found with MAC fa:16:3e:91:49:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.468 221324 INFO nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Using config drive#033[00m
Jan 31 03:07:57 np0005603622 nova_compute[221301]: 2026-01-31 08:07:57.502 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.025 221324 DEBUG nova.network.neutron [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updated VIF entry in instance network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.025 221324 DEBUG nova.network.neutron [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.048 221324 DEBUG oslo_concurrency.lockutils [req-99b3eee5-f35a-4eae-ab91-5ffc685ca694 req-2da9977d-5da7-42b2-80c0-19b65e8c4c1d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.158 221324 INFO nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Creating config drive at /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/disk.config#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.170 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpttqwb_xw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.316 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpttqwb_xw" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.355 221324 DEBUG nova.storage.rbd_utils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] rbd image c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.360 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/disk.config c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.716 221324 DEBUG oslo_concurrency.processutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/disk.config 9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.717 221324 INFO nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Deleting local config drive /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/disk.config because it was imported into RBD.#033[00m
Jan 31 03:07:58 np0005603622 NetworkManager[49080]: <info>  [1769846878.7830] manager: (tap1a414f6e-34): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Jan 31 03:07:58 np0005603622 kernel: tap1a414f6e-34: entered promiscuous mode
Jan 31 03:07:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:58Z|00230|binding|INFO|Claiming lport 1a414f6e-3462-4511-a204-1326ac6d6fcd for this chassis.
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.829 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:58Z|00231|binding|INFO|1a414f6e-3462-4511-a204-1326ac6d6fcd: Claiming fa:16:3e:83:fa:32 10.100.0.12
Jan 31 03:07:58 np0005603622 systemd-udevd[247853]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:58 np0005603622 NetworkManager[49080]: <info>  [1769846878.8514] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Jan 31 03:07:58 np0005603622 NetworkManager[49080]: <info>  [1769846878.8522] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Jan 31 03:07:58 np0005603622 systemd-machined[190406]: New machine qemu-30-instance-00000048.
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.855 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:fa:32 10.100.0.12'], port_security=['fa:16:3e:83:fa:32 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9f1f7c19-2aa2-4663-81ca-a26eaa106441', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c826f71-7560-44f4-8034-5ac735f4e81f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=1a414f6e-3462-4511-a204-1326ac6d6fcd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.856 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 1a414f6e-3462-4511-a204-1326ac6d6fcd in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.858 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:07:58 np0005603622 NetworkManager[49080]: <info>  [1769846878.8682] device (tap1a414f6e-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:07:58 np0005603622 NetworkManager[49080]: <info>  [1769846878.8691] device (tap1a414f6e-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.870 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3fc4ad-b6d1-45d8-8980-43ddd9e1cab6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.871 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap455fab34-b1 in ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:07:58 np0005603622 systemd[1]: Started Virtual Machine qemu-30-instance-00000048.
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.874 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap455fab34-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.874 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39bd1818-4093-4776-9b4a-27aada0e58c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.877 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2310268f-5f9b-497b-805e-73042a934ebd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.888 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[91481b4d-0514-4dcb-b123-5476140108bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.899 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.904 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eda35784-dd6c-4c10-8dc8-bbc3a64b77d0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.919 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:58Z|00232|binding|INFO|Setting lport 1a414f6e-3462-4511-a204-1326ac6d6fcd ovn-installed in OVS
Jan 31 03:07:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:58Z|00233|binding|INFO|Setting lport 1a414f6e-3462-4511-a204-1326ac6d6fcd up in Southbound
Jan 31 03:07:58 np0005603622 nova_compute[221301]: 2026-01-31 08:07:58.921 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.935 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fe9a2f-4c81-4eb7-bc8e-2827bc72890d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.941 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9666de22-6ad5-400b-8ed6-0e9e7a7e117f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 NetworkManager[49080]: <info>  [1769846878.9431] manager: (tap455fab34-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/120)
Jan 31 03:07:58 np0005603622 systemd-udevd[247858]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.974 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[cd075360-1c48-4be3-b2a2-4a17377a3ef9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:58.978 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0a0796-7c70-4be3-8850-d6ae44916ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 NetworkManager[49080]: <info>  [1769846879.0005] device (tap455fab34-b0): carrier: link connected
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.006 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ce37bf5a-8265-4e17-80c7-d966096438db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.024 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[66441fe5-1c2b-4b30-b07f-6ea35aa9c741]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613034, 'reachable_time': 34072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247896, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.043 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e722a96c-336a-4cc7-af4f-08a65dd7d15b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:8f98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613034, 'tstamp': 613034}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247897, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.065 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd2b85a-4fd1-4816-b639-34f48b67d562]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613034, 'reachable_time': 34072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247898, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.098 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88ceba69-a1b0-4222-ad8e-389e07bd5a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.168 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a868ee6a-72c1-4176-9358-c239576301e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.170 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.170 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.171 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 NetworkManager[49080]: <info>  [1769846879.1751] manager: (tap455fab34-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 31 03:07:59 np0005603622 kernel: tap455fab34-b0: entered promiscuous mode
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.182 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.184 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:59Z|00234|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.194 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.201 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.203 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.204 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[07a944d1-23e2-4e23-8d9c-e997a78adba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.205 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/455fab34-b015-4d97-a96d-f7ebd7f7555f.pid.haproxy
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 455fab34-b015-4d97-a96d-f7ebd7f7555f
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.206 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'env', 'PROCESS_TAG=haproxy-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/455fab34-b015-4d97-a96d-f7ebd7f7555f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:07:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:07:59.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.240 221324 DEBUG oslo_concurrency.processutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/disk.config c131ca84-4259-49e0-a3d5-a4a5ca629d31_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.880s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.241 221324 INFO nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Deleting local config drive /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31/disk.config because it was imported into RBD.#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.260 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 kernel: tapbc46ca84-34: entered promiscuous mode
Jan 31 03:07:59 np0005603622 NetworkManager[49080]: <info>  [1769846879.2830] manager: (tapbc46ca84-34): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Jan 31 03:07:59 np0005603622 systemd-udevd[247881]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.283 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:59Z|00235|binding|INFO|Claiming lport bc46ca84-347f-4b48-a09d-4742bd167dc7 for this chassis.
Jan 31 03:07:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:59Z|00236|binding|INFO|bc46ca84-347f-4b48-a09d-4742bd167dc7: Claiming fa:16:3e:91:49:ed 10.100.0.13
Jan 31 03:07:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:59Z|00237|binding|INFO|Setting lport bc46ca84-347f-4b48-a09d-4742bd167dc7 ovn-installed in OVS
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:07:59 np0005603622 NetworkManager[49080]: <info>  [1769846879.3004] device (tapbc46ca84-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:07:59 np0005603622 NetworkManager[49080]: <info>  [1769846879.3040] device (tapbc46ca84-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:07:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:07:59Z|00238|binding|INFO|Setting lport bc46ca84-347f-4b48-a09d-4742bd167dc7 up in Southbound
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.307 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:49:ed 10.100.0.13'], port_security=['fa:16:3e:91:49:ed 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c131ca84-4259-49e0-a3d5-a4a5ca629d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09a26665-5963-47b2-a3e4-106214bb83c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64a52f6edb6c4843a8decc313f8c10e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77da8387-ff87-4eef-a7dd-fe6d484eb09a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54f038a1-4edf-46aa-89a7-be5e1e9f7b39, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=bc46ca84-347f-4b48-a09d-4742bd167dc7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:07:59 np0005603622 systemd-machined[190406]: New machine qemu-31-instance-00000049.
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.326 221324 DEBUG nova.compute.manager [req-85f27bf7-30fa-4a60-a18b-0905592adc5f req-46bb6846-b0c2-444b-ad75-b6c669a59a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.326 221324 DEBUG oslo_concurrency.lockutils [req-85f27bf7-30fa-4a60-a18b-0905592adc5f req-46bb6846-b0c2-444b-ad75-b6c669a59a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.327 221324 DEBUG oslo_concurrency.lockutils [req-85f27bf7-30fa-4a60-a18b-0905592adc5f req-46bb6846-b0c2-444b-ad75-b6c669a59a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.327 221324 DEBUG oslo_concurrency.lockutils [req-85f27bf7-30fa-4a60-a18b-0905592adc5f req-46bb6846-b0c2-444b-ad75-b6c669a59a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:07:59 np0005603622 nova_compute[221301]: 2026-01-31 08:07:59.328 221324 DEBUG nova.compute.manager [req-85f27bf7-30fa-4a60-a18b-0905592adc5f req-46bb6846-b0c2-444b-ad75-b6c669a59a5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Processing event network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:07:59 np0005603622 systemd[1]: Started Virtual Machine qemu-31-instance-00000049.
Jan 31 03:07:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:07:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:07:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:07:59.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:07:59 np0005603622 podman[247958]: 2026-01-31 08:07:59.57200109 +0000 UTC m=+0.041305962 container create d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:07:59 np0005603622 systemd[1]: Started libpod-conmon-d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522.scope.
Jan 31 03:07:59 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:07:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c11a727124ce99cca09e19c73e5a6a086065a4bc317065dd5e2e772e15ffa3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:07:59 np0005603622 podman[247958]: 2026-01-31 08:07:59.553095472 +0000 UTC m=+0.022400354 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:07:59 np0005603622 podman[247958]: 2026-01-31 08:07:59.658654561 +0000 UTC m=+0.127959473 container init d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:07:59 np0005603622 podman[247958]: 2026-01-31 08:07:59.662974607 +0000 UTC m=+0.132279489 container start d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:07:59 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[247973]: [NOTICE]   (247977) : New worker (247979) forked
Jan 31 03:07:59 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[247973]: [NOTICE]   (247977) : Loading success.
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.727 139793 INFO neutron.agent.ovn.metadata.agent [-] Port bc46ca84-347f-4b48-a09d-4742bd167dc7 in datapath 09a26665-5963-47b2-a3e4-106214bb83c4 unbound from our chassis#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.729 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09a26665-5963-47b2-a3e4-106214bb83c4#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.737 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c681e7f4-75cd-4d8e-8867-14de92bd27ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.737 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09a26665-51 in ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.739 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09a26665-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.739 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4352ec9b-18af-4d7a-9efe-4f29c016fc13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.740 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d7eca4ea-7830-495d-8a7d-bc07929e450c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.748 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[edf7051b-e1c6-45d4-ac45-c70dd9f852cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.757 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2674a8d4-cf94-4667-ba71-801bfda23c42]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.782 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b1dd2869-c498-43cf-98e0-fc9f32b42298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.789 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a23dd7d6-0586-4b07-bd6f-fbb52bee7f14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 NetworkManager[49080]: <info>  [1769846879.7928] manager: (tap09a26665-50): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.822 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[976aa1a3-2558-49eb-a0cb-8cc82a9e5067]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.826 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[70eeee02-f871-4e79-b442-bf601134578a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 NetworkManager[49080]: <info>  [1769846879.8460] device (tap09a26665-50): carrier: link connected
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.851 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6fee9523-2be0-4a69-8c48-81fad46f98df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.865 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c2133af7-d0da-4e87-a703-76fbd53ed6e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09a26665-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:4a:01'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613118, 'reachable_time': 16510, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248001, 'error': None, 'target': 'ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.879 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e2221586-7020-445c-88bc-3b530067c9a2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:4a01'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613118, 'tstamp': 613118}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248002, 'error': None, 'target': 'ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.893 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[568d9720-84cb-47df-ac61-7c723feecdd8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09a26665-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:4a:01'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613118, 'reachable_time': 16510, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248003, 'error': None, 'target': 'ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.918 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[47c42d38-7045-4f18-80a4-c727d061decd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.971 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[97649093-87fe-4773-9486-13d0e38f0350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.973 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09a26665-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.973 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:07:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:07:59.974 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09a26665-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:00 np0005603622 kernel: tap09a26665-50: entered promiscuous mode
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.030 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:00 np0005603622 NetworkManager[49080]: <info>  [1769846880.0328] manager: (tap09a26665-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:00.041 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09a26665-50, col_values=(('external_ids', {'iface-id': '2094ea04-4f18-4e6f-bdfe-84c049b00b3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.044 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:00Z|00239|binding|INFO|Releasing lport 2094ea04-4f18-4e6f-bdfe-84c049b00b3e from this chassis (sb_readonly=0)
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.053 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:00.054 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09a26665-5963-47b2-a3e4-106214bb83c4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09a26665-5963-47b2-a3e4-106214bb83c4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:00.055 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[341faff5-0e6f-4049-ae0c-68fa9fb6ac58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:00.056 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-09a26665-5963-47b2-a3e4-106214bb83c4
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/09a26665-5963-47b2-a3e4-106214bb83c4.pid.haproxy
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 09a26665-5963-47b2-a3e4-106214bb83c4
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:08:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:00.057 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4', 'env', 'PROCESS_TAG=haproxy-09a26665-5963-47b2-a3e4-106214bb83c4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09a26665-5963-47b2-a3e4-106214bb83c4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:08:00 np0005603622 podman[248099]: 2026-01-31 08:08:00.407731356 +0000 UTC m=+0.054250830 container create 93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.419 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846880.4190178, c131ca84-4259-49e0-a3d5-a4a5ca629d31 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.421 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] VM Started (Lifecycle Event)#033[00m
Jan 31 03:08:00 np0005603622 systemd[1]: Started libpod-conmon-93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f.scope.
Jan 31 03:08:00 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:08:00 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9159851c6652a0b816b18874c618dc0e8c84ebaada4989958976a186ebdc63f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:08:00 np0005603622 podman[248099]: 2026-01-31 08:08:00.467511744 +0000 UTC m=+0.114031228 container init 93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:08:00 np0005603622 podman[248099]: 2026-01-31 08:08:00.471191843 +0000 UTC m=+0.117711307 container start 93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:08:00 np0005603622 podman[248099]: 2026-01-31 08:08:00.380061672 +0000 UTC m=+0.026581166 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:08:00 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [NOTICE]   (248144) : New worker (248146) forked
Jan 31 03:08:00 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [NOTICE]   (248144) : Loading success.
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.518 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.523 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846880.4203696, c131ca84-4259-49e0-a3d5-a4a5ca629d31 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.523 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.535 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.538 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.541 221324 INFO nova.virt.libvirt.driver [-] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Instance spawned successfully.#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.541 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.743 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.749 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.750 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.751 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.752 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.753 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.754 221324 DEBUG nova.virt.libvirt.driver [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.760 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:08:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.860 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.861 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846880.5344796, 9f1f7c19-2aa2-4663-81ca-a26eaa106441 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.862 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] VM Started (Lifecycle Event)#033[00m
Jan 31 03:08:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:00Z|00240|binding|INFO|Releasing lport 2094ea04-4f18-4e6f-bdfe-84c049b00b3e from this chassis (sb_readonly=0)
Jan 31 03:08:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:00Z|00241|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.970 221324 INFO nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Took 10.80 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:08:00 np0005603622 nova_compute[221301]: 2026-01-31 08:08:00.970 221324 DEBUG nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.166 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.171 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:08:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:01.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.474 221324 INFO nova.compute.manager [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Took 12.39 seconds to build instance.#033[00m
Jan 31 03:08:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:01.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.651 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846880.5345204, 9f1f7c19-2aa2-4663-81ca-a26eaa106441 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.652 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.695 221324 DEBUG nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.696 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.696 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.697 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.697 221324 DEBUG nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] No waiting events found dispatching network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.697 221324 WARNING nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received unexpected event network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.698 221324 DEBUG nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.698 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.698 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.698 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.699 221324 DEBUG nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Processing event network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.699 221324 DEBUG nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.699 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.700 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.700 221324 DEBUG oslo_concurrency.lockutils [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.700 221324 DEBUG nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] No waiting events found dispatching network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.700 221324 WARNING nova.compute.manager [req-b27d4f50-0d20-4094-8228-3cd1c2e78075 req-6827f603-e57d-4385-afe1-19f2726e9bd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received unexpected event network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.702 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.709 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.713 221324 INFO nova.virt.libvirt.driver [-] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Instance spawned successfully.#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.714 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.747 221324 DEBUG oslo_concurrency.lockutils [None req-7193cf1f-4ccd-476a-bf87-94333a25ea78 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.753 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.761 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846880.5375643, 9f1f7c19-2aa2-4663-81ca-a26eaa106441 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.761 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.766 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.766 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.767 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.767 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.768 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.769 221324 DEBUG nova.virt.libvirt.driver [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.818 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.821 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.856 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846881.7075272, c131ca84-4259-49e0-a3d5-a4a5ca629d31 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.857 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.950 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.954 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.975 221324 INFO nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Took 11.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:08:01 np0005603622 nova_compute[221301]: 2026-01-31 08:08:01.975 221324 DEBUG nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.027 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.086 221324 INFO nova.compute.manager [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Took 12.94 seconds to build instance.#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.139 221324 DEBUG oslo_concurrency.lockutils [None req-fedac6ee-f1b1-4771-9c93-033eca59a697 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.389 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.829 221324 INFO nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Snapshot image upload complete#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.830 221324 DEBUG nova.compute.manager [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.920 221324 INFO nova.compute.manager [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Shelve offloading#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.929 221324 INFO nova.virt.libvirt.driver [-] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance destroyed successfully.#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.929 221324 DEBUG nova.compute.manager [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.931 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.932 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquired lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:02 np0005603622 nova_compute[221301]: 2026-01-31 08:08:02.932 221324 DEBUG nova.network.neutron [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:03.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e251 e251: 3 total, 3 up, 3 in
Jan 31 03:08:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:03.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:04 np0005603622 nova_compute[221301]: 2026-01-31 08:08:04.263 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:05.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:05.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:05 np0005603622 nova_compute[221301]: 2026-01-31 08:08:05.756 221324 DEBUG nova.network.neutron [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updating instance_info_cache with network_info: [{"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:06 np0005603622 nova_compute[221301]: 2026-01-31 08:08:06.144 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Releasing lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:06 np0005603622 nova_compute[221301]: 2026-01-31 08:08:06.921 221324 DEBUG nova.compute.manager [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:06 np0005603622 nova_compute[221301]: 2026-01-31 08:08:06.922 221324 DEBUG nova.compute.manager [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing instance network info cache due to event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:06 np0005603622 nova_compute[221301]: 2026-01-31 08:08:06.923 221324 DEBUG oslo_concurrency.lockutils [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:06 np0005603622 nova_compute[221301]: 2026-01-31 08:08:06.923 221324 DEBUG oslo_concurrency.lockutils [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:06 np0005603622 nova_compute[221301]: 2026-01-31 08:08:06.923 221324 DEBUG nova.network.neutron [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:07.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.392 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:07.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.588 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846872.5876944, 23ffafae-0fb8-48ec-9843-b0127f61e340 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.588 221324 INFO nova.compute.manager [-] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.623 221324 DEBUG nova.compute.manager [None req-ef22af3c-2a0a-40bc-9a14-488a10bd2648 - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.627 221324 DEBUG nova.compute.manager [None req-ef22af3c-2a0a-40bc-9a14-488a10bd2648 - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.650 221324 INFO nova.compute.manager [None req-ef22af3c-2a0a-40bc-9a14-488a10bd2648 - - - - - -] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.681 221324 INFO nova.virt.libvirt.driver [-] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Instance destroyed successfully.#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.682 221324 DEBUG nova.objects.instance [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'resources' on Instance uuid 23ffafae-0fb8-48ec-9843-b0127f61e340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.703 221324 DEBUG nova.virt.libvirt.vif [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-5961696',display_name='tempest-DeleteServersTestJSON-server-5961696',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-5961696',id=69,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:07:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-e3ymdfq3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member',shelved_at='2026-01-31T08:08:02.830273',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='7b781d58-222d-48fb-b75e-dda36a2cb764'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:07:53Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=23ffafae-0fb8-48ec-9843-b0127f61e340,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.704 221324 DEBUG nova.network.os_vif_util [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e573731-69", "ovs_interfaceid": "6e573731-6974-4502-bf29-a75fac4ae37b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.705 221324 DEBUG nova.network.os_vif_util [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:b7:59,bridge_name='br-int',has_traffic_filtering=True,id=6e573731-6974-4502-bf29-a75fac4ae37b,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e573731-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.706 221324 DEBUG os_vif [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:b7:59,bridge_name='br-int',has_traffic_filtering=True,id=6e573731-6974-4502-bf29-a75fac4ae37b,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e573731-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.709 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e573731-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.712 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.717 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.719 221324 INFO os_vif [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:b7:59,bridge_name='br-int',has_traffic_filtering=True,id=6e573731-6974-4502-bf29-a75fac4ae37b,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e573731-69')#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.805 221324 DEBUG nova.compute.manager [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Received event network-changed-6e573731-6974-4502-bf29-a75fac4ae37b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.806 221324 DEBUG nova.compute.manager [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Refreshing instance network info cache due to event network-changed-6e573731-6974-4502-bf29-a75fac4ae37b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.806 221324 DEBUG oslo_concurrency.lockutils [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.806 221324 DEBUG oslo_concurrency.lockutils [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:07 np0005603622 nova_compute[221301]: 2026-01-31 08:08:07.806 221324 DEBUG nova.network.neutron [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Refreshing network info cache for port 6e573731-6974-4502-bf29-a75fac4ae37b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:08 np0005603622 nova_compute[221301]: 2026-01-31 08:08:08.734 221324 DEBUG nova.network.neutron [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updated VIF entry in instance network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:08 np0005603622 nova_compute[221301]: 2026-01-31 08:08:08.735 221324 DEBUG nova.network.neutron [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:08 np0005603622 nova_compute[221301]: 2026-01-31 08:08:08.854 221324 DEBUG oslo_concurrency.lockutils [req-c86604a4-09c1-4e25-a84c-08aad7ded00c req-0dffd6af-62ae-479c-b098-b690f873c5c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.007 221324 DEBUG nova.compute.manager [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.008 221324 DEBUG nova.compute.manager [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing instance network info cache due to event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.009 221324 DEBUG oslo_concurrency.lockutils [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.009 221324 DEBUG oslo_concurrency.lockutils [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.009 221324 DEBUG nova.network.neutron [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:09.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.300 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:09.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.635 221324 DEBUG nova.network.neutron [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updated VIF entry in instance network info cache for port 6e573731-6974-4502-bf29-a75fac4ae37b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.636 221324 DEBUG nova.network.neutron [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Updating instance_info_cache with network_info: [{"id": "6e573731-6974-4502-bf29-a75fac4ae37b", "address": "fa:16:3e:05:b7:59", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": null, "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap6e573731-69", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:09 np0005603622 nova_compute[221301]: 2026-01-31 08:08:09.666 221324 DEBUG oslo_concurrency.lockutils [req-12b74d5c-56fb-4019-97ec-76526d81ac8c req-a60ca6b4-2883-4624-aea0-2db180b8297a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-23ffafae-0fb8-48ec-9843-b0127f61e340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.474 221324 INFO nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Deleting instance files /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340_del#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.476 221324 INFO nova.virt.libvirt.driver [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: 23ffafae-0fb8-48ec-9843-b0127f61e340] Deletion of /var/lib/nova/instances/23ffafae-0fb8-48ec-9843-b0127f61e340_del complete#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.628 221324 DEBUG nova.compute.manager [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.629 221324 DEBUG nova.compute.manager [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing instance network info cache due to event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.629 221324 DEBUG oslo_concurrency.lockutils [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.660 221324 INFO nova.scheduler.client.report [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Deleted allocations for instance 23ffafae-0fb8-48ec-9843-b0127f61e340#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.723 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.724 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.791 221324 DEBUG oslo_concurrency.processutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.891 221324 DEBUG nova.network.neutron [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updated VIF entry in instance network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.893 221324 DEBUG nova.network.neutron [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.945 221324 DEBUG oslo_concurrency.lockutils [req-1b424e8d-0b2d-40a7-b69b-181c2ace3ce8 req-2a92ded5-47e2-4903-a3cd-989a585b3a18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.946 221324 DEBUG oslo_concurrency.lockutils [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:10 np0005603622 nova_compute[221301]: 2026-01-31 08:08:10.947 221324 DEBUG nova.network.neutron [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:11.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/286719679' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:11 np0005603622 nova_compute[221301]: 2026-01-31 08:08:11.269 221324 DEBUG oslo_concurrency.processutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:11 np0005603622 nova_compute[221301]: 2026-01-31 08:08:11.273 221324 DEBUG nova.compute.provider_tree [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:11 np0005603622 nova_compute[221301]: 2026-01-31 08:08:11.317 221324 DEBUG nova.scheduler.client.report [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:11 np0005603622 nova_compute[221301]: 2026-01-31 08:08:11.345 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:11 np0005603622 nova_compute[221301]: 2026-01-31 08:08:11.395 221324 DEBUG oslo_concurrency.lockutils [None req-94169f9c-2a11-4c1a-8b3a-ad2f35e67333 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "23ffafae-0fb8-48ec-9843-b0127f61e340" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 42.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:11 np0005603622 nova_compute[221301]: 2026-01-31 08:08:11.475 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:11.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:12 np0005603622 nova_compute[221301]: 2026-01-31 08:08:12.209 221324 DEBUG nova.network.neutron [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updated VIF entry in instance network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:12 np0005603622 nova_compute[221301]: 2026-01-31 08:08:12.210 221324 DEBUG nova.network.neutron [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:12 np0005603622 nova_compute[221301]: 2026-01-31 08:08:12.233 221324 DEBUG oslo_concurrency.lockutils [req-df153bd9-e75d-45ae-ad56-2ddd2149c241 req-f0996df3-bb01-4d66-90da-73d342c50884 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e252 e252: 3 total, 3 up, 3 in
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.611025) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892611066, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 588, "num_deletes": 252, "total_data_size": 884651, "memory_usage": 896760, "flush_reason": "Manual Compaction"}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892616059, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 582597, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38845, "largest_seqno": 39428, "table_properties": {"data_size": 579587, "index_size": 982, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7321, "raw_average_key_size": 19, "raw_value_size": 573451, "raw_average_value_size": 1525, "num_data_blocks": 43, "num_entries": 376, "num_filter_entries": 376, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846864, "oldest_key_time": 1769846864, "file_creation_time": 1769846892, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 5082 microseconds, and 2583 cpu microseconds.
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.616106) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 582597 bytes OK
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.616129) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.620610) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.620634) EVENT_LOG_v1 {"time_micros": 1769846892620621, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.620651) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 881303, prev total WAL file size 881303, number of live WAL files 2.
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.621555) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(568KB)], [72(11MB)]
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892621640, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 12339306, "oldest_snapshot_seqno": -1}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6298 keys, 10405865 bytes, temperature: kUnknown
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892670415, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 10405865, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10362168, "index_size": 26819, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15813, "raw_key_size": 162674, "raw_average_key_size": 25, "raw_value_size": 10247764, "raw_average_value_size": 1627, "num_data_blocks": 1068, "num_entries": 6298, "num_filter_entries": 6298, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769846892, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.670661) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 10405865 bytes
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.672153) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 252.6 rd, 213.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 11.2 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(39.0) write-amplify(17.9) OK, records in: 6814, records dropped: 516 output_compression: NoCompression
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.672170) EVENT_LOG_v1 {"time_micros": 1769846892672162, "job": 44, "event": "compaction_finished", "compaction_time_micros": 48858, "compaction_time_cpu_micros": 26163, "output_level": 6, "num_output_files": 1, "total_output_size": 10405865, "num_input_records": 6814, "num_output_records": 6298, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892672379, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769846892673577, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.620934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.673659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.673666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.673668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.673669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:08:12.673671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:08:12 np0005603622 nova_compute[221301]: 2026-01-31 08:08:12.713 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:13.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:13.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:14 np0005603622 nova_compute[221301]: 2026-01-31 08:08:14.451 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:15.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:15 np0005603622 nova_compute[221301]: 2026-01-31 08:08:15.414 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:15.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:17.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:17Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:91:49:ed 10.100.0.13
Jan 31 03:08:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:17Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:91:49:ed 10.100.0.13
Jan 31 03:08:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:17.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:17 np0005603622 nova_compute[221301]: 2026-01-31 08:08:17.764 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:18Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:fa:32 10.100.0.12
Jan 31 03:08:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:18Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:fa:32 10.100.0.12
Jan 31 03:08:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 e253: 3 total, 3 up, 3 in
Jan 31 03:08:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:08:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:08:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:19.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:19 np0005603622 nova_compute[221301]: 2026-01-31 08:08:19.453 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:19.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:21 np0005603622 podman[248328]: 2026-01-31 08:08:21.218746449 +0000 UTC m=+0.132195826 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:08:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:21.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:21 np0005603622 nova_compute[221301]: 2026-01-31 08:08:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:21 np0005603622 nova_compute[221301]: 2026-01-31 08:08:21.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:08:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:21.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:22 np0005603622 nova_compute[221301]: 2026-01-31 08:08:22.768 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:23 np0005603622 podman[248354]: 2026-01-31 08:08:23.128223064 +0000 UTC m=+0.051089265 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 03:08:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:23.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:23.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:24 np0005603622 nova_compute[221301]: 2026-01-31 08:08:24.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:24 np0005603622 nova_compute[221301]: 2026-01-31 08:08:24.710 221324 DEBUG nova.objects.instance [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lazy-loading 'flavor' on Instance uuid c131ca84-4259-49e0-a3d5-a4a5ca629d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:24 np0005603622 nova_compute[221301]: 2026-01-31 08:08:24.786 221324 DEBUG oslo_concurrency.lockutils [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:24 np0005603622 nova_compute[221301]: 2026-01-31 08:08:24.786 221324 DEBUG oslo_concurrency.lockutils [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquired lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:25.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:25 np0005603622 nova_compute[221301]: 2026-01-31 08:08:25.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:25.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.614 221324 DEBUG oslo_concurrency.lockutils [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-9f1f7c19-2aa2-4663-81ca-a26eaa106441-131dda89-6e7d-4a88-9572-dcd63205fc02" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.615 221324 DEBUG oslo_concurrency.lockutils [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-9f1f7c19-2aa2-4663-81ca-a26eaa106441-131dda89-6e7d-4a88-9572-dcd63205fc02" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.617 221324 DEBUG nova.objects.instance [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 9f1f7c19-2aa2-4663-81ca-a26eaa106441 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.992 221324 DEBUG nova.compute.manager [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.992 221324 DEBUG nova.compute.manager [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing instance network info cache due to event network-changed-1a414f6e-3462-4511-a204-1326ac6d6fcd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.993 221324 DEBUG oslo_concurrency.lockutils [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.993 221324 DEBUG oslo_concurrency.lockutils [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:26 np0005603622 nova_compute[221301]: 2026-01-31 08:08:26.993 221324 DEBUG nova.network.neutron [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:27 np0005603622 nova_compute[221301]: 2026-01-31 08:08:27.060 221324 DEBUG nova.network.neutron [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:27.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:27.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:27 np0005603622 nova_compute[221301]: 2026-01-31 08:08:27.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:28 np0005603622 nova_compute[221301]: 2026-01-31 08:08:28.026 221324 DEBUG nova.objects.instance [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9f1f7c19-2aa2-4663-81ca-a26eaa106441 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:28 np0005603622 nova_compute[221301]: 2026-01-31 08:08:28.074 221324 DEBUG nova.network.neutron [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:08:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.101 221324 DEBUG nova.compute.manager [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.102 221324 DEBUG nova.compute.manager [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing instance network info cache due to event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.103 221324 DEBUG oslo_concurrency.lockutils [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.202 221324 DEBUG nova.policy [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60f2b878669c4c529b35e04860cc6d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c73212dc7c84914b6c934d45b6826f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:08:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:29.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.425 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.426 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.427 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.428 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:29.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.733 221324 DEBUG nova.network.neutron [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updated VIF entry in instance network info cache for port 1a414f6e-3462-4511-a204-1326ac6d6fcd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.733 221324 DEBUG nova.network.neutron [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.778 221324 DEBUG oslo_concurrency.lockutils [req-2c2d73bb-1993-4fb3-9122-bc584fff8327 req-bfca18b3-29a7-4e68-8c92-5d485ff56dc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/850156123' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:29 np0005603622 nova_compute[221301]: 2026-01-31 08:08:29.938 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.025 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000049 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.026 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000049 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.033 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.034 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.068 221324 DEBUG nova.network.neutron [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.108 221324 DEBUG oslo_concurrency.lockutils [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Releasing lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.109 221324 DEBUG nova.compute.manager [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.109 221324 DEBUG nova.compute.manager [None req-d1855936-2f1a-47da-83f9-96e131fe6b13 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] network_info to inject: |[{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.113 221324 DEBUG oslo_concurrency.lockutils [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.113 221324 DEBUG nova.network.neutron [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.256 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.258 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4315MB free_disk=20.809898376464844GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.259 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.259 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.345 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9f1f7c19-2aa2-4663-81ca-a26eaa106441 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.345 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance c131ca84-4259-49e0-a3d5-a4a5ca629d31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.345 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.346 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.397 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.589 221324 DEBUG nova.network.neutron [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Successfully updated port: 131dda89-6e7d-4a88-9572-dcd63205fc02 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.633 221324 DEBUG oslo_concurrency.lockutils [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.633 221324 DEBUG oslo_concurrency.lockutils [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.634 221324 DEBUG nova.network.neutron [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2443506547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.839 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.845 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:30.863 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.864 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:30.865 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:30.866 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.895 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.895 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:30 np0005603622 nova_compute[221301]: 2026-01-31 08:08:30.902 221324 WARNING nova.network.neutron [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] 455fab34-b015-4d97-a96d-f7ebd7f7555f already exists in list: networks containing: ['455fab34-b015-4d97-a96d-f7ebd7f7555f']. ignoring it#033[00m
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.094 221324 DEBUG nova.objects.instance [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lazy-loading 'flavor' on Instance uuid c131ca84-4259-49e0-a3d5-a4a5ca629d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.124 221324 DEBUG oslo_concurrency.lockutils [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.211 221324 DEBUG nova.compute.manager [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-changed-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.212 221324 DEBUG nova.compute.manager [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing instance network info cache due to event network-changed-131dda89-6e7d-4a88-9572-dcd63205fc02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.212 221324 DEBUG oslo_concurrency.lockutils [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:08:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:31.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:08:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:31.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.892 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.893 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:31 np0005603622 nova_compute[221301]: 2026-01-31 08:08:31.893 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:32 np0005603622 nova_compute[221301]: 2026-01-31 08:08:32.773 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:33.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.338 221324 DEBUG nova.network.neutron [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updated VIF entry in instance network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.339 221324 DEBUG nova.network.neutron [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.400 221324 DEBUG oslo_concurrency.lockutils [req-ccc8f520-acc1-4e3f-8252-9c04faa7c393 req-a7322627-73c3-4833-916f-19f9fd94bb56 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.400 221324 DEBUG oslo_concurrency.lockutils [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquired lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:33.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:33 np0005603622 nova_compute[221301]: 2026-01-31 08:08:33.655 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.126 221324 DEBUG nova.network.neutron [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.153 221324 DEBUG oslo_concurrency.lockutils [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.156 221324 DEBUG oslo_concurrency.lockutils [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.156 221324 DEBUG nova.network.neutron [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Refreshing network info cache for port 131dda89-6e7d-4a88-9572-dcd63205fc02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.164 221324 DEBUG nova.virt.libvirt.vif [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.165 221324 DEBUG nova.network.os_vif_util [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.166 221324 DEBUG nova.network.os_vif_util [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.167 221324 DEBUG os_vif [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.168 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.169 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.169 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.179 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.179 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap131dda89-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.180 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap131dda89-6e, col_values=(('external_ids', {'iface-id': '131dda89-6e7d-4a88-9572-dcd63205fc02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:47:2e', 'vm-uuid': '9f1f7c19-2aa2-4663-81ca-a26eaa106441'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 NetworkManager[49080]: <info>  [1769846914.1864] manager: (tap131dda89-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.189 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.194 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.195 221324 INFO os_vif [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e')#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.196 221324 DEBUG nova.virt.libvirt.vif [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.197 221324 DEBUG nova.network.os_vif_util [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.198 221324 DEBUG nova.network.os_vif_util [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.202 221324 DEBUG nova.virt.libvirt.guest [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] attach device xml: <interface type="ethernet">
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:b3:47:2e"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <target dev="tap131dda89-6e"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]: </interface>
Jan 31 03:08:34 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:08:34 np0005603622 kernel: tap131dda89-6e: entered promiscuous mode
Jan 31 03:08:34 np0005603622 NetworkManager[49080]: <info>  [1769846914.2191] manager: (tap131dda89-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Jan 31 03:08:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:34Z|00242|binding|INFO|Claiming lport 131dda89-6e7d-4a88-9572-dcd63205fc02 for this chassis.
Jan 31 03:08:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:34Z|00243|binding|INFO|131dda89-6e7d-4a88-9572-dcd63205fc02: Claiming fa:16:3e:b3:47:2e 10.100.0.10
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.221 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:34Z|00244|binding|INFO|Setting lport 131dda89-6e7d-4a88-9572-dcd63205fc02 ovn-installed in OVS
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:34Z|00245|binding|INFO|Setting lport 131dda89-6e7d-4a88-9572-dcd63205fc02 up in Southbound
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.261 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:47:2e 10.100.0.10'], port_security=['fa:16:3e:b3:47:2e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9f1f7c19-2aa2-4663-81ca-a26eaa106441', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=131dda89-6e7d-4a88-9572-dcd63205fc02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.266 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 131dda89-6e7d-4a88-9572-dcd63205fc02 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f bound to our chassis#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.268 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:08:34 np0005603622 systemd-udevd[248474]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:08:34 np0005603622 NetworkManager[49080]: <info>  [1769846914.2919] device (tap131dda89-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.288 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[04fd9460-d358-430c-81c9-8dde6a3a57f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:34 np0005603622 NetworkManager[49080]: <info>  [1769846914.2933] device (tap131dda89-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.343 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bc613f20-9049-4589-bac7-5284610be16a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.348 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e03b73e5-b949-4977-9e26-4b4235fa0ca5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.350 221324 DEBUG nova.virt.libvirt.driver [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.350 221324 DEBUG nova.virt.libvirt.driver [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.351 221324 DEBUG nova.virt.libvirt.driver [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:83:fa:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.351 221324 DEBUG nova.virt.libvirt.driver [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] No VIF found with MAC fa:16:3e:b3:47:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.382 221324 DEBUG nova.virt.libvirt.guest [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <nova:name>tempest-tempest.common.compute-instance-576814234</nova:name>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:08:34</nova:creationTime>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:port uuid="1a414f6e-3462-4511-a204-1326ac6d6fcd">
Jan 31 03:08:34 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    <nova:port uuid="131dda89-6e7d-4a88-9572-dcd63205fc02">
Jan 31 03:08:34 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:08:34 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:08:34 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:08:34 np0005603622 nova_compute[221301]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.384 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[57ae5c61-8d15-4fb3-ad73-d3cd5773370f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.403 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ff50b5da-1f75-46da-bc24-368c42394ce1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613034, 'reachable_time': 34072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248482, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.412 221324 DEBUG oslo_concurrency.lockutils [None req-260e178f-5eef-461b-ae4e-eff728e61ef4 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-9f1f7c19-2aa2-4663-81ca-a26eaa106441-131dda89-6e7d-4a88-9572-dcd63205fc02" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.422 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b31c55af-a480-4a99-8f0b-a114edb29445]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613046, 'tstamp': 613046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248483, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613049, 'tstamp': 613049}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248483, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.426 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.428 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.430 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.430 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.431 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:34.431 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.465 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:34 np0005603622 nova_compute[221301]: 2026-01-31 08:08:34.699 221324 DEBUG nova.network.neutron [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:35.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:35.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.683 221324 DEBUG nova.compute.manager [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.684 221324 DEBUG oslo_concurrency.lockutils [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.685 221324 DEBUG oslo_concurrency.lockutils [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.685 221324 DEBUG oslo_concurrency.lockutils [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.686 221324 DEBUG nova.compute.manager [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] No waiting events found dispatching network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.686 221324 WARNING nova.compute.manager [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received unexpected event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.687 221324 DEBUG nova.compute.manager [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.688 221324 DEBUG oslo_concurrency.lockutils [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.689 221324 DEBUG oslo_concurrency.lockutils [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.689 221324 DEBUG oslo_concurrency.lockutils [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.690 221324 DEBUG nova.compute.manager [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] No waiting events found dispatching network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:35 np0005603622 nova_compute[221301]: 2026-01-31 08:08:35.690 221324 WARNING nova.compute.manager [req-1eedc49a-1c91-40fa-b3f4-3ecd553576c4 req-acfa3fdd-36be-4155-9c6d-33a305034c99 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received unexpected event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.021 221324 DEBUG nova.compute.manager [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.022 221324 DEBUG nova.compute.manager [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing instance network info cache due to event network-changed-bc46ca84-347f-4b48-a09d-4742bd167dc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.022 221324 DEBUG oslo_concurrency.lockutils [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.213 221324 DEBUG nova.network.neutron [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updated VIF entry in instance network info cache for port 131dda89-6e7d-4a88-9572-dcd63205fc02. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.214 221324 DEBUG nova.network.neutron [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.271 221324 DEBUG oslo_concurrency.lockutils [req-f5a66487-47b7-4bbc-bf3f-08a66c08d66a req-d424200c-bcf7-485d-aadf-b94adf335573 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.271 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.272 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.272 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9f1f7c19-2aa2-4663-81ca-a26eaa106441 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.612 221324 DEBUG nova.network.neutron [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.653 221324 DEBUG oslo_concurrency.lockutils [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Releasing lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.654 221324 DEBUG nova.compute.manager [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.655 221324 DEBUG nova.compute.manager [None req-6c24566f-007f-447d-a8c8-01dc11a9b641 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] network_info to inject: |[{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.659 221324 DEBUG oslo_concurrency.lockutils [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.659 221324 DEBUG nova.network.neutron [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Refreshing network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.859 221324 DEBUG oslo_concurrency.lockutils [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "interface-9f1f7c19-2aa2-4663-81ca-a26eaa106441-131dda89-6e7d-4a88-9572-dcd63205fc02" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.860 221324 DEBUG oslo_concurrency.lockutils [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-9f1f7c19-2aa2-4663-81ca-a26eaa106441-131dda89-6e7d-4a88-9572-dcd63205fc02" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.904 221324 DEBUG nova.objects.instance [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'flavor' on Instance uuid 9f1f7c19-2aa2-4663-81ca-a26eaa106441 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.942 221324 DEBUG nova.virt.libvirt.vif [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.943 221324 DEBUG nova.network.os_vif_util [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.943 221324 DEBUG nova.network.os_vif_util [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.947 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.954 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.956 221324 DEBUG nova.virt.libvirt.driver [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Attempting to detach device tap131dda89-6e from instance 9f1f7c19-2aa2-4663-81ca-a26eaa106441 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.956 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:b3:47:2e"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <target dev="tap131dda89-6e"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]: </interface>
Jan 31 03:08:36 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.964 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.968 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <name>instance-00000048</name>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <uuid>9f1f7c19-2aa2-4663-81ca-a26eaa106441</uuid>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <nova:name>tempest-tempest.common.compute-instance-576814234</nova:name>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:08:34</nova:creationTime>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:port uuid="1a414f6e-3462-4511-a204-1326ac6d6fcd">
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <nova:port uuid="131dda89-6e7d-4a88-9572-dcd63205fc02">
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:08:36 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <memory unit='KiB'>131072</memory>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <resource>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <partition>/machine</partition>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </resource>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <sysinfo type='smbios'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <entry name='serial'>9f1f7c19-2aa2-4663-81ca-a26eaa106441</entry>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <entry name='uuid'>9f1f7c19-2aa2-4663-81ca-a26eaa106441</entry>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <boot dev='hd'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <smbios mode='sysinfo'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <vmcoreinfo state='on'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <feature policy='require' name='x2apic'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <feature policy='require' name='vme'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <clock offset='utc'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <timer name='hpet' present='no'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <on_reboot>restart</on_reboot>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <on_crash>destroy</on_crash>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <disk type='network' device='disk'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk' index='2'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target dev='vda' bus='virtio'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='virtio-disk0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <disk type='network' device='cdrom'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config' index='1'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target dev='sda' bus='sata'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <readonly/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='sata0-0-0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pcie.0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='1' port='0x10'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='2' port='0x11'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='3' port='0x12'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.3'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='4' port='0x13'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.4'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='5' port='0x14'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.5'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='6' port='0x15'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.6'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='7' port='0x16'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.7'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='8' port='0x17'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.8'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='9' port='0x18'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.9'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='10' port='0x19'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.10'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='11' port='0x1a'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.11'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='12' port='0x1b'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.12'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='13' port='0x1c'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.13'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='14' port='0x1d'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.14'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='15' port='0x1e'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.15'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='16' port='0x1f'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.16'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='17' port='0x20'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.17'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='18' port='0x21'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.18'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='19' port='0x22'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.19'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='20' port='0x23'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.20'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='21' port='0x24'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.21'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='22' port='0x25'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.22'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='23' port='0x26'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.23'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='24' port='0x27'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.24'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target chassis='25' port='0x28'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.25'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model name='pcie-pci-bridge'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='pci.26'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='usb'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <controller type='sata' index='0'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='ide'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:83:fa:32'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target dev='tap1a414f6e-34'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='net0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:b3:47:2e'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target dev='tap131dda89-6e'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='net1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <serial type='pty'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/console.log' append='off'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target type='isa-serial' port='0'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:        <model name='isa-serial'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      </target>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/console.log' append='off'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <target type='serial' port='0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </console>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <input type='tablet' bus='usb'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='input0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <input type='mouse' bus='ps2'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='input1'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <input type='keyboard' bus='ps2'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='input2'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <listen type='address' address='::0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <audio id='1' type='none'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='video0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <watchdog model='itco' action='reset'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='watchdog0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </watchdog>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <memballoon model='virtio'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <stats period='10'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='balloon0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <rng model='virtio'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <alias name='rng0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <label>system_u:system_r:svirt_t:s0:c330,c536</label>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c330,c536</imagelabel>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <label>+107:+107</label>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:08:36 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:08:36 np0005603622 nova_compute[221301]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.969 221324 INFO nova.virt.libvirt.driver [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tap131dda89-6e from instance 9f1f7c19-2aa2-4663-81ca-a26eaa106441 from the persistent domain config.#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.970 221324 DEBUG nova.virt.libvirt.driver [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] (1/8): Attempting to detach device tap131dda89-6e with device alias net1 from instance 9f1f7c19-2aa2-4663-81ca-a26eaa106441 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:08:36 np0005603622 nova_compute[221301]: 2026-01-31 08:08:36.971 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] detach device xml: <interface type="ethernet">
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <mac address="fa:16:3e:b3:47:2e"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <model type="virtio"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <mtu size="1442"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]:  <target dev="tap131dda89-6e"/>
Jan 31 03:08:36 np0005603622 nova_compute[221301]: </interface>
Jan 31 03:08:36 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:08:37 np0005603622 kernel: tap131dda89-6e (unregistering): left promiscuous mode
Jan 31 03:08:37 np0005603622 NetworkManager[49080]: <info>  [1769846917.0919] device (tap131dda89-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:08:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:37Z|00246|binding|INFO|Releasing lport 131dda89-6e7d-4a88-9572-dcd63205fc02 from this chassis (sb_readonly=0)
Jan 31 03:08:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:37Z|00247|binding|INFO|Setting lport 131dda89-6e7d-4a88-9572-dcd63205fc02 down in Southbound
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.094 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:37Z|00248|binding|INFO|Removing iface tap131dda89-6e ovn-installed in OVS
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.096 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.101 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769846917.1010547, 9f1f7c19-2aa2-4663-81ca-a26eaa106441 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.103 221324 DEBUG nova.virt.libvirt.driver [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Start waiting for the detach event from libvirt for device tap131dda89-6e with device alias net1 for instance 9f1f7c19-2aa2-4663-81ca-a26eaa106441 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.103 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.108 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.108 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.108 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.108 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:47:2e 10.100.0.10'], port_security=['fa:16:3e:b3:47:2e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9f1f7c19-2aa2-4663-81ca-a26eaa106441', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1951371451', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'fdcf3a61-8bd1-47a3-8e6c-d6fed17d2331', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=131dda89-6e7d-4a88-9572-dcd63205fc02) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.109 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.109 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.111 221324 INFO nova.compute.manager [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Terminating instance#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.111 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 131dda89-6e7d-4a88-9572-dcd63205fc02 in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.112 221324 DEBUG nova.compute.manager [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.114 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455fab34-b015-4d97-a96d-f7ebd7f7555f#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.114 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b3:47:2e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap131dda89-6e"/></interface>not found in domain: <domain type='kvm' id='30'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <name>instance-00000048</name>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <uuid>9f1f7c19-2aa2-4663-81ca-a26eaa106441</uuid>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:name>tempest-tempest.common.compute-instance-576814234</nova:name>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:08:34</nova:creationTime>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:port uuid="1a414f6e-3462-4511-a204-1326ac6d6fcd">
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:port uuid="131dda89-6e7d-4a88-9572-dcd63205fc02">
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:08:37 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <memory unit='KiB'>131072</memory>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <vcpu placement='static'>1</vcpu>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <resource>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <partition>/machine</partition>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </resource>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <sysinfo type='smbios'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <entry name='manufacturer'>RDO</entry>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <entry name='product'>OpenStack Compute</entry>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <entry name='serial'>9f1f7c19-2aa2-4663-81ca-a26eaa106441</entry>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <entry name='uuid'>9f1f7c19-2aa2-4663-81ca-a26eaa106441</entry>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <entry name='family'>Virtual Machine</entry>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <boot dev='hd'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <smbios mode='sysinfo'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <vmcoreinfo state='on'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <cpu mode='custom' match='exact' check='full'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <model fallback='forbid'>Nehalem</model>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <feature policy='require' name='x2apic'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <feature policy='require' name='hypervisor'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <feature policy='require' name='vme'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <clock offset='utc'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <timer name='pit' tickpolicy='delay'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <timer name='hpet' present='no'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <on_poweroff>destroy</on_poweroff>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <on_reboot>restart</on_reboot>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <on_crash>destroy</on_crash>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <disk type='network' device='disk'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk' index='2'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target dev='vda' bus='virtio'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='virtio-disk0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <disk type='network' device='cdrom'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <driver name='qemu' type='raw' cache='none'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <auth username='openstack'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <secret type='ceph' uuid='2f5ab832-5f2e-5a84-bd93-cf8bab960ee2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <source protocol='rbd' name='vms/9f1f7c19-2aa2-4663-81ca-a26eaa106441_disk.config' index='1'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <host name='192.168.122.100' port='6789'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <host name='192.168.122.102' port='6789'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <host name='192.168.122.101' port='6789'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target dev='sda' bus='sata'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <readonly/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='sata0-0-0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='0' model='pcie-root'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pcie.0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='1' port='0x10'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.1'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='2' port='0x11'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='3' port='0x12'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.3'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='4' port='0x13'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.4'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='5' port='0x14'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.5'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='6' port='0x15'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.6'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='7' port='0x16'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.7'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='8' port='0x17'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.8'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='9' port='0x18'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.9'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='10' port='0x19'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.10'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='11' port='0x1a'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.11'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='12' port='0x1b'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.12'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='13' port='0x1c'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.13'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='14' port='0x1d'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.14'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='15' port='0x1e'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.15'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='16' port='0x1f'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.16'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='17' port='0x20'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.17'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='18' port='0x21'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.18'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='19' port='0x22'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.19'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='20' port='0x23'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.20'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='21' port='0x24'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.21'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='22' port='0x25'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.22'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='23' port='0x26'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.23'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='24' port='0x27'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.24'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-root-port'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target chassis='25' port='0x28'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.25'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model name='pcie-pci-bridge'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='pci.26'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='usb'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <controller type='sata' index='0'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='ide'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </controller>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <interface type='ethernet'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <mac address='fa:16:3e:83:fa:32'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target dev='tap1a414f6e-34'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model type='virtio'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <driver name='vhost' rx_queue_size='512'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <mtu size='1442'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='net0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <serial type='pty'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/console.log' append='off'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target type='isa-serial' port='0'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:        <model name='isa-serial'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      </target>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <console type='pty' tty='/dev/pts/0'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <source path='/dev/pts/0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <log file='/var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441/console.log' append='off'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <target type='serial' port='0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='serial0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </console>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <input type='tablet' bus='usb'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='input0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='usb' bus='0' port='1'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <input type='mouse' bus='ps2'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='input1'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <input type='keyboard' bus='ps2'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='input2'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </input>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <listen type='address' address='::0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </graphics>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <audio id='1' type='none'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <model type='virtio' heads='1' primary='yes'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='video0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <watchdog model='itco' action='reset'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='watchdog0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </watchdog>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <memballoon model='virtio'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <stats period='10'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='balloon0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <rng model='virtio'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <backend model='random'>/dev/urandom</backend>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <alias name='rng0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <label>system_u:system_r:svirt_t:s0:c330,c536</label>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c330,c536</imagelabel>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <label>+107:+107</label>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <imagelabel>+107:+107</imagelabel>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </seclabel>
Jan 31 03:08:37 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:08:37 np0005603622 nova_compute[221301]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.115 221324 INFO nova.virt.libvirt.driver [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully detached device tap131dda89-6e from instance 9f1f7c19-2aa2-4663-81ca-a26eaa106441 from the live domain config.
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.116 221324 DEBUG nova.virt.libvirt.vif [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.116 221324 DEBUG nova.network.os_vif_util [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.117 221324 DEBUG nova.network.os_vif_util [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.117 221324 DEBUG os_vif [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.120 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.121 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap131dda89-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.123 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.125 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.128 221324 INFO os_vif [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e')
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.129 221324 DEBUG nova.virt.libvirt.guest [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:name>tempest-tempest.common.compute-instance-576814234</nova:name>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:creationTime>2026-01-31 08:08:37</nova:creationTime>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:flavor name="m1.nano">
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:memory>128</nova:memory>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:disk>1</nova:disk>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:swap>0</nova:swap>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:vcpus>1</nova:vcpus>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </nova:flavor>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:owner>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:user uuid="60f2b878669c4c529b35e04860cc6d64">tempest-AttachInterfacesTestJSON-1920739502-project-member</nova:user>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:project uuid="0c73212dc7c84914b6c934d45b6826f7">tempest-AttachInterfacesTestJSON-1920739502</nova:project>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </nova:owner>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  <nova:ports>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    <nova:port uuid="1a414f6e-3462-4511-a204-1326ac6d6fcd">
Jan 31 03:08:37 np0005603622 nova_compute[221301]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:    </nova:port>
Jan 31 03:08:37 np0005603622 nova_compute[221301]:  </nova:ports>
Jan 31 03:08:37 np0005603622 nova_compute[221301]: </nova:instance>
Jan 31 03:08:37 np0005603622 nova_compute[221301]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.129 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ea522345-9a6d-423d-a6d4-d149734a6cb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.164 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0d80dcb6-aedb-4f36-a0a3-445eff3cf154]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.168 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb26341-aa5a-4799-a6a7-d69399cd5e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 kernel: tapbc46ca84-34 (unregistering): left promiscuous mode
Jan 31 03:08:37 np0005603622 NetworkManager[49080]: <info>  [1769846917.1829] device (tapbc46ca84-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:08:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:37Z|00249|binding|INFO|Releasing lport bc46ca84-347f-4b48-a09d-4742bd167dc7 from this chassis (sb_readonly=0)
Jan 31 03:08:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:37Z|00250|binding|INFO|Setting lport bc46ca84-347f-4b48-a09d-4742bd167dc7 down in Southbound
Jan 31 03:08:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:37Z|00251|binding|INFO|Removing iface tapbc46ca84-34 ovn-installed in OVS
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.190 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.199 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.201 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9d806e03-114a-4b3b-9137-c89bad5d73e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.206 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:49:ed 10.100.0.13'], port_security=['fa:16:3e:91:49:ed 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c131ca84-4259-49e0-a3d5-a4a5ca629d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09a26665-5963-47b2-a3e4-106214bb83c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64a52f6edb6c4843a8decc313f8c10e3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '77da8387-ff87-4eef-a7dd-fe6d484eb09a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54f038a1-4edf-46aa-89a7-be5e1e9f7b39, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=bc46ca84-347f-4b48-a09d-4742bd167dc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.218 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[abe085ba-b7db-4675-9952-a25db02857fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455fab34-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8f:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613034, 'reachable_time': 34072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248497, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.235 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[918fc052-6e47-404a-9fa0-4587c39172d4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613046, 'tstamp': 613046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248498, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap455fab34-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613049, 'tstamp': 613049}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248498, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.237 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000049.scope: Deactivated successfully.
Jan 31 03:08:37 np0005603622 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000049.scope: Consumed 15.560s CPU time.
Jan 31 03:08:37 np0005603622 systemd-machined[190406]: Machine qemu-31-instance-00000049 terminated.
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.248 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.248 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455fab34-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.249 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.249 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455fab34-b0, col_values=(('external_ids', {'iface-id': 'b4a40811-3703-4da5-859c-3e041b7cfee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.250 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.251 139793 INFO neutron.agent.ovn.metadata.agent [-] Port bc46ca84-347f-4b48-a09d-4742bd167dc7 in datapath 09a26665-5963-47b2-a3e4-106214bb83c4 unbound from our chassis#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.254 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09a26665-5963-47b2-a3e4-106214bb83c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.256 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[beb38e0e-8d33-49ee-92f6-898014d14c69]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.256 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4 namespace which is not needed anymore#033[00m
Jan 31 03:08:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:37.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.364 221324 INFO nova.virt.libvirt.driver [-] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Instance destroyed successfully.#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.364 221324 DEBUG nova.objects.instance [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lazy-loading 'resources' on Instance uuid c131ca84-4259-49e0-a3d5-a4a5ca629d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:37 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [NOTICE]   (248144) : haproxy version is 2.8.14-c23fe91
Jan 31 03:08:37 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [NOTICE]   (248144) : path to executable is /usr/sbin/haproxy
Jan 31 03:08:37 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [WARNING]  (248144) : Exiting Master process...
Jan 31 03:08:37 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [WARNING]  (248144) : Exiting Master process...
Jan 31 03:08:37 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [ALERT]    (248144) : Current worker (248146) exited with code 143 (Terminated)
Jan 31 03:08:37 np0005603622 neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4[248134]: [WARNING]  (248144) : All workers exited. Exiting... (0)
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.387 221324 DEBUG nova.virt.libvirt.vif [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-239311118',display_name='tempest-AttachInterfacesUnderV243Test-server-239311118',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-239311118',id=73,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKuImM2D3fNGJMsdqkErwFBaEJ04l9kGVA/cSUA+JbgAY3g1q1pguY3j8BOfvSSgfzBZsiL0lJL3gsjWJYvs7KdxSwVg5XZVPK7b50FIczK25ThQp4VMPwH5/FpuX6vz9A==',key_name='tempest-keypair-621715770',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='64a52f6edb6c4843a8decc313f8c10e3',ramdisk_id='',reservation_id='r-92w1prj7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-902546950',owner_user_name='tempest-AttachInterfacesUnderV243Test-902546950-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='31a158d811524ad292199b931be12bc7',uuid=c131ca84-4259-49e0-a3d5-a4a5ca629d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.388 221324 DEBUG nova.network.os_vif_util [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Converting VIF {"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.389 221324 DEBUG nova.network.os_vif_util [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:49:ed,bridge_name='br-int',has_traffic_filtering=True,id=bc46ca84-347f-4b48-a09d-4742bd167dc7,network=Network(09a26665-5963-47b2-a3e4-106214bb83c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc46ca84-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.389 221324 DEBUG os_vif [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:49:ed,bridge_name='br-int',has_traffic_filtering=True,id=bc46ca84-347f-4b48-a09d-4742bd167dc7,network=Network(09a26665-5963-47b2-a3e4-106214bb83c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc46ca84-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:08:37 np0005603622 systemd[1]: libpod-93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f.scope: Deactivated successfully.
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.391 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.391 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc46ca84-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.395 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.397 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.397 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 podman[248517]: 2026-01-31 08:08:37.398346411 +0000 UTC m=+0.057270151 container died 93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.400 221324 INFO os_vif [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:49:ed,bridge_name='br-int',has_traffic_filtering=True,id=bc46ca84-347f-4b48-a09d-4742bd167dc7,network=Network(09a26665-5963-47b2-a3e4-106214bb83c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc46ca84-34')#033[00m
Jan 31 03:08:37 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f-userdata-shm.mount: Deactivated successfully.
Jan 31 03:08:37 np0005603622 systemd[1]: var-lib-containers-storage-overlay-9159851c6652a0b816b18874c618dc0e8c84ebaada4989958976a186ebdc63f4-merged.mount: Deactivated successfully.
Jan 31 03:08:37 np0005603622 podman[248517]: 2026-01-31 08:08:37.442883499 +0000 UTC m=+0.101807199 container cleanup 93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:08:37 np0005603622 systemd[1]: libpod-conmon-93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f.scope: Deactivated successfully.
Jan 31 03:08:37 np0005603622 podman[248580]: 2026-01-31 08:08:37.520904267 +0000 UTC m=+0.057071895 container remove 93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.526 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[07d05982-d0ef-445b-a4da-c23a6c1ec130]: (4, ('Sat Jan 31 08:08:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4 (93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f)\n93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f\nSat Jan 31 08:08:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4 (93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f)\n93d863615bb728e943fc3f1dff64c9ec31df7bd872cc955b26b1ebcc60e9231f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.530 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e0ff2ba9-bc65-4908-a101-f006fcba36fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.532 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09a26665-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.534 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 kernel: tap09a26665-50: left promiscuous mode
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.544 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.548 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e49066-ae85-4136-84d0-5b55a21f6d2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.567 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4798ab33-3b6c-42ff-a366-b21800c4d01a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.568 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c84cf32e-394e-4fcb-add9-d4e6bf868a63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.587 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[48a63813-a9bc-4050-9cbd-809141a4dd84]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613111, 'reachable_time': 34344, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248594, 'error': None, 'target': 'ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 systemd[1]: run-netns-ovnmeta\x2d09a26665\x2d5963\x2d47b2\x2da3e4\x2d106214bb83c4.mount: Deactivated successfully.
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.591 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09a26665-5963-47b2-a3e4-106214bb83c4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:08:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:37.592 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1a7d98-7b79-47f6-9453-edb6bf2d4172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:37.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.825 221324 DEBUG nova.compute.manager [req-7d4d6b2d-72e5-4c35-88ed-102a0b8c6ebf req-fbf74320-2be9-4763-9edc-ea7b70dce531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-unplugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.827 221324 DEBUG oslo_concurrency.lockutils [req-7d4d6b2d-72e5-4c35-88ed-102a0b8c6ebf req-fbf74320-2be9-4763-9edc-ea7b70dce531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.827 221324 DEBUG oslo_concurrency.lockutils [req-7d4d6b2d-72e5-4c35-88ed-102a0b8c6ebf req-fbf74320-2be9-4763-9edc-ea7b70dce531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.827 221324 DEBUG oslo_concurrency.lockutils [req-7d4d6b2d-72e5-4c35-88ed-102a0b8c6ebf req-fbf74320-2be9-4763-9edc-ea7b70dce531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.827 221324 DEBUG nova.compute.manager [req-7d4d6b2d-72e5-4c35-88ed-102a0b8c6ebf req-fbf74320-2be9-4763-9edc-ea7b70dce531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] No waiting events found dispatching network-vif-unplugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.828 221324 WARNING nova.compute.manager [req-7d4d6b2d-72e5-4c35-88ed-102a0b8c6ebf req-fbf74320-2be9-4763-9edc-ea7b70dce531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received unexpected event network-vif-unplugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:37 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.998 221324 INFO nova.virt.libvirt.driver [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Deleting instance files /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31_del#033[00m
Jan 31 03:08:38 np0005603622 nova_compute[221301]: 2026-01-31 08:08:37.999 221324 INFO nova.virt.libvirt.driver [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Deletion of /var/lib/nova/instances/c131ca84-4259-49e0-a3d5-a4a5ca629d31_del complete#033[00m
Jan 31 03:08:38 np0005603622 nova_compute[221301]: 2026-01-31 08:08:38.134 221324 INFO nova.compute.manager [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Took 1.02 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:08:38 np0005603622 nova_compute[221301]: 2026-01-31 08:08:38.135 221324 DEBUG oslo.service.loopingcall [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:08:38 np0005603622 nova_compute[221301]: 2026-01-31 08:08:38.136 221324 DEBUG nova.compute.manager [-] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:08:38 np0005603622 nova_compute[221301]: 2026-01-31 08:08:38.136 221324 DEBUG nova.network.neutron [-] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:08:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:08:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:39.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:08:39 np0005603622 nova_compute[221301]: 2026-01-31 08:08:39.382 221324 DEBUG nova.network.neutron [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updated VIF entry in instance network info cache for port bc46ca84-347f-4b48-a09d-4742bd167dc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:08:39 np0005603622 nova_compute[221301]: 2026-01-31 08:08:39.382 221324 DEBUG nova.network.neutron [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [{"id": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "address": "fa:16:3e:91:49:ed", "network": {"id": "09a26665-5963-47b2-a3e4-106214bb83c4", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2063088727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64a52f6edb6c4843a8decc313f8c10e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc46ca84-34", "ovs_interfaceid": "bc46ca84-347f-4b48-a09d-4742bd167dc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:39 np0005603622 nova_compute[221301]: 2026-01-31 08:08:39.459 221324 DEBUG oslo_concurrency.lockutils [req-d73afae5-af7f-4e39-9125-efeb0dcb75a1 req-e19f1d87-9750-4f1e-8ecd-7bb77f723591 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c131ca84-4259-49e0-a3d5-a4a5ca629d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:39 np0005603622 nova_compute[221301]: 2026-01-31 08:08:39.467 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:08:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:39.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.089 221324 DEBUG nova.network.neutron [-] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.143 221324 DEBUG nova.compute.manager [req-cb41eaf7-4c93-463b-bd53-6c5ec702563b req-a2e59ea9-1f6d-4998-ac3c-0812d74864f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-vif-deleted-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.144 221324 INFO nova.compute.manager [req-cb41eaf7-4c93-463b-bd53-6c5ec702563b req-a2e59ea9-1f6d-4998-ac3c-0812d74864f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Neutron deleted interface bc46ca84-347f-4b48-a09d-4742bd167dc7; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.144 221324 DEBUG nova.network.neutron [req-cb41eaf7-4c93-463b-bd53-6c5ec702563b req-a2e59ea9-1f6d-4998-ac3c-0812d74864f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.150 221324 INFO nova.compute.manager [-] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Took 2.01 seconds to deallocate network for instance.#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.199 221324 DEBUG nova.compute.manager [req-cb41eaf7-4c93-463b-bd53-6c5ec702563b req-a2e59ea9-1f6d-4998-ac3c-0812d74864f8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Detach interface failed, port_id=bc46ca84-347f-4b48-a09d-4742bd167dc7, reason: Instance c131ca84-4259-49e0-a3d5-a4a5ca629d31 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.239 221324 DEBUG nova.compute.manager [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.240 221324 DEBUG oslo_concurrency.lockutils [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.240 221324 DEBUG oslo_concurrency.lockutils [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.240 221324 DEBUG oslo_concurrency.lockutils [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.241 221324 DEBUG nova.compute.manager [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] No waiting events found dispatching network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.241 221324 WARNING nova.compute.manager [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received unexpected event network-vif-plugged-131dda89-6e7d-4a88-9572-dcd63205fc02 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.242 221324 DEBUG nova.compute.manager [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received event network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.242 221324 DEBUG oslo_concurrency.lockutils [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.242 221324 DEBUG oslo_concurrency.lockutils [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.243 221324 DEBUG oslo_concurrency.lockutils [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.243 221324 DEBUG nova.compute.manager [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] No waiting events found dispatching network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.244 221324 WARNING nova.compute.manager [req-88e43e54-f804-4dcc-9b42-c68b0d6c49ed req-e52b4201-3938-4ad5-8874-a0c923a6471f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Received unexpected event network-vif-plugged-bc46ca84-347f-4b48-a09d-4742bd167dc7 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.247 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.247 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.337 221324 DEBUG oslo_concurrency.processutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.470 221324 DEBUG oslo_concurrency.lockutils [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.553 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.594 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.595 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.595 221324 DEBUG oslo_concurrency.lockutils [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquired lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.595 221324 DEBUG nova.network.neutron [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:08:40 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:40Z|00252|binding|INFO|Releasing lport b4a40811-3703-4da5-859c-3e041b7cfee4 from this chassis (sb_readonly=0)
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.720 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3711067702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.763 221324 DEBUG oslo_concurrency.processutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.770 221324 DEBUG nova.compute.provider_tree [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.815 221324 DEBUG nova.scheduler.client.report [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.871 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:40 np0005603622 nova_compute[221301]: 2026-01-31 08:08:40.916 221324 INFO nova.scheduler.client.report [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Deleted allocations for instance c131ca84-4259-49e0-a3d5-a4a5ca629d31#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.033 221324 DEBUG oslo_concurrency.lockutils [None req-1f2acaa6-e40b-4541-956a-4b75ef376a53 31a158d811524ad292199b931be12bc7 64a52f6edb6c4843a8decc313f8c10e3 - - default default] Lock "c131ca84-4259-49e0-a3d5-a4a5ca629d31" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.173 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.174 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.174 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.175 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.175 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.177 221324 INFO nova.compute.manager [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Terminating instance#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.178 221324 DEBUG nova.compute.manager [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:08:41 np0005603622 kernel: tap1a414f6e-34 (unregistering): left promiscuous mode
Jan 31 03:08:41 np0005603622 NetworkManager[49080]: <info>  [1769846921.2496] device (tap1a414f6e-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.249 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.261 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:41Z|00253|binding|INFO|Releasing lport 1a414f6e-3462-4511-a204-1326ac6d6fcd from this chassis (sb_readonly=0)
Jan 31 03:08:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:41Z|00254|binding|INFO|Setting lport 1a414f6e-3462-4511-a204-1326ac6d6fcd down in Southbound
Jan 31 03:08:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:08:41Z|00255|binding|INFO|Removing iface tap1a414f6e-34 ovn-installed in OVS
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.264 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:41.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:41 np0005603622 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 31 03:08:41 np0005603622 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000048.scope: Consumed 16.149s CPU time.
Jan 31 03:08:41 np0005603622 systemd-machined[190406]: Machine qemu-30-instance-00000048 terminated.
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.351 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:fa:32 10.100.0.12'], port_security=['fa:16:3e:83:fa:32 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9f1f7c19-2aa2-4663-81ca-a26eaa106441', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c73212dc7c84914b6c934d45b6826f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c826f71-7560-44f4-8034-5ac735f4e81f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8030b63f-5501-4734-a04c-133b7c767454, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=1a414f6e-3462-4511-a204-1326ac6d6fcd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.353 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 1a414f6e-3462-4511-a204-1326ac6d6fcd in datapath 455fab34-b015-4d97-a96d-f7ebd7f7555f unbound from our chassis#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.355 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455fab34-b015-4d97-a96d-f7ebd7f7555f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.356 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d38907ef-6c9c-455c-8e30-03a5cc24b1d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.357 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f namespace which is not needed anymore#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.419 221324 INFO nova.virt.libvirt.driver [-] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Instance destroyed successfully.#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.420 221324 DEBUG nova.objects.instance [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lazy-loading 'resources' on Instance uuid 9f1f7c19-2aa2-4663-81ca-a26eaa106441 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.454 221324 DEBUG nova.virt.libvirt.vif [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.455 221324 DEBUG nova.network.os_vif_util [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.457 221324 DEBUG nova.network.os_vif_util [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=1a414f6e-3462-4511-a204-1326ac6d6fcd,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a414f6e-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.458 221324 DEBUG os_vif [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=1a414f6e-3462-4511-a204-1326ac6d6fcd,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a414f6e-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.462 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a414f6e-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.468 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.470 221324 INFO os_vif [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=1a414f6e-3462-4511-a204-1326ac6d6fcd,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a414f6e-34')#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.472 221324 DEBUG nova.virt.libvirt.vif [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-576814234',display_name='tempest-tempest.common.compute-instance-576814234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-576814234',id=72,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPZmLk+NGh2zKbql/sBzP6qM4W9cXGD3OUJAhT/207QiFni858RIgrXDyBBR0Tlv+t9A7ybvSMg5e6CDTEEkg6g7w68asAv+N4fL3AAeDAWcmo04YGYMANL/8swEdfyv1w==',key_name='tempest-keypair-1664347335',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:08:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c73212dc7c84914b6c934d45b6826f7',ramdisk_id='',reservation_id='r-ypbj0lf2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1920739502',owner_user_name='tempest-AttachInterfacesTestJSON-1920739502-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:08:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='60f2b878669c4c529b35e04860cc6d64',uuid=9f1f7c19-2aa2-4663-81ca-a26eaa106441,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.472 221324 DEBUG nova.network.os_vif_util [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converting VIF {"id": "131dda89-6e7d-4a88-9572-dcd63205fc02", "address": "fa:16:3e:b3:47:2e", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap131dda89-6e", "ovs_interfaceid": "131dda89-6e7d-4a88-9572-dcd63205fc02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.473 221324 DEBUG nova.network.os_vif_util [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.474 221324 DEBUG os_vif [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.475 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.476 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap131dda89-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.476 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.479 221324 INFO os_vif [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:47:2e,bridge_name='br-int',has_traffic_filtering=True,id=131dda89-6e7d-4a88-9572-dcd63205fc02,network=Network(455fab34-b015-4d97-a96d-f7ebd7f7555f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap131dda89-6e')#033[00m
Jan 31 03:08:41 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[247973]: [NOTICE]   (247977) : haproxy version is 2.8.14-c23fe91
Jan 31 03:08:41 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[247973]: [NOTICE]   (247977) : path to executable is /usr/sbin/haproxy
Jan 31 03:08:41 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[247973]: [WARNING]  (247977) : Exiting Master process...
Jan 31 03:08:41 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[247973]: [ALERT]    (247977) : Current worker (247979) exited with code 143 (Terminated)
Jan 31 03:08:41 np0005603622 neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f[247973]: [WARNING]  (247977) : All workers exited. Exiting... (0)
Jan 31 03:08:41 np0005603622 systemd[1]: libpod-d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522.scope: Deactivated successfully.
Jan 31 03:08:41 np0005603622 conmon[247973]: conmon d18da0c5f2ee8eb0c02d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522.scope/container/memory.events
Jan 31 03:08:41 np0005603622 podman[248654]: 2026-01-31 08:08:41.525957539 +0000 UTC m=+0.054439765 container died d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:08:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522-userdata-shm.mount: Deactivated successfully.
Jan 31 03:08:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay-88c11a727124ce99cca09e19c73e5a6a086065a4bc317065dd5e2e772e15ffa3-merged.mount: Deactivated successfully.
Jan 31 03:08:41 np0005603622 podman[248654]: 2026-01-31 08:08:41.570530268 +0000 UTC m=+0.099012494 container cleanup d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:08:41 np0005603622 systemd[1]: libpod-conmon-d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522.scope: Deactivated successfully.
Jan 31 03:08:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:41.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:41 np0005603622 podman[248701]: 2026-01-31 08:08:41.639071321 +0000 UTC m=+0.050485288 container remove d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.643 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[00dc22e9-6a12-44a9-a729-ed51e681e4f2]: (4, ('Sat Jan 31 08:08:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522)\nd18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522\nSat Jan 31 08:08:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f (d18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522)\nd18da0c5f2ee8eb0c02dae792b609fccbf6f4487a41d4cba043ecad67d5a1522\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.646 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a62c41e5-7a4f-4ef9-b859-4e47e04b58c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.647 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455fab34-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.649 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.656 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 kernel: tap455fab34-b0: left promiscuous mode
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.658 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.664 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e89e36-610b-4e72-8f33-d76c0489343d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.685 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ca9cf4-dd62-4f66-87ed-63b9b720d6a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.686 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[33839603-e424-4d26-baaf-c18a2cc3a72b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.701 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ff3f99-bd5a-4ba7-9761-0473d90ec69e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613026, 'reachable_time': 19412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248716, 'error': None, 'target': 'ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.703 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-455fab34-b015-4d97-a96d-f7ebd7f7555f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:08:41 np0005603622 systemd[1]: run-netns-ovnmeta\x2d455fab34\x2db015\x2d4d97\x2da96d\x2df7ebd7f7555f.mount: Deactivated successfully.
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.703 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[3f11b922-77eb-4ac9-892b-00c30b00c20a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:08:41 np0005603622 nova_compute[221301]: 2026-01-31 08:08:41.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.772 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:08:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:41.773 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.167 221324 INFO nova.virt.libvirt.driver [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Deleting instance files /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441_del#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.169 221324 INFO nova.virt.libvirt.driver [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Deletion of /var/lib/nova/instances/9f1f7c19-2aa2-4663-81ca-a26eaa106441_del complete#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.226 221324 INFO nova.compute.manager [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Took 1.05 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.227 221324 DEBUG oslo.service.loopingcall [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.227 221324 DEBUG nova.compute.manager [-] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.228 221324 DEBUG nova.network.neutron [-] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.335 221324 DEBUG nova.compute.manager [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-unplugged-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.335 221324 DEBUG oslo_concurrency.lockutils [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.336 221324 DEBUG oslo_concurrency.lockutils [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.336 221324 DEBUG oslo_concurrency.lockutils [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.337 221324 DEBUG nova.compute.manager [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] No waiting events found dispatching network-vif-unplugged-1a414f6e-3462-4511-a204-1326ac6d6fcd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.337 221324 DEBUG nova.compute.manager [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-unplugged-1a414f6e-3462-4511-a204-1326ac6d6fcd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.338 221324 DEBUG nova.compute.manager [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.338 221324 DEBUG oslo_concurrency.lockutils [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.338 221324 DEBUG oslo_concurrency.lockutils [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.339 221324 DEBUG oslo_concurrency.lockutils [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.339 221324 DEBUG nova.compute.manager [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] No waiting events found dispatching network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.340 221324 WARNING nova.compute.manager [req-1969b4da-9890-4a1e-997b-400171d56cf7 req-a018a224-1eb1-438a-a111-ff02c353655c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received unexpected event network-vif-plugged-1a414f6e-3462-4511-a204-1326ac6d6fcd for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.383 221324 INFO nova.network.neutron [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Port 131dda89-6e7d-4a88-9572-dcd63205fc02 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.384 221324 DEBUG nova.network.neutron [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [{"id": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "address": "fa:16:3e:83:fa:32", "network": {"id": "455fab34-b015-4d97-a96d-f7ebd7f7555f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-76911663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c73212dc7c84914b6c934d45b6826f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a414f6e-34", "ovs_interfaceid": "1a414f6e-3462-4511-a204-1326ac6d6fcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.419 221324 DEBUG oslo_concurrency.lockutils [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Releasing lock "refresh_cache-9f1f7c19-2aa2-4663-81ca-a26eaa106441" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:08:42 np0005603622 nova_compute[221301]: 2026-01-31 08:08:42.459 221324 DEBUG oslo_concurrency.lockutils [None req-446be8ff-4c35-4055-92ea-af4a4a239028 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "interface-9f1f7c19-2aa2-4663-81ca-a26eaa106441-131dda89-6e7d-4a88-9572-dcd63205fc02" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:43.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:43.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:44 np0005603622 nova_compute[221301]: 2026-01-31 08:08:44.470 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:44 np0005603622 nova_compute[221301]: 2026-01-31 08:08:44.593 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:08:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:45.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:45 np0005603622 nova_compute[221301]: 2026-01-31 08:08:45.407 221324 DEBUG nova.network.neutron [-] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:08:45 np0005603622 nova_compute[221301]: 2026-01-31 08:08:45.454 221324 INFO nova.compute.manager [-] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Took 3.23 seconds to deallocate network for instance.#033[00m
Jan 31 03:08:45 np0005603622 nova_compute[221301]: 2026-01-31 08:08:45.517 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:08:45 np0005603622 nova_compute[221301]: 2026-01-31 08:08:45.518 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:08:45 np0005603622 nova_compute[221301]: 2026-01-31 08:08:45.551 221324 DEBUG nova.compute.manager [req-ace68bd4-9c68-454c-ab77-5465b20022cd req-74d8e408-60bc-4b3d-9f6f-8381fac4d0d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Received event network-vif-deleted-1a414f6e-3462-4511-a204-1326ac6d6fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:08:45 np0005603622 nova_compute[221301]: 2026-01-31 08:08:45.560 221324 DEBUG oslo_concurrency.processutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:08:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:45.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:08:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2690538863' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.022 221324 DEBUG oslo_concurrency.processutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.029 221324 DEBUG nova.compute.provider_tree [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.091 221324 DEBUG nova.scheduler.client.report [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.237 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.452 221324 INFO nova.scheduler.client.report [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Deleted allocations for instance 9f1f7c19-2aa2-4663-81ca-a26eaa106441#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.454 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:46 np0005603622 nova_compute[221301]: 2026-01-31 08:08:46.649 221324 DEBUG oslo_concurrency.lockutils [None req-4f2d560d-e362-4318-b8a6-10901a9eef20 60f2b878669c4c529b35e04860cc6d64 0c73212dc7c84914b6c934d45b6826f7 - - default default] Lock "9f1f7c19-2aa2-4663-81ca-a26eaa106441" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:08:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:08:46.774 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:08:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:47.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:47.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:49.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:49 np0005603622 nova_compute[221301]: 2026-01-31 08:08:49.472 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:49.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:51.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:51 np0005603622 nova_compute[221301]: 2026-01-31 08:08:51.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:51.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:52 np0005603622 podman[248741]: 2026-01-31 08:08:52.1891058 +0000 UTC m=+0.102889279 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:08:52 np0005603622 nova_compute[221301]: 2026-01-31 08:08:52.360 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846917.3595057, c131ca84-4259-49e0-a3d5-a4a5ca629d31 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:52 np0005603622 nova_compute[221301]: 2026-01-31 08:08:52.362 221324 INFO nova.compute.manager [-] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:08:52 np0005603622 nova_compute[221301]: 2026-01-31 08:08:52.391 221324 DEBUG nova.compute.manager [None req-1aa1cfc9-969b-46b9-a859-53df00d0a0d3 - - - - - -] [instance: c131ca84-4259-49e0-a3d5-a4a5ca629d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:53.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:53.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:54 np0005603622 podman[248768]: 2026-01-31 08:08:54.130244453 +0000 UTC m=+0.050973362 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 31 03:08:54 np0005603622 nova_compute[221301]: 2026-01-31 08:08:54.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:08:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:55.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:08:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:55.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:08:56 np0005603622 nova_compute[221301]: 2026-01-31 08:08:56.417 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846921.41589, 9f1f7c19-2aa2-4663-81ca-a26eaa106441 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:08:56 np0005603622 nova_compute[221301]: 2026-01-31 08:08:56.418 221324 INFO nova.compute.manager [-] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:08:56 np0005603622 nova_compute[221301]: 2026-01-31 08:08:56.449 221324 DEBUG nova.compute.manager [None req-452e044a-0ef5-4dfc-ad56-ebe8a111eeda - - - - - -] [instance: 9f1f7c19-2aa2-4663-81ca-a26eaa106441] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:08:56 np0005603622 nova_compute[221301]: 2026-01-31 08:08:56.522 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:57.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:57.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:08:59.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:08:59 np0005603622 nova_compute[221301]: 2026-01-31 08:08:59.520 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:08:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:08:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:08:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:08:59.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.251 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.252 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.272 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.358 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.359 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.370 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.371 221324 INFO nova.compute.claims [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:09:00 np0005603622 nova_compute[221301]: 2026-01-31 08:09:00.500 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:01.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/113449352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.385 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.885s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.396 221324 DEBUG nova.compute.provider_tree [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.426 221324 DEBUG nova.scheduler.client.report [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.486 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.488 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.569 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.570 221324 DEBUG nova.network.neutron [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.607 221324 INFO nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:09:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:01.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.657 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.897 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.899 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:09:01 np0005603622 nova_compute[221301]: 2026-01-31 08:09:01.900 221324 INFO nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Creating image(s)#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.839 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.873 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.907 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.911 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.939 221324 DEBUG nova.policy [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6ec2f1b397404af09d4aba95271759dd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd99975994ff45c99cc41da24976e491', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.993 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.993 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.994 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:02 np0005603622 nova_compute[221301]: 2026-01-31 08:09:02.995 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:03 np0005603622 nova_compute[221301]: 2026-01-31 08:09:03.030 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:03 np0005603622 nova_compute[221301]: 2026-01-31 08:09:03.035 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:03.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:03.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:04 np0005603622 nova_compute[221301]: 2026-01-31 08:09:04.496 221324 DEBUG nova.network.neutron [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Successfully created port: 8b68efaf-3bec-4475-88f7-dd2b3e15802c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:09:04 np0005603622 nova_compute[221301]: 2026-01-31 08:09:04.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:04 np0005603622 nova_compute[221301]: 2026-01-31 08:09:04.711 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.677s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:04 np0005603622 nova_compute[221301]: 2026-01-31 08:09:04.807 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] resizing rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.193 221324 DEBUG nova.objects.instance [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lazy-loading 'migration_context' on Instance uuid ebe0c91b-fcdb-446a-9991-a40196d98c4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.263 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.263 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Ensure instance console log exists: /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.264 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.265 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.265 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:05.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:05.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.882 221324 DEBUG nova.network.neutron [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Successfully updated port: 8b68efaf-3bec-4475-88f7-dd2b3e15802c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.916 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "refresh_cache-ebe0c91b-fcdb-446a-9991-a40196d98c4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.917 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquired lock "refresh_cache-ebe0c91b-fcdb-446a-9991-a40196d98c4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:05 np0005603622 nova_compute[221301]: 2026-01-31 08:09:05.917 221324 DEBUG nova.network.neutron [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:09:06 np0005603622 nova_compute[221301]: 2026-01-31 08:09:06.059 221324 DEBUG nova.compute.manager [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Received event network-changed-8b68efaf-3bec-4475-88f7-dd2b3e15802c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:06 np0005603622 nova_compute[221301]: 2026-01-31 08:09:06.060 221324 DEBUG nova.compute.manager [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Refreshing instance network info cache due to event network-changed-8b68efaf-3bec-4475-88f7-dd2b3e15802c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:09:06 np0005603622 nova_compute[221301]: 2026-01-31 08:09:06.060 221324 DEBUG oslo_concurrency.lockutils [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-ebe0c91b-fcdb-446a-9991-a40196d98c4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:06 np0005603622 nova_compute[221301]: 2026-01-31 08:09:06.524 221324 DEBUG nova.network.neutron [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:09:06 np0005603622 nova_compute[221301]: 2026-01-31 08:09:06.531 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:07.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.476 221324 DEBUG nova.network.neutron [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Updating instance_info_cache with network_info: [{"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.591 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Releasing lock "refresh_cache-ebe0c91b-fcdb-446a-9991-a40196d98c4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.591 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Instance network_info: |[{"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.592 221324 DEBUG oslo_concurrency.lockutils [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-ebe0c91b-fcdb-446a-9991-a40196d98c4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.593 221324 DEBUG nova.network.neutron [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Refreshing network info cache for port 8b68efaf-3bec-4475-88f7-dd2b3e15802c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.598 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Start _get_guest_xml network_info=[{"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.604 221324 WARNING nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.609 221324 DEBUG nova.virt.libvirt.host [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.610 221324 DEBUG nova.virt.libvirt.host [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.614 221324 DEBUG nova.virt.libvirt.host [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.615 221324 DEBUG nova.virt.libvirt.host [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.618 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.618 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.619 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.619 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.620 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.620 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.621 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.621 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.622 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.623 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.623 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.623 221324 DEBUG nova.virt.hardware [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:09:07 np0005603622 nova_compute[221301]: 2026-01-31 08:09:07.628 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:07.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3306374796' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.153 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.185 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.190 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3355630577' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.696 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.699 221324 DEBUG nova.virt.libvirt.vif [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-2124760522',display_name='tempest-NoVNCConsoleTestJSON-server-2124760522',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-2124760522',id=77,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd99975994ff45c99cc41da24976e491',ramdisk_id='',reservation_id='r-x5y29v21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-2037008786',owner_user_name='tempest-NoVNCConsoleTestJSON-20
37008786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:01Z,user_data=None,user_id='6ec2f1b397404af09d4aba95271759dd',uuid=ebe0c91b-fcdb-446a-9991-a40196d98c4b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.699 221324 DEBUG nova.network.os_vif_util [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Converting VIF {"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.700 221324 DEBUG nova.network.os_vif_util [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b4:f2,bridge_name='br-int',has_traffic_filtering=True,id=8b68efaf-3bec-4475-88f7-dd2b3e15802c,network=Network(805a5162-0756-4738-89ca-694bfe9d313e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b68efaf-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.701 221324 DEBUG nova.objects.instance [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lazy-loading 'pci_devices' on Instance uuid ebe0c91b-fcdb-446a-9991-a40196d98c4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.718 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <uuid>ebe0c91b-fcdb-446a-9991-a40196d98c4b</uuid>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <name>instance-0000004d</name>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <nova:name>tempest-NoVNCConsoleTestJSON-server-2124760522</nova:name>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:09:07</nova:creationTime>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:user uuid="6ec2f1b397404af09d4aba95271759dd">tempest-NoVNCConsoleTestJSON-2037008786-project-member</nova:user>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:project uuid="dd99975994ff45c99cc41da24976e491">tempest-NoVNCConsoleTestJSON-2037008786</nova:project>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <nova:port uuid="8b68efaf-3bec-4475-88f7-dd2b3e15802c">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <entry name="serial">ebe0c91b-fcdb-446a-9991-a40196d98c4b</entry>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <entry name="uuid">ebe0c91b-fcdb-446a-9991-a40196d98c4b</entry>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk.config">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:6c:b4:f2"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <target dev="tap8b68efaf-3b"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/console.log" append="off"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:09:08 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:09:08 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:09:08 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:09:08 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.719 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Preparing to wait for external event network-vif-plugged-8b68efaf-3bec-4475-88f7-dd2b3e15802c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.720 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.720 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.720 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.721 221324 DEBUG nova.virt.libvirt.vif [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-2124760522',display_name='tempest-NoVNCConsoleTestJSON-server-2124760522',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-2124760522',id=77,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd99975994ff45c99cc41da24976e491',ramdisk_id='',reservation_id='r-x5y29v21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-2037008786',owner_user_name='tempest-NoVNCConsoleT
estJSON-2037008786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:01Z,user_data=None,user_id='6ec2f1b397404af09d4aba95271759dd',uuid=ebe0c91b-fcdb-446a-9991-a40196d98c4b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.721 221324 DEBUG nova.network.os_vif_util [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Converting VIF {"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.722 221324 DEBUG nova.network.os_vif_util [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b4:f2,bridge_name='br-int',has_traffic_filtering=True,id=8b68efaf-3bec-4475-88f7-dd2b3e15802c,network=Network(805a5162-0756-4738-89ca-694bfe9d313e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b68efaf-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.722 221324 DEBUG os_vif [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b4:f2,bridge_name='br-int',has_traffic_filtering=True,id=8b68efaf-3bec-4475-88f7-dd2b3e15802c,network=Network(805a5162-0756-4738-89ca-694bfe9d313e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b68efaf-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.723 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.723 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.725 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.728 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.729 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b68efaf-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.729 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8b68efaf-3b, col_values=(('external_ids', {'iface-id': '8b68efaf-3bec-4475-88f7-dd2b3e15802c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:b4:f2', 'vm-uuid': 'ebe0c91b-fcdb-446a-9991-a40196d98c4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:08 np0005603622 NetworkManager[49080]: <info>  [1769846948.7320] manager: (tap8b68efaf-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.737 221324 INFO os_vif [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b4:f2,bridge_name='br-int',has_traffic_filtering=True,id=8b68efaf-3bec-4475-88f7-dd2b3e15802c,network=Network(805a5162-0756-4738-89ca-694bfe9d313e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b68efaf-3b')#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.813 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.813 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.813 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] No VIF found with MAC fa:16:3e:6c:b4:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.814 221324 INFO nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Using config drive#033[00m
Jan 31 03:09:08 np0005603622 nova_compute[221301]: 2026-01-31 08:09:08.848 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:09.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:09 np0005603622 nova_compute[221301]: 2026-01-31 08:09:09.401 221324 INFO nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Creating config drive at /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/disk.config#033[00m
Jan 31 03:09:09 np0005603622 nova_compute[221301]: 2026-01-31 08:09:09.405 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8axpjkhj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:09 np0005603622 nova_compute[221301]: 2026-01-31 08:09:09.543 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8axpjkhj" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:09 np0005603622 nova_compute[221301]: 2026-01-31 08:09:09.593 221324 DEBUG nova.storage.rbd_utils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] rbd image ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:09 np0005603622 nova_compute[221301]: 2026-01-31 08:09:09.598 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/disk.config ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:09 np0005603622 nova_compute[221301]: 2026-01-31 08:09:09.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:09.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:10 np0005603622 nova_compute[221301]: 2026-01-31 08:09:10.414 221324 DEBUG nova.network.neutron [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Updated VIF entry in instance network info cache for port 8b68efaf-3bec-4475-88f7-dd2b3e15802c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:09:10 np0005603622 nova_compute[221301]: 2026-01-31 08:09:10.415 221324 DEBUG nova.network.neutron [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Updating instance_info_cache with network_info: [{"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:10 np0005603622 nova_compute[221301]: 2026-01-31 08:09:10.487 221324 DEBUG oslo_concurrency.lockutils [req-156d36d6-d885-47d9-9b36-b2bb56735b09 req-039b50ce-c396-4b47-b96f-02334927cadc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-ebe0c91b-fcdb-446a-9991-a40196d98c4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:11.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.400 221324 DEBUG oslo_concurrency.processutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/disk.config ebe0c91b-fcdb-446a-9991-a40196d98c4b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.802s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.401 221324 INFO nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Deleting local config drive /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b/disk.config because it was imported into RBD.#033[00m
Jan 31 03:09:11 np0005603622 kernel: tap8b68efaf-3b: entered promiscuous mode
Jan 31 03:09:11 np0005603622 NetworkManager[49080]: <info>  [1769846951.4492] manager: (tap8b68efaf-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Jan 31 03:09:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:11Z|00256|binding|INFO|Claiming lport 8b68efaf-3bec-4475-88f7-dd2b3e15802c for this chassis.
Jan 31 03:09:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:11Z|00257|binding|INFO|8b68efaf-3bec-4475-88f7-dd2b3e15802c: Claiming fa:16:3e:6c:b4:f2 10.100.0.10
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.449 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:11 np0005603622 systemd-machined[190406]: New machine qemu-32-instance-0000004d.
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:11Z|00258|binding|INFO|Setting lport 8b68efaf-3bec-4475-88f7-dd2b3e15802c ovn-installed in OVS
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.510 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:11 np0005603622 systemd[1]: Started Virtual Machine qemu-32-instance-0000004d.
Jan 31 03:09:11 np0005603622 systemd-udevd[249112]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:09:11 np0005603622 NetworkManager[49080]: <info>  [1769846951.5471] device (tap8b68efaf-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:09:11 np0005603622 NetworkManager[49080]: <info>  [1769846951.5477] device (tap8b68efaf-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:09:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:11Z|00259|binding|INFO|Setting lport 8b68efaf-3bec-4475-88f7-dd2b3e15802c up in Southbound
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.571 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:b4:f2 10.100.0.10'], port_security=['fa:16:3e:6c:b4:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ebe0c91b-fcdb-446a-9991-a40196d98c4b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-805a5162-0756-4738-89ca-694bfe9d313e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd99975994ff45c99cc41da24976e491', 'neutron:revision_number': '2', 'neutron:security_group_ids': '179c01fe-7e86-414d-8ce2-4d7c332b9d51', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f04d558-c668-4d85-a027-36bf6d6b63da, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=8b68efaf-3bec-4475-88f7-dd2b3e15802c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.574 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 8b68efaf-3bec-4475-88f7-dd2b3e15802c in datapath 805a5162-0756-4738-89ca-694bfe9d313e bound to our chassis#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.576 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 805a5162-0756-4738-89ca-694bfe9d313e#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.591 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee2166c-a46f-4236-941b-0250549037b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.593 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap805a5162-01 in ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.595 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap805a5162-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.595 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[128db03a-e9e0-4de7-9e13-c066e34ea444]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.597 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4add88-44b1-4abf-b9ea-00e271010b9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.617 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[57823755-aeb0-44ab-b1f1-f39375a3ff3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.639 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2efdf9da-6737-443e-b078-f9976ebcac06]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:11.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.668 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ebdb4016-c098-4eff-81b2-829d0861ae8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 NetworkManager[49080]: <info>  [1769846951.6915] manager: (tap805a5162-00): new Veth device (/org/freedesktop/NetworkManager/Devices/129)
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.694 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0fa70f2f-2604-41f5-9947-933192790b24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.728 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[455fffb2-dcb1-4be0-9bef-66d830a09137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.731 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e38a7bd9-4942-41c4-b549-213d5591f388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 NetworkManager[49080]: <info>  [1769846951.7549] device (tap805a5162-00): carrier: link connected
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.760 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c23a29bc-b4d4-4f50-ab35-75498dcc6853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.775 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5aeb90c0-f2e2-4dce-a880-393b8b432b04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap805a5162-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:2e:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620309, 'reachable_time': 17916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249145, 'error': None, 'target': 'ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.785 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[72bce8f4-a307-464e-a6cd-2cff1fd2e770]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:2e85'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 620309, 'tstamp': 620309}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249146, 'error': None, 'target': 'ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.804 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5dfb60-b03f-409a-80aa-e91bdd573941]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap805a5162-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:2e:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620309, 'reachable_time': 17916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249147, 'error': None, 'target': 'ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
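The RTM_NEWLINK/RTM_NEWADDR replies above are pyroute2-style netlink messages: each message is a dict whose 'attrs' key holds a list of [name, value] pairs. As a minimal illustrative sketch (the `get_attr` helper below is written from scratch and only mirrors pyroute2's convention; it is not the agent's actual code), individual attributes can be pulled out of such a reply like this:

```python
def get_attr(msg, name):
    """Return the first value stored under `name` in a pyroute2-style
    'attrs' list, or None if the attribute is absent."""
    for key, value in msg.get('attrs', []):
        if key == name:
            return value
    return None


# Trimmed-down copy of the RTM_NEWLINK reply logged above.
link = {
    'index': 2,
    'attrs': [
        ['IFLA_IFNAME', 'tap805a5162-01'],
        ['IFLA_MTU', 1500],
        ['IFLA_ADDRESS', 'fa:16:3e:c4:2e:85'],
        ['IFLA_OPERSTATE', 'UP'],
    ],
}

print(get_attr(link, 'IFLA_IFNAME'))   # tap805a5162-01
print(get_attr(link, 'IFLA_ADDRESS'))  # fa:16:3e:c4:2e:85
```

This is how the interface name, MAC, and operational state in these dumps map back to the tap device the metadata agent is wiring up.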
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.828 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[de6312af-7c5c-4b7f-88b9-771b5bfc3c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.882 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[332268ec-b779-4843-aeb7-16eae1aac379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.884 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap805a5162-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.884 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.885 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap805a5162-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:11 np0005603622 kernel: tap805a5162-00: entered promiscuous mode
Jan 31 03:09:11 np0005603622 NetworkManager[49080]: <info>  [1769846951.8890] manager: (tap805a5162-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.895 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap805a5162-00, col_values=(('external_ids', {'iface-id': '1f524be9-1f8b-4cfb-8ee2-08a411c7d7a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:09:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:11Z|00260|binding|INFO|Releasing lport 1f524be9-1f8b-4cfb-8ee2-08a411c7d7a8 from this chassis (sb_readonly=0)
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.896 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.900 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/805a5162-0756-4738-89ca-694bfe9d313e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/805a5162-0756-4738-89ca-694bfe9d313e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.901 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1540d1a4-0c52-4af2-935c-f4b323c27d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:11 np0005603622 nova_compute[221301]: 2026-01-31 08:09:11.903 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.903 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-805a5162-0756-4738-89ca-694bfe9d313e
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/805a5162-0756-4738-89ca-694bfe9d313e.pid.haproxy
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 805a5162-0756-4738-89ca-694bfe9d313e
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:09:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:11.904 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e', 'env', 'PROCESS_TAG=haproxy-805a5162-0756-4738-89ca-694bfe9d313e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/805a5162-0756-4738-89ca-694bfe9d313e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.233 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846952.2329478, ebe0c91b-fcdb-446a-9991-a40196d98c4b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.234 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:09:12 np0005603622 podman[249221]: 2026-01-31 08:09:12.289312148 +0000 UTC m=+0.058689699 container create 7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:09:12 np0005603622 systemd[1]: Started libpod-conmon-7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558.scope.
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.339 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.347 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846952.2368803, ebe0c91b-fcdb-446a-9991-a40196d98c4b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.348 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] VM Paused (Lifecycle Event)
Jan 31 03:09:12 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:09:12 np0005603622 podman[249221]: 2026-01-31 08:09:12.262998331 +0000 UTC m=+0.032375922 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:09:12 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7758e3a00ce72df4079a2f928e17eeccaed49d9c3ecc48b30730732a79cc3508/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:09:12 np0005603622 podman[249221]: 2026-01-31 08:09:12.376122843 +0000 UTC m=+0.145500464 container init 7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:09:12 np0005603622 podman[249221]: 2026-01-31 08:09:12.38345553 +0000 UTC m=+0.152833111 container start 7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:09:12 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [NOTICE]   (249240) : New worker (249242) forked
Jan 31 03:09:12 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [NOTICE]   (249240) : Loading success.
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.499 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.504 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:09:12 np0005603622 nova_compute[221301]: 2026-01-31 08:09:12.796 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:09:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:13.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:13.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:13 np0005603622 nova_compute[221301]: 2026-01-31 08:09:13.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.217 221324 DEBUG nova.compute.manager [req-a8017bb0-f055-4563-99a7-82848dec188a req-5c6012dd-5f92-40de-9cd4-b8322b76ff67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Received event network-vif-plugged-8b68efaf-3bec-4475-88f7-dd2b3e15802c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.218 221324 DEBUG oslo_concurrency.lockutils [req-a8017bb0-f055-4563-99a7-82848dec188a req-5c6012dd-5f92-40de-9cd4-b8322b76ff67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.218 221324 DEBUG oslo_concurrency.lockutils [req-a8017bb0-f055-4563-99a7-82848dec188a req-5c6012dd-5f92-40de-9cd4-b8322b76ff67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.218 221324 DEBUG oslo_concurrency.lockutils [req-a8017bb0-f055-4563-99a7-82848dec188a req-5c6012dd-5f92-40de-9cd4-b8322b76ff67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.219 221324 DEBUG nova.compute.manager [req-a8017bb0-f055-4563-99a7-82848dec188a req-5c6012dd-5f92-40de-9cd4-b8322b76ff67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Processing event network-vif-plugged-8b68efaf-3bec-4475-88f7-dd2b3e15802c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.219 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.224 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769846954.2245038, ebe0c91b-fcdb-446a-9991-a40196d98c4b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.225 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] VM Resumed (Lifecycle Event)
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.227 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.230 221324 INFO nova.virt.libvirt.driver [-] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Instance spawned successfully.
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.230 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.562 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.568 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.569 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.569 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.570 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.571 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.571 221324 DEBUG nova.virt.libvirt.driver [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.575 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.578 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.645 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.854 221324 INFO nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Took 12.96 seconds to spawn the instance on the hypervisor.
Jan 31 03:09:14 np0005603622 nova_compute[221301]: 2026-01-31 08:09:14.856 221324 DEBUG nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:09:15 np0005603622 nova_compute[221301]: 2026-01-31 08:09:15.059 221324 INFO nova.compute.manager [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Took 14.74 seconds to build instance.
Jan 31 03:09:15 np0005603622 nova_compute[221301]: 2026-01-31 08:09:15.229 221324 DEBUG oslo_concurrency.lockutils [None req-a90c8887-8477-4e5f-8519-5468a6db9a08 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.116003118s ======
Jan 31 03:09:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:15.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.116003118s
Jan 31 03:09:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:15.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:16 np0005603622 nova_compute[221301]: 2026-01-31 08:09:16.791 221324 DEBUG nova.compute.manager [req-13f4ee09-279a-4e96-a6c7-001a54cee764 req-6c5fa0e2-a580-4ec5-b06d-5493c8a7fcbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Received event network-vif-plugged-8b68efaf-3bec-4475-88f7-dd2b3e15802c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:09:16 np0005603622 nova_compute[221301]: 2026-01-31 08:09:16.791 221324 DEBUG oslo_concurrency.lockutils [req-13f4ee09-279a-4e96-a6c7-001a54cee764 req-6c5fa0e2-a580-4ec5-b06d-5493c8a7fcbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:16 np0005603622 nova_compute[221301]: 2026-01-31 08:09:16.792 221324 DEBUG oslo_concurrency.lockutils [req-13f4ee09-279a-4e96-a6c7-001a54cee764 req-6c5fa0e2-a580-4ec5-b06d-5493c8a7fcbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:16 np0005603622 nova_compute[221301]: 2026-01-31 08:09:16.792 221324 DEBUG oslo_concurrency.lockutils [req-13f4ee09-279a-4e96-a6c7-001a54cee764 req-6c5fa0e2-a580-4ec5-b06d-5493c8a7fcbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:16 np0005603622 nova_compute[221301]: 2026-01-31 08:09:16.792 221324 DEBUG nova.compute.manager [req-13f4ee09-279a-4e96-a6c7-001a54cee764 req-6c5fa0e2-a580-4ec5-b06d-5493c8a7fcbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] No waiting events found dispatching network-vif-plugged-8b68efaf-3bec-4475-88f7-dd2b3e15802c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:09:16 np0005603622 nova_compute[221301]: 2026-01-31 08:09:16.792 221324 WARNING nova.compute.manager [req-13f4ee09-279a-4e96-a6c7-001a54cee764 req-6c5fa0e2-a580-4ec5-b06d-5493c8a7fcbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Received unexpected event network-vif-plugged-8b68efaf-3bec-4475-88f7-dd2b3e15802c for instance with vm_state active and task_state None.
Jan 31 03:09:16 np0005603622 nova_compute[221301]: 2026-01-31 08:09:16.816 221324 DEBUG nova.compute.manager [None req-e8ce35b6-3913-42a8-9bb0-58f6531cb542 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Jan 31 03:09:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:09:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:17.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:09:17 np0005603622 nova_compute[221301]: 2026-01-31 08:09:17.658 221324 DEBUG nova.compute.manager [None req-6f175430-0093-47ee-858e-a1103b7a63f0 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Jan 31 03:09:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:17.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.151 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.152 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.152 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.152 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.153 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.154 221324 INFO nova.compute.manager [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Terminating instance
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.155 221324 DEBUG nova.compute.manager [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 03:09:18 np0005603622 kernel: tap8b68efaf-3b (unregistering): left promiscuous mode
Jan 31 03:09:18 np0005603622 NetworkManager[49080]: <info>  [1769846958.2157] device (tap8b68efaf-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:09:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:18Z|00261|binding|INFO|Releasing lport 8b68efaf-3bec-4475-88f7-dd2b3e15802c from this chassis (sb_readonly=0)
Jan 31 03:09:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:18Z|00262|binding|INFO|Setting lport 8b68efaf-3bec-4475-88f7-dd2b3e15802c down in Southbound
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.255 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:09:18Z|00263|binding|INFO|Removing iface tap8b68efaf-3b ovn-installed in OVS
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.258 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.262 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.271 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:b4:f2 10.100.0.10'], port_security=['fa:16:3e:6c:b4:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ebe0c91b-fcdb-446a-9991-a40196d98c4b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-805a5162-0756-4738-89ca-694bfe9d313e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd99975994ff45c99cc41da24976e491', 'neutron:revision_number': '4', 'neutron:security_group_ids': '179c01fe-7e86-414d-8ce2-4d7c332b9d51', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f04d558-c668-4d85-a027-36bf6d6b63da, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=8b68efaf-3bec-4475-88f7-dd2b3e15802c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.272 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 8b68efaf-3bec-4475-88f7-dd2b3e15802c in datapath 805a5162-0756-4738-89ca-694bfe9d313e unbound from our chassis
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.274 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 805a5162-0756-4738-89ca-694bfe9d313e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.276 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7e973ea4-943d-4ae2-9cc1-403a0a459f73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.277 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e namespace which is not needed anymore
Jan 31 03:09:18 np0005603622 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Jan 31 03:09:18 np0005603622 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004d.scope: Consumed 4.791s CPU time.
Jan 31 03:09:18 np0005603622 systemd-machined[190406]: Machine qemu-32-instance-0000004d terminated.
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.395 221324 INFO nova.virt.libvirt.driver [-] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Instance destroyed successfully.
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.395 221324 DEBUG nova.objects.instance [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lazy-loading 'resources' on Instance uuid ebe0c91b-fcdb-446a-9991-a40196d98c4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.435 221324 DEBUG nova.virt.libvirt.vif [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-2124760522',display_name='tempest-NoVNCConsoleTestJSON-server-2124760522',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-2124760522',id=77,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd99975994ff45c99cc41da24976e491',ramdisk_id='',reservation_id='r-x5y29v21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NoVNCConsoleTestJSON-2037008786',owner_user_name='tempest-NoVNCConsoleTestJSON-2037008786-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:09:14Z,user_data=None,user_id='6ec2f1b397404af09d4aba95271759dd',uuid=ebe0c91b-fcdb-446a-9991-a40196d98c4b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.436 221324 DEBUG nova.network.os_vif_util [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Converting VIF {"id": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "address": "fa:16:3e:6c:b4:f2", "network": {"id": "805a5162-0756-4738-89ca-694bfe9d313e", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-212791358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd99975994ff45c99cc41da24976e491", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b68efaf-3b", "ovs_interfaceid": "8b68efaf-3bec-4475-88f7-dd2b3e15802c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.437 221324 DEBUG nova.network.os_vif_util [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b4:f2,bridge_name='br-int',has_traffic_filtering=True,id=8b68efaf-3bec-4475-88f7-dd2b3e15802c,network=Network(805a5162-0756-4738-89ca-694bfe9d313e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b68efaf-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.438 221324 DEBUG os_vif [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b4:f2,bridge_name='br-int',has_traffic_filtering=True,id=8b68efaf-3bec-4475-88f7-dd2b3e15802c,network=Network(805a5162-0756-4738-89ca-694bfe9d313e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b68efaf-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.441 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b68efaf-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.445 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:09:18 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [NOTICE]   (249240) : haproxy version is 2.8.14-c23fe91
Jan 31 03:09:18 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [NOTICE]   (249240) : path to executable is /usr/sbin/haproxy
Jan 31 03:09:18 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [WARNING]  (249240) : Exiting Master process...
Jan 31 03:09:18 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [WARNING]  (249240) : Exiting Master process...
Jan 31 03:09:18 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [ALERT]    (249240) : Current worker (249242) exited with code 143 (Terminated)
Jan 31 03:09:18 np0005603622 neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e[249236]: [WARNING]  (249240) : All workers exited. Exiting... (0)
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.453 221324 INFO os_vif [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:b4:f2,bridge_name='br-int',has_traffic_filtering=True,id=8b68efaf-3bec-4475-88f7-dd2b3e15802c,network=Network(805a5162-0756-4738-89ca-694bfe9d313e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b68efaf-3b')
Jan 31 03:09:18 np0005603622 systemd[1]: libpod-7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558.scope: Deactivated successfully.
Jan 31 03:09:18 np0005603622 podman[249274]: 2026-01-31 08:09:18.4629963 +0000 UTC m=+0.083414344 container died 7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:09:18 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558-userdata-shm.mount: Deactivated successfully.
Jan 31 03:09:18 np0005603622 systemd[1]: var-lib-containers-storage-overlay-7758e3a00ce72df4079a2f928e17eeccaed49d9c3ecc48b30730732a79cc3508-merged.mount: Deactivated successfully.
Jan 31 03:09:18 np0005603622 podman[249274]: 2026-01-31 08:09:18.506260024 +0000 UTC m=+0.126678048 container cleanup 7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:09:18 np0005603622 systemd[1]: libpod-conmon-7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558.scope: Deactivated successfully.
Jan 31 03:09:18 np0005603622 podman[249327]: 2026-01-31 08:09:18.606491839 +0000 UTC m=+0.065257306 container remove 7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.613 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7d751b25-17e2-46dd-9c48-c3683b02b9b9]: (4, ('Sat Jan 31 08:09:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e (7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558)\n7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558\nSat Jan 31 08:09:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e (7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558)\n7624cfab9f032035689ab99356b98c73802fec82478c1411cec51919e4e2b558\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.616 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[12b4c47c-ba94-4b33-a1f4-0d66571d7c0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.618 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap805a5162-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 kernel: tap805a5162-00: left promiscuous mode
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.633 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.634 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee2c6f3-fc3b-4351-9b10-72c6b6bb9456]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.652 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[52642c12-24ed-4bb7-89a1-88befe9b7f11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.653 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dd982d06-feab-4fb3-95d8-0acc1a2e64eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.671 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2785d700-fa1b-4c9c-a840-9cce4996e3a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620300, 'reachable_time': 36356, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249344, 'error': None, 'target': 'ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.675 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-805a5162-0756-4738-89ca-694bfe9d313e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:09:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:18.675 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[05fecfa8-dbd6-441a-a718-a961aa5c1da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:09:18 np0005603622 systemd[1]: run-netns-ovnmeta\x2d805a5162\x2d0756\x2d4738\x2d89ca\x2d694bfe9d313e.mount: Deactivated successfully.
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.965 221324 INFO nova.virt.libvirt.driver [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Deleting instance files /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b_del#033[00m
Jan 31 03:09:18 np0005603622 nova_compute[221301]: 2026-01-31 08:09:18.966 221324 INFO nova.virt.libvirt.driver [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Deletion of /var/lib/nova/instances/ebe0c91b-fcdb-446a-9991-a40196d98c4b_del complete#033[00m
Jan 31 03:09:19 np0005603622 nova_compute[221301]: 2026-01-31 08:09:19.115 221324 INFO nova.compute.manager [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:09:19 np0005603622 nova_compute[221301]: 2026-01-31 08:09:19.115 221324 DEBUG oslo.service.loopingcall [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:09:19 np0005603622 nova_compute[221301]: 2026-01-31 08:09:19.115 221324 DEBUG nova.compute.manager [-] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:09:19 np0005603622 nova_compute[221301]: 2026-01-31 08:09:19.116 221324 DEBUG nova.network.neutron [-] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:09:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:19.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:19 np0005603622 nova_compute[221301]: 2026-01-31 08:09:19.572 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:19.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:20 np0005603622 nova_compute[221301]: 2026-01-31 08:09:20.079 221324 DEBUG nova.network.neutron [-] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:20 np0005603622 nova_compute[221301]: 2026-01-31 08:09:20.163 221324 INFO nova.compute.manager [-] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Took 1.05 seconds to deallocate network for instance.#033[00m
Jan 31 03:09:20 np0005603622 nova_compute[221301]: 2026-01-31 08:09:20.472 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:20 np0005603622 nova_compute[221301]: 2026-01-31 08:09:20.473 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:20 np0005603622 nova_compute[221301]: 2026-01-31 08:09:20.539 221324 DEBUG oslo_concurrency.processutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1314275738' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:20 np0005603622 nova_compute[221301]: 2026-01-31 08:09:20.982 221324 DEBUG oslo_concurrency.processutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:20 np0005603622 nova_compute[221301]: 2026-01-31 08:09:20.990 221324 DEBUG nova.compute.provider_tree [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:09:21 np0005603622 nova_compute[221301]: 2026-01-31 08:09:21.023 221324 DEBUG nova.scheduler.client.report [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:09:21 np0005603622 nova_compute[221301]: 2026-01-31 08:09:21.055 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:21 np0005603622 nova_compute[221301]: 2026-01-31 08:09:21.132 221324 INFO nova.scheduler.client.report [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Deleted allocations for instance ebe0c91b-fcdb-446a-9991-a40196d98c4b#033[00m
Jan 31 03:09:21 np0005603622 nova_compute[221301]: 2026-01-31 08:09:21.226 221324 DEBUG oslo_concurrency.lockutils [None req-dc9fed8d-3bb8-4be9-85e6-0a974e1453f7 6ec2f1b397404af09d4aba95271759dd dd99975994ff45c99cc41da24976e491 - - default default] Lock "ebe0c91b-fcdb-446a-9991-a40196d98c4b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:21 np0005603622 nova_compute[221301]: 2026-01-31 08:09:21.288 221324 DEBUG nova.compute.manager [req-51008b1d-b5bf-4a3d-af1e-db136ddf149d req-18b64c4f-cb1a-416a-a094-97aabc8813b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Received event network-vif-deleted-8b68efaf-3bec-4475-88f7-dd2b3e15802c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:21.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:09:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:21.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:09:23 np0005603622 podman[249368]: 2026-01-31 08:09:23.256560588 +0000 UTC m=+0.180099905 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:09:23 np0005603622 nova_compute[221301]: 2026-01-31 08:09:23.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:23 np0005603622 nova_compute[221301]: 2026-01-31 08:09:23.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:09:23 np0005603622 nova_compute[221301]: 2026-01-31 08:09:23.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:23.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:23.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:24 np0005603622 nova_compute[221301]: 2026-01-31 08:09:24.576 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:25 np0005603622 podman[249393]: 2026-01-31 08:09:25.155137667 +0000 UTC m=+0.067489716 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 31 03:09:25 np0005603622 nova_compute[221301]: 2026-01-31 08:09:25.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:25.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:25.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:27.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:27.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:28 np0005603622 nova_compute[221301]: 2026-01-31 08:09:28.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603622 nova_compute[221301]: 2026-01-31 08:09:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:29 np0005603622 nova_compute[221301]: 2026-01-31 08:09:29.465 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:29.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:29 np0005603622 nova_compute[221301]: 2026-01-31 08:09:29.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:09:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:29.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:09:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.445 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.446 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2578799849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:30.863 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:30.864 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:30.864 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:30 np0005603622 nova_compute[221301]: 2026-01-31 08:09:30.874 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.055 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.057 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4642MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.057 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.058 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.165 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.166 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.184 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:09:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:09:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:31.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1099843889' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.661 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.667 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.685 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:09:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:31.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.744 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 03:09:31 np0005603622 nova_compute[221301]: 2026-01-31 08:09:31.744 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:32 np0005603622 nova_compute[221301]: 2026-01-31 08:09:32.746 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:09:33 np0005603622 nova_compute[221301]: 2026-01-31 08:09:33.393 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769846958.3923182, ebe0c91b-fcdb-446a-9991-a40196d98c4b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:09:33 np0005603622 nova_compute[221301]: 2026-01-31 08:09:33.394 221324 INFO nova.compute.manager [-] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] VM Stopped (Lifecycle Event)
Jan 31 03:09:33 np0005603622 nova_compute[221301]: 2026-01-31 08:09:33.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:33 np0005603622 nova_compute[221301]: 2026-01-31 08:09:33.470 221324 DEBUG nova.compute.manager [None req-a9dd8041-e897-4eaf-9154-fa8c07c8b05f - - - - - -] [instance: ebe0c91b-fcdb-446a-9991-a40196d98c4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:09:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:33.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:33.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:34 np0005603622 nova_compute[221301]: 2026-01-31 08:09:34.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:09:34 np0005603622 nova_compute[221301]: 2026-01-31 08:09:34.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:09:34 np0005603622 nova_compute[221301]: 2026-01-31 08:09:34.447 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 03:09:34 np0005603622 nova_compute[221301]: 2026-01-31 08:09:34.580 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:35.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:35.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:37.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:37.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:38 np0005603622 nova_compute[221301]: 2026-01-31 08:09:38.453 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:39.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:39 np0005603622 nova_compute[221301]: 2026-01-31 08:09:39.583 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:09:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:39.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:09:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:41.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:41.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:43 np0005603622 nova_compute[221301]: 2026-01-31 08:09:43.456 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:43.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:43.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:44 np0005603622 nova_compute[221301]: 2026-01-31 08:09:44.586 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:45.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:45.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:45.774 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:09:45 np0005603622 nova_compute[221301]: 2026-01-31 08:09:45.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:45.776 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 03:09:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.562 221324 DEBUG nova.compute.manager [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.656 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.657 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.682 221324 DEBUG nova.objects.instance [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'pci_requests' on Instance uuid e528a53a-8ada-4966-912c-1f15ed61e649 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.699 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.699 221324 INFO nova.compute.claims [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.700 221324 DEBUG nova.objects.instance [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'resources' on Instance uuid e528a53a-8ada-4966-912c-1f15ed61e649 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.720 221324 DEBUG nova.objects.instance [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'pci_devices' on Instance uuid e528a53a-8ada-4966-912c-1f15ed61e649 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.769 221324 INFO nova.compute.resource_tracker [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Updating resource usage from migration a83b42a4-5af9-4f21-8f46-bf5820c39bec
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.770 221324 DEBUG nova.compute.resource_tracker [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Starting to track incoming migration a83b42a4-5af9-4f21-8f46-bf5820c39bec with flavor f75c4aee-d826-4343-a7e3-f06a4b21de52 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 31 03:09:46 np0005603622 nova_compute[221301]: 2026-01-31 08:09:46.852 221324 DEBUG oslo_concurrency.processutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:09:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1170877670' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:47 np0005603622 nova_compute[221301]: 2026-01-31 08:09:47.267 221324 DEBUG oslo_concurrency.processutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:09:47 np0005603622 nova_compute[221301]: 2026-01-31 08:09:47.275 221324 DEBUG nova.compute.provider_tree [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:09:47 np0005603622 nova_compute[221301]: 2026-01-31 08:09:47.298 221324 DEBUG nova.scheduler.client.report [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:09:47 np0005603622 nova_compute[221301]: 2026-01-31 08:09:47.328 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:09:47 np0005603622 nova_compute[221301]: 2026-01-31 08:09:47.328 221324 INFO nova.compute.manager [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Migrating
Jan 31 03:09:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:47.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:47.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:09:47.780 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:09:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:09:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:09:48 np0005603622 nova_compute[221301]: 2026-01-31 08:09:48.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:49.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:49 np0005603622 nova_compute[221301]: 2026-01-31 08:09:49.589 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:09:49 np0005603622 systemd-logind[817]: New session 54 of user nova.
Jan 31 03:09:49 np0005603622 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 03:09:49 np0005603622 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 03:09:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:49.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:49 np0005603622 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 03:09:49 np0005603622 systemd[1]: Starting User Manager for UID 42436...
Jan 31 03:09:49 np0005603622 systemd[249668]: Queued start job for default target Main User Target.
Jan 31 03:09:49 np0005603622 systemd[249668]: Created slice User Application Slice.
Jan 31 03:09:49 np0005603622 systemd[249668]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:09:49 np0005603622 systemd[249668]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 03:09:49 np0005603622 systemd[249668]: Reached target Paths.
Jan 31 03:09:49 np0005603622 systemd[249668]: Reached target Timers.
Jan 31 03:09:49 np0005603622 systemd[249668]: Starting D-Bus User Message Bus Socket...
Jan 31 03:09:49 np0005603622 systemd[249668]: Starting Create User's Volatile Files and Directories...
Jan 31 03:09:49 np0005603622 systemd[249668]: Finished Create User's Volatile Files and Directories.
Jan 31 03:09:49 np0005603622 systemd[249668]: Listening on D-Bus User Message Bus Socket.
Jan 31 03:09:49 np0005603622 systemd[249668]: Reached target Sockets.
Jan 31 03:09:49 np0005603622 systemd[249668]: Reached target Basic System.
Jan 31 03:09:49 np0005603622 systemd[249668]: Reached target Main User Target.
Jan 31 03:09:49 np0005603622 systemd[249668]: Startup finished in 126ms.
Jan 31 03:09:49 np0005603622 systemd[1]: Started User Manager for UID 42436.
Jan 31 03:09:49 np0005603622 systemd[1]: Started Session 54 of User nova.
Jan 31 03:09:49 np0005603622 systemd[1]: session-54.scope: Deactivated successfully.
Jan 31 03:09:49 np0005603622 systemd-logind[817]: Session 54 logged out. Waiting for processes to exit.
Jan 31 03:09:49 np0005603622 systemd-logind[817]: Removed session 54.
Jan 31 03:09:50 np0005603622 systemd-logind[817]: New session 56 of user nova.
Jan 31 03:09:50 np0005603622 systemd[1]: Started Session 56 of User nova.
Jan 31 03:09:50 np0005603622 systemd[1]: session-56.scope: Deactivated successfully.
Jan 31 03:09:50 np0005603622 systemd-logind[817]: Session 56 logged out. Waiting for processes to exit.
Jan 31 03:09:50 np0005603622 systemd-logind[817]: Removed session 56.
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.267 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.269 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.289 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.359 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.359 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.369 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.371 221324 INFO nova.compute.claims [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:09:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1529264301' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:50 np0005603622 nova_compute[221301]: 2026-01-31 08:09:50.571 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:09:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:09:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2247929480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.032 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.039 221324 DEBUG nova.compute.provider_tree [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.064 221324 DEBUG nova.scheduler.client.report [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:09:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.101 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.102 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.160 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.161 221324 DEBUG nova.network.neutron [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.184 221324 INFO nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.202 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.317 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.319 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.319 221324 INFO nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Creating image(s)#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.361 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.409 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.449 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.455 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:09:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:51.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.532 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.534 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.535 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.535 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.576 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:51 np0005603622 nova_compute[221301]: 2026-01-31 08:09:51.581 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:51.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:52 np0005603622 nova_compute[221301]: 2026-01-31 08:09:52.121 221324 DEBUG nova.policy [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12a823bd7c6e4cf492ebf6c1d002a91f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c03fec1b3664105996aa979e226d8f8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:09:53 np0005603622 nova_compute[221301]: 2026-01-31 08:09:53.006 221324 DEBUG nova.network.neutron [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Successfully created port: bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:09:53 np0005603622 nova_compute[221301]: 2026-01-31 08:09:53.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:53.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:53.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:54 np0005603622 podman[249806]: 2026-01-31 08:09:54.23009489 +0000 UTC m=+0.146430059 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.560 221324 DEBUG nova.network.neutron [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Successfully updated port: bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.582 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.582 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.582 221324 DEBUG nova.network.neutron [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.591 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.684 221324 DEBUG nova.compute.manager [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-changed-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.685 221324 DEBUG nova.compute.manager [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Refreshing instance network info cache due to event network-changed-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:09:54 np0005603622 nova_compute[221301]: 2026-01-31 08:09:54.686 221324 DEBUG oslo_concurrency.lockutils [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:09:55 np0005603622 nova_compute[221301]: 2026-01-31 08:09:55.027 221324 DEBUG nova.network.neutron [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:09:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:55.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:55.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:09:56 np0005603622 podman[249833]: 2026-01-31 08:09:56.170989919 +0000 UTC m=+0.086345713 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Jan 31 03:09:56 np0005603622 nova_compute[221301]: 2026-01-31 08:09:56.252 221324 DEBUG nova.network.neutron [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:56 np0005603622 nova_compute[221301]: 2026-01-31 08:09:56.269 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:56 np0005603622 nova_compute[221301]: 2026-01-31 08:09:56.269 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Instance network_info: |[{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:09:56 np0005603622 nova_compute[221301]: 2026-01-31 08:09:56.269 221324 DEBUG oslo_concurrency.lockutils [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:09:56 np0005603622 nova_compute[221301]: 2026-01-31 08:09:56.270 221324 DEBUG nova.network.neutron [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Refreshing network info cache for port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:09:56 np0005603622 nova_compute[221301]: 2026-01-31 08:09:56.388 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.806s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:56 np0005603622 nova_compute[221301]: 2026-01-31 08:09:56.520 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] resizing rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:09:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:57.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.667 221324 DEBUG nova.network.neutron [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updated VIF entry in instance network info cache for port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.668 221324 DEBUG nova.network.neutron [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.687 221324 DEBUG oslo_concurrency.lockutils [req-2caaf641-e91f-4b13-9e08-9142166dae9b req-fc5704b8-d264-4fb0-b9a3-117344a0466a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:09:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:57.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.829 221324 DEBUG nova.objects.instance [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'migration_context' on Instance uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.848 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.849 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Ensure instance console log exists: /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.850 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.850 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.851 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.855 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Start _get_guest_xml network_info=[{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.862 221324 WARNING nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.870 221324 DEBUG nova.virt.libvirt.host [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.871 221324 DEBUG nova.virt.libvirt.host [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.874 221324 DEBUG nova.virt.libvirt.host [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.875 221324 DEBUG nova.virt.libvirt.host [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.878 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.878 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.879 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.880 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.880 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.880 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.881 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.881 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.882 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.882 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.883 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.883 221324 DEBUG nova.virt.hardware [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:09:57 np0005603622 nova_compute[221301]: 2026-01-31 08:09:57.889 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:58 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2667885371' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:58 np0005603622 nova_compute[221301]: 2026-01-31 08:09:58.465 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:58 np0005603622 nova_compute[221301]: 2026-01-31 08:09:58.496 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:58 np0005603622 nova_compute[221301]: 2026-01-31 08:09:58.537 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:58 np0005603622 nova_compute[221301]: 2026-01-31 08:09:58.543 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:09:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/441164949' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.035 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.039 221324 DEBUG nova.virt.libvirt.vif [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1838803378',display_name='tempest-ServerActionsTestOtherA-server-1838803378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1838803378',id=81,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBmeoiqwLriErudm3CCwwTVCZJNSn8sMBdf3DG0cLKOiUOsjd6g3ELaDiv5VtlA1MtIeSB0EtvnrgQQVESwaz68a/c+EzXdmxnZNxj//jq+4bu6dBh/9tuewDagOu34T9w==',key_name='tempest-keypair-710425515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-12td3w0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=183fba4a-610d-4b96-a12f-996f9f2dc30b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.040 221324 DEBUG nova.network.os_vif_util [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.042 221324 DEBUG nova.network.os_vif_util [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ec:3b,bridge_name='br-int',has_traffic_filtering=True,id=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbde2310b-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.044 221324 DEBUG nova.objects.instance [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.064 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <uuid>183fba4a-610d-4b96-a12f-996f9f2dc30b</uuid>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <name>instance-00000051</name>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestOtherA-server-1838803378</nova:name>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:09:57</nova:creationTime>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:user uuid="12a823bd7c6e4cf492ebf6c1d002a91f">tempest-ServerActionsTestOtherA-1768827668-project-member</nova:user>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:project uuid="9c03fec1b3664105996aa979e226d8f8">tempest-ServerActionsTestOtherA-1768827668</nova:project>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <nova:port uuid="bde2310b-2169-48e8-8a6f-c3f25dcd8fd5">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <entry name="serial">183fba4a-610d-4b96-a12f-996f9f2dc30b</entry>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <entry name="uuid">183fba4a-610d-4b96-a12f-996f9f2dc30b</entry>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/183fba4a-610d-4b96-a12f-996f9f2dc30b_disk">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/183fba4a-610d-4b96-a12f-996f9f2dc30b_disk.config">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:5f:ec:3b"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <target dev="tapbde2310b-21"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/console.log" append="off"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:09:59 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:09:59 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:09:59 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:09:59 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.066 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Preparing to wait for external event network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.068 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.068 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.069 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.070 221324 DEBUG nova.virt.libvirt.vif [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1838803378',display_name='tempest-ServerActionsTestOtherA-server-1838803378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1838803378',id=81,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBmeoiqwLriErudm3CCwwTVCZJNSn8sMBdf3DG0cLKOiUOsjd6g3ELaDiv5VtlA1MtIeSB0EtvnrgQQVESwaz68a/c+EzXdmxnZNxj//jq+4bu6dBh/9tuewDagOu34T9w==',key_name='tempest-keypair-710425515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-12td3w0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:09:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=183fba4a-610d-4b96-a12f-996f9f2dc30b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.071 221324 DEBUG nova.network.os_vif_util [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.072 221324 DEBUG nova.network.os_vif_util [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ec:3b,bridge_name='br-int',has_traffic_filtering=True,id=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbde2310b-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.073 221324 DEBUG os_vif [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ec:3b,bridge_name='br-int',has_traffic_filtering=True,id=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbde2310b-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.074 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.075 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.076 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.081 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.081 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbde2310b-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.082 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbde2310b-21, col_values=(('external_ids', {'iface-id': 'bde2310b-2169-48e8-8a6f-c3f25dcd8fd5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:ec:3b', 'vm-uuid': '183fba4a-610d-4b96-a12f-996f9f2dc30b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.085 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:59 np0005603622 NetworkManager[49080]: <info>  [1769846999.0873] manager: (tapbde2310b-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.088 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.094 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.095 221324 INFO os_vif [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ec:3b,bridge_name='br-int',has_traffic_filtering=True,id=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbde2310b-21')#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.153 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.154 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.154 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:5f:ec:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.155 221324 INFO nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Using config drive#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.195 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:09:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:09:59.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.594 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.613 221324 INFO nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Creating config drive at /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/disk.config#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.619 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_80nugqc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:09:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:09:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:09:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:09:59.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.748 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_80nugqc" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.790 221324 DEBUG nova.storage.rbd_utils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:09:59 np0005603622 nova_compute[221301]: 2026-01-31 08:09:59.795 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/disk.config 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:00 np0005603622 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 03:10:00 np0005603622 systemd[249668]: Activating special unit Exit the Session...
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped target Main User Target.
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped target Basic System.
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped target Paths.
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped target Sockets.
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped target Timers.
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 03:10:00 np0005603622 systemd[249668]: Closed D-Bus User Message Bus Socket.
Jan 31 03:10:00 np0005603622 systemd[249668]: Stopped Create User's Volatile Files and Directories.
Jan 31 03:10:00 np0005603622 systemd[249668]: Removed slice User Application Slice.
Jan 31 03:10:00 np0005603622 systemd[249668]: Reached target Shutdown.
Jan 31 03:10:00 np0005603622 systemd[249668]: Finished Exit the Session.
Jan 31 03:10:00 np0005603622 systemd[249668]: Reached target Exit the Session.
Jan 31 03:10:00 np0005603622 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 03:10:00 np0005603622 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 03:10:00 np0005603622 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 03:10:00 np0005603622 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 03:10:00 np0005603622 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 03:10:00 np0005603622 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 03:10:00 np0005603622 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 03:10:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 03:10:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:01.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:01.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:01 np0005603622 nova_compute[221301]: 2026-01-31 08:10:01.935 221324 DEBUG oslo_concurrency.processutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/disk.config 183fba4a-610d-4b96-a12f-996f9f2dc30b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:01 np0005603622 nova_compute[221301]: 2026-01-31 08:10:01.937 221324 INFO nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Deleting local config drive /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b/disk.config because it was imported into RBD.#033[00m
Jan 31 03:10:01 np0005603622 kernel: tapbde2310b-21: entered promiscuous mode
Jan 31 03:10:01 np0005603622 NetworkManager[49080]: <info>  [1769847001.9851] manager: (tapbde2310b-21): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Jan 31 03:10:01 np0005603622 nova_compute[221301]: 2026-01-31 08:10:01.984 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:01 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:01Z|00264|binding|INFO|Claiming lport bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 for this chassis.
Jan 31 03:10:01 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:01Z|00265|binding|INFO|bde2310b-2169-48e8-8a6f-c3f25dcd8fd5: Claiming fa:16:3e:5f:ec:3b 10.100.0.6
Jan 31 03:10:01 np0005603622 nova_compute[221301]: 2026-01-31 08:10:01.987 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:01 np0005603622 nova_compute[221301]: 2026-01-31 08:10:01.996 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.003 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:ec:3b 10.100.0.6'], port_security=['fa:16:3e:5f:ec:3b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '183fba4a-610d-4b96-a12f-996f9f2dc30b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4d0e926d-7b9d-4115-9719-7f0d71edaace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.004 139793 INFO neutron.agent.ovn.metadata.agent [-] Port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 bound to our chassis#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.006 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.020 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4aea2e-6afb-47c3-a3c6-6b0622655f34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.021 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f564452-51 in ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:10:02 np0005603622 systemd-machined[190406]: New machine qemu-33-instance-00000051.
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.022 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f564452-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.023 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[09ae547a-0b56-467e-afa2-6469d840c17e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.024 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:02Z|00266|binding|INFO|Setting lport bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 ovn-installed in OVS
Jan 31 03:10:02 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:02Z|00267|binding|INFO|Setting lport bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 up in Southbound
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.026 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[60f5f225-1f60-40b6-8adf-e40868be0769]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.027 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603622 systemd[1]: Started Virtual Machine qemu-33-instance-00000051.
Jan 31 03:10:02 np0005603622 systemd-udevd[250065]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.038 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2918e5-ebb4-41df-8b30-98164b9244c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 NetworkManager[49080]: <info>  [1769847002.0570] device (tapbde2310b-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:10:02 np0005603622 NetworkManager[49080]: <info>  [1769847002.0579] device (tapbde2310b-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.066 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[809315a0-1050-4d82-afef-be0298b64e9b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.096 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6919b9-45fd-4e6d-baed-a39743a1f2b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.102 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f512340b-07c6-452c-a81e-8fb9b2d2352a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 NetworkManager[49080]: <info>  [1769847002.1049] manager: (tap1f564452-50): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.133 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f5792c34-58fd-4589-934e-d6f12c7175df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.136 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3f71aa14-6a05-4827-8f2f-e175dd616c03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 NetworkManager[49080]: <info>  [1769847002.1567] device (tap1f564452-50): carrier: link connected
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.165 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[80059e40-ef43-44ac-a9ed-d9be45804660]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.185 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4506e12d-0975-4e7b-9f1c-075f257d9146]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625349, 'reachable_time': 38775, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250096, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.200 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4deceb5b-1acc-4df5-bbb2-11a578032b83]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:23e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625349, 'tstamp': 625349}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250097, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.218 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[677dc131-d69a-4808-bec2-2b12ba751948]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625349, 'reachable_time': 38775, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250098, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.244 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee25e27-12d0-4d2e-bd66-77e329c65324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.266 221324 DEBUG nova.compute.manager [req-9c48b55c-018e-4704-92d2-766376d1dd1c req-9c5ceb84-c66c-43df-820c-842a0527b2ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.267 221324 DEBUG oslo_concurrency.lockutils [req-9c48b55c-018e-4704-92d2-766376d1dd1c req-9c5ceb84-c66c-43df-820c-842a0527b2ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.268 221324 DEBUG oslo_concurrency.lockutils [req-9c48b55c-018e-4704-92d2-766376d1dd1c req-9c5ceb84-c66c-43df-820c-842a0527b2ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.268 221324 DEBUG oslo_concurrency.lockutils [req-9c48b55c-018e-4704-92d2-766376d1dd1c req-9c5ceb84-c66c-43df-820c-842a0527b2ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.268 221324 DEBUG nova.compute.manager [req-9c48b55c-018e-4704-92d2-766376d1dd1c req-9c5ceb84-c66c-43df-820c-842a0527b2ed fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Processing event network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.295 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ab1af7b9-a50a-458b-8e12-77c9d9678f52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.297 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.298 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.299 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.341 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603622 NetworkManager[49080]: <info>  [1769847002.3424] manager: (tap1f564452-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 31 03:10:02 np0005603622 kernel: tap1f564452-50: entered promiscuous mode
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.345 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.347 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:02 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:02Z|00268|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.349 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.359 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.360 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.361 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc6fa8b-e15d-4785-8a1c-107b4dc4e71e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.362 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1f564452-5f08-4a1c-921e-f2daee9ec936.pid.haproxy
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1f564452-5f08-4a1c-921e-f2daee9ec936
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:10:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:02.363 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'env', 'PROCESS_TAG=haproxy-1f564452-5f08-4a1c-921e-f2daee9ec936', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f564452-5f08-4a1c-921e-f2daee9ec936.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.521 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847002.521357, 183fba4a-610d-4b96-a12f-996f9f2dc30b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.522 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.525 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.530 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.533 221324 INFO nova.virt.libvirt.driver [-] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Instance spawned successfully.#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.534 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.553 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.557 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.557 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.558 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.558 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.559 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.559 221324 DEBUG nova.virt.libvirt.driver [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.564 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.600 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.600 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847002.5223262, 183fba4a-610d-4b96-a12f-996f9f2dc30b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.601 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.627 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.631 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847002.5289054, 183fba4a-610d-4b96-a12f-996f9f2dc30b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.631 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.637 221324 INFO nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Took 11.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.638 221324 DEBUG nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.650 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.653 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.676 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.724 221324 INFO nova.compute.manager [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Took 12.38 seconds to build instance.#033[00m
Jan 31 03:10:02 np0005603622 podman[250172]: 2026-01-31 08:10:02.736429167 +0000 UTC m=+0.050507929 container create b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:10:02 np0005603622 nova_compute[221301]: 2026-01-31 08:10:02.748 221324 DEBUG oslo_concurrency.lockutils [None req-1312444c-70ed-40ad-ac73-26d094fa4632 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.479s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:02 np0005603622 systemd[1]: Started libpod-conmon-b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d.scope.
Jan 31 03:10:02 np0005603622 podman[250172]: 2026-01-31 08:10:02.706630306 +0000 UTC m=+0.020709088 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:10:02 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:10:02 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba85ee115b71a5449b3f6210383c98442e140ebe6c4b2778465058f33109bc2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:10:02 np0005603622 podman[250172]: 2026-01-31 08:10:02.832712657 +0000 UTC m=+0.146791459 container init b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:10:02 np0005603622 podman[250172]: 2026-01-31 08:10:02.841207935 +0000 UTC m=+0.155286707 container start b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:10:02 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [NOTICE]   (250191) : New worker (250193) forked
Jan 31 03:10:02 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [NOTICE]   (250191) : Loading success.
Jan 31 03:10:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:03.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:03.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.087 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.392 221324 DEBUG nova.compute.manager [req-27e1412a-f920-4c79-bae9-c1914e28487a req-d1d47d26-e7b4-4d56-9946-8cc983e4217b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.392 221324 DEBUG oslo_concurrency.lockutils [req-27e1412a-f920-4c79-bae9-c1914e28487a req-d1d47d26-e7b4-4d56-9946-8cc983e4217b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.392 221324 DEBUG oslo_concurrency.lockutils [req-27e1412a-f920-4c79-bae9-c1914e28487a req-d1d47d26-e7b4-4d56-9946-8cc983e4217b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.392 221324 DEBUG oslo_concurrency.lockutils [req-27e1412a-f920-4c79-bae9-c1914e28487a req-d1d47d26-e7b4-4d56-9946-8cc983e4217b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.392 221324 DEBUG nova.compute.manager [req-27e1412a-f920-4c79-bae9-c1914e28487a req-d1d47d26-e7b4-4d56-9946-8cc983e4217b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] No waiting events found dispatching network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.393 221324 WARNING nova.compute.manager [req-27e1412a-f920-4c79-bae9-c1914e28487a req-d1d47d26-e7b4-4d56-9946-8cc983e4217b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received unexpected event network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:10:04 np0005603622 nova_compute[221301]: 2026-01-31 08:10:04.598 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:05 np0005603622 nova_compute[221301]: 2026-01-31 08:10:05.139 221324 DEBUG nova.compute.manager [req-81adb845-4e39-4acc-93af-d59b7b43284f req-485bbb59-0ef3-4ab1-b588-711638f96d3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-vif-unplugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:05 np0005603622 nova_compute[221301]: 2026-01-31 08:10:05.140 221324 DEBUG oslo_concurrency.lockutils [req-81adb845-4e39-4acc-93af-d59b7b43284f req-485bbb59-0ef3-4ab1-b588-711638f96d3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:05 np0005603622 nova_compute[221301]: 2026-01-31 08:10:05.140 221324 DEBUG oslo_concurrency.lockutils [req-81adb845-4e39-4acc-93af-d59b7b43284f req-485bbb59-0ef3-4ab1-b588-711638f96d3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:05 np0005603622 nova_compute[221301]: 2026-01-31 08:10:05.140 221324 DEBUG oslo_concurrency.lockutils [req-81adb845-4e39-4acc-93af-d59b7b43284f req-485bbb59-0ef3-4ab1-b588-711638f96d3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:05 np0005603622 nova_compute[221301]: 2026-01-31 08:10:05.140 221324 DEBUG nova.compute.manager [req-81adb845-4e39-4acc-93af-d59b7b43284f req-485bbb59-0ef3-4ab1-b588-711638f96d3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] No waiting events found dispatching network-vif-unplugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:05 np0005603622 nova_compute[221301]: 2026-01-31 08:10:05.141 221324 WARNING nova.compute.manager [req-81adb845-4e39-4acc-93af-d59b7b43284f req-485bbb59-0ef3-4ab1-b588-711638f96d3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received unexpected event network-vif-unplugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:10:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:05.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:05.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:06 np0005603622 NetworkManager[49080]: <info>  [1769847006.8068] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 31 03:10:06 np0005603622 NetworkManager[49080]: <info>  [1769847006.8081] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Jan 31 03:10:06 np0005603622 nova_compute[221301]: 2026-01-31 08:10:06.815 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:06 np0005603622 nova_compute[221301]: 2026-01-31 08:10:06.847 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:06 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:06Z|00269|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:10:06 np0005603622 nova_compute[221301]: 2026-01-31 08:10:06.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.273 221324 DEBUG nova.compute.manager [req-2cdb5eee-b294-4c78-a9e9-1817030350ba req-17858fe0-a46c-4afd-b52e-728e4af4acd1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.274 221324 DEBUG oslo_concurrency.lockutils [req-2cdb5eee-b294-4c78-a9e9-1817030350ba req-17858fe0-a46c-4afd-b52e-728e4af4acd1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.274 221324 DEBUG oslo_concurrency.lockutils [req-2cdb5eee-b294-4c78-a9e9-1817030350ba req-17858fe0-a46c-4afd-b52e-728e4af4acd1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.275 221324 DEBUG oslo_concurrency.lockutils [req-2cdb5eee-b294-4c78-a9e9-1817030350ba req-17858fe0-a46c-4afd-b52e-728e4af4acd1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.275 221324 DEBUG nova.compute.manager [req-2cdb5eee-b294-4c78-a9e9-1817030350ba req-17858fe0-a46c-4afd-b52e-728e4af4acd1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] No waiting events found dispatching network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.275 221324 WARNING nova.compute.manager [req-2cdb5eee-b294-4c78-a9e9-1817030350ba req-17858fe0-a46c-4afd-b52e-728e4af4acd1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received unexpected event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:10:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:07.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.574 221324 DEBUG nova.compute.manager [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-changed-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.575 221324 DEBUG nova.compute.manager [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Refreshing instance network info cache due to event network-changed-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.575 221324 DEBUG oslo_concurrency.lockutils [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.575 221324 DEBUG oslo_concurrency.lockutils [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.576 221324 DEBUG nova.network.neutron [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Refreshing network info cache for port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:10:07 np0005603622 nova_compute[221301]: 2026-01-31 08:10:07.594 221324 INFO nova.network.neutron [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Updating port 22e87c0f-c5ce-4700-94a2-0d70f1b9655f with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:10:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:07.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:08 np0005603622 nova_compute[221301]: 2026-01-31 08:10:08.924 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "refresh_cache-e528a53a-8ada-4966-912c-1f15ed61e649" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:10:08 np0005603622 nova_compute[221301]: 2026-01-31 08:10:08.925 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquired lock "refresh_cache-e528a53a-8ada-4966-912c-1f15ed61e649" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:10:08 np0005603622 nova_compute[221301]: 2026-01-31 08:10:08.926 221324 DEBUG nova.network.neutron [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.092 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.206 221324 DEBUG nova.network.neutron [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updated VIF entry in instance network info cache for port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.207 221324 DEBUG nova.network.neutron [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.319 221324 DEBUG oslo_concurrency.lockutils [req-c1899a01-ca2b-421c-8b2d-e1e110c6de3c req-71c92dca-07cd-4c1c-88a4-a3a55dcea28c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.414 221324 DEBUG nova.compute.manager [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-changed-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.415 221324 DEBUG nova.compute.manager [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Refreshing instance network info cache due to event network-changed-22e87c0f-c5ce-4700-94a2-0d70f1b9655f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.415 221324 DEBUG oslo_concurrency.lockutils [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e528a53a-8ada-4966-912c-1f15ed61e649" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:10:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:09.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:09 np0005603622 nova_compute[221301]: 2026-01-31 08:10:09.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:09 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #47. Immutable memtables: 4.
Jan 31 03:10:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:09.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.289 221324 DEBUG nova.network.neutron [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Updating instance_info_cache with network_info: [{"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.334 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Releasing lock "refresh_cache-e528a53a-8ada-4966-912c-1f15ed61e649" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.341 221324 DEBUG oslo_concurrency.lockutils [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e528a53a-8ada-4966-912c-1f15ed61e649" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.341 221324 DEBUG nova.network.neutron [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Refreshing network info cache for port 22e87c0f-c5ce-4700-94a2-0d70f1b9655f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.528 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.530 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.531 221324 INFO nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Creating image(s)#033[00m
Jan 31 03:10:10 np0005603622 nova_compute[221301]: 2026-01-31 08:10:10.595 221324 DEBUG nova.storage.rbd_utils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] creating snapshot(nova-resize) on rbd image(e528a53a-8ada-4966-912c-1f15ed61e649_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:10:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e254 e254: 3 total, 3 up, 3 in
Jan 31 03:10:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.317 221324 DEBUG nova.objects.instance [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'trusted_certs' on Instance uuid e528a53a-8ada-4966-912c-1f15ed61e649 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.491 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.492 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Ensure instance console log exists: /var/lib/nova/instances/e528a53a-8ada-4966-912c-1f15ed61e649/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.492 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.492 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.493 221324 DEBUG oslo_concurrency.lockutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.495 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Start _get_guest_xml network_info=[{"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-676112264-network", "vif_mac": "fa:16:3e:1c:4e:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.506 221324 WARNING nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.512 221324 DEBUG nova.virt.libvirt.host [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.513 221324 DEBUG nova.virt.libvirt.host [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.517 221324 DEBUG nova.virt.libvirt.host [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.517 221324 DEBUG nova.virt.libvirt.host [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.518 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.519 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.519 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.520 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.520 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.520 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.520 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.521 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.521 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.521 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.522 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.522 221324 DEBUG nova.virt.hardware [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.522 221324 DEBUG nova.objects.instance [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'vcpu_model' on Instance uuid e528a53a-8ada-4966-912c-1f15ed61e649 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:10:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:11.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:10:11 np0005603622 nova_compute[221301]: 2026-01-31 08:10:11.588 221324 DEBUG oslo_concurrency.processutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:10:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:11.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:10:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:10:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/99657397' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.065 221324 DEBUG nova.network.neutron [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Updated VIF entry in instance network info cache for port 22e87c0f-c5ce-4700-94a2-0d70f1b9655f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.067 221324 DEBUG nova.network.neutron [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Updating instance_info_cache with network_info: [{"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.124 221324 DEBUG oslo_concurrency.lockutils [req-4695ad8b-1c4b-42fe-8131-0aa8bee329e7 req-00752347-91cd-44a1-8951-a678f1c4f1c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e528a53a-8ada-4966-912c-1f15ed61e649" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.169 221324 DEBUG oslo_concurrency.processutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.492 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.501 221324 DEBUG oslo_concurrency.processutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:10:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3108477737' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.961 221324 DEBUG oslo_concurrency.processutils [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.963 221324 DEBUG nova.virt.libvirt.vif [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1368556693',display_name='tempest-DeleteServersTestJSON-server-1368556693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1368556693',id=79,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-n1ah7sd5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:07Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e528a53a-8ada-4966-912c-1f15ed61e649,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-676112264-network", "vif_mac": "fa:16:3e:1c:4e:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.964 221324 DEBUG nova.network.os_vif_util [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-676112264-network", "vif_mac": "fa:16:3e:1c:4e:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.965 221324 DEBUG nova.network.os_vif_util [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:4e:08,bridge_name='br-int',has_traffic_filtering=True,id=22e87c0f-c5ce-4700-94a2-0d70f1b9655f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e87c0f-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.968 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <uuid>e528a53a-8ada-4966-912c-1f15ed61e649</uuid>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <name>instance-0000004f</name>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <memory>196608</memory>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <nova:name>tempest-DeleteServersTestJSON-server-1368556693</nova:name>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:10:11</nova:creationTime>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.micro">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:memory>192</nova:memory>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:user uuid="16d731f5875748ca9b8036b2ba061042">tempest-DeleteServersTestJSON-808715310-project-member</nova:user>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:project uuid="3469c253459e40e39dcf5bcb6a32008f">tempest-DeleteServersTestJSON-808715310</nova:project>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <nova:port uuid="22e87c0f-c5ce-4700-94a2-0d70f1b9655f">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <entry name="serial">e528a53a-8ada-4966-912c-1f15ed61e649</entry>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <entry name="uuid">e528a53a-8ada-4966-912c-1f15ed61e649</entry>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e528a53a-8ada-4966-912c-1f15ed61e649_disk">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e528a53a-8ada-4966-912c-1f15ed61e649_disk.config">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:1c:4e:08"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <target dev="tap22e87c0f-c5"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/e528a53a-8ada-4966-912c-1f15ed61e649/console.log" append="off"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:10:12 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:10:12 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:10:12 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:10:12 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.973 221324 DEBUG nova.virt.libvirt.vif [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1368556693',display_name='tempest-DeleteServersTestJSON-server-1368556693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1368556693',id=79,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:09:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-n1ah7sd5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:07Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e528a53a-8ada-4966-912c-1f15ed61e649,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-676112264-network", "vif_mac": "fa:16:3e:1c:4e:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.973 221324 DEBUG nova.network.os_vif_util [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-676112264-network", "vif_mac": "fa:16:3e:1c:4e:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.974 221324 DEBUG nova.network.os_vif_util [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:4e:08,bridge_name='br-int',has_traffic_filtering=True,id=22e87c0f-c5ce-4700-94a2-0d70f1b9655f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e87c0f-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.975 221324 DEBUG os_vif [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:4e:08,bridge_name='br-int',has_traffic_filtering=True,id=22e87c0f-c5ce-4700-94a2-0d70f1b9655f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e87c0f-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.975 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.976 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.976 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.979 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.980 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e87c0f-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.980 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap22e87c0f-c5, col_values=(('external_ids', {'iface-id': '22e87c0f-c5ce-4700-94a2-0d70f1b9655f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:4e:08', 'vm-uuid': 'e528a53a-8ada-4966-912c-1f15ed61e649'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.982 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:12 np0005603622 NetworkManager[49080]: <info>  [1769847012.9829] manager: (tap22e87c0f-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.984 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:12 np0005603622 nova_compute[221301]: 2026-01-31 08:10:12.991 221324 INFO os_vif [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:4e:08,bridge_name='br-int',has_traffic_filtering=True,id=22e87c0f-c5ce-4700-94a2-0d70f1b9655f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e87c0f-c5')#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.084 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.085 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.086 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] No VIF found with MAC fa:16:3e:1c:4e:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.086 221324 INFO nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Using config drive#033[00m
Jan 31 03:10:13 np0005603622 kernel: tap22e87c0f-c5: entered promiscuous mode
Jan 31 03:10:13 np0005603622 NetworkManager[49080]: <info>  [1769847013.1862] manager: (tap22e87c0f-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:13Z|00270|binding|INFO|Claiming lport 22e87c0f-c5ce-4700-94a2-0d70f1b9655f for this chassis.
Jan 31 03:10:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:13Z|00271|binding|INFO|22e87c0f-c5ce-4700-94a2-0d70f1b9655f: Claiming fa:16:3e:1c:4e:08 10.100.0.7
Jan 31 03:10:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:13Z|00272|binding|INFO|Setting lport 22e87c0f-c5ce-4700-94a2-0d70f1b9655f ovn-installed in OVS
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.196 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:4e:08 10.100.0.7'], port_security=['fa:16:3e:1c:4e:08 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e528a53a-8ada-4966-912c-1f15ed61e649', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=22e87c0f-c5ce-4700-94a2-0d70f1b9655f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.198 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 22e87c0f-c5ce-4700-94a2-0d70f1b9655f in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 bound to our chassis#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.198 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.200 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6#033[00m
Jan 31 03:10:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:13Z|00273|binding|INFO|Setting lport 22e87c0f-c5ce-4700-94a2-0d70f1b9655f up in Southbound
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.208 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.213 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c3930d-a358-4461-a8b0-a354c87ff510]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.215 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1c6810e-e1 in ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.216 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.221 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1c6810e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.222 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ecd26d-76f6-4435-969d-7e9272729c73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.225 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[166bd800-f3bc-4a06-b081-e511ce518117]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 systemd-machined[190406]: New machine qemu-34-instance-0000004f.
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.242 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[e67b9e78-e64b-433a-a4e4-b8ac285c0dd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 systemd[1]: Started Virtual Machine qemu-34-instance-0000004f.
Jan 31 03:10:13 np0005603622 systemd-udevd[250373]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.259 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[90b66a67-0203-40a7-8f27-3c90affe74ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 NetworkManager[49080]: <info>  [1769847013.2745] device (tap22e87c0f-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:10:13 np0005603622 NetworkManager[49080]: <info>  [1769847013.2751] device (tap22e87c0f-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.293 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fb93e27e-d78b-43a2-bd8d-f0564cf2e946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 NetworkManager[49080]: <info>  [1769847013.2994] manager: (tapc1c6810e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/139)
Jan 31 03:10:13 np0005603622 systemd-udevd[250377]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.301 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7b281821-c75e-41ce-8d26-0a9c1bb28062]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.342 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f7f0e1-8fa2-4e70-bfa6-6440b1cff64b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.346 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bae41055-a878-47ce-86e7-9a2d85474036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 NetworkManager[49080]: <info>  [1769847013.3732] device (tapc1c6810e-e0): carrier: link connected
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.380 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a108b372-43b4-4414-8431-260738fe3c39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.402 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0e12046e-f788-4e5a-90fd-8999d7fb44dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626471, 'reachable_time': 16836, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250403, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.420 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3cdcc78a-b19f-466f-9467-a4a57c53363a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:9781'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 626471, 'tstamp': 626471}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250404, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.440 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2d411814-b7b5-4908-be68-e91c6ff09610]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1c6810e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:97:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626471, 'reachable_time': 16836, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250405, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.468 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8e31c22-5a43-4802-b458-6fb804688af2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.525 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d1bd4eac-4b28-4abc-b5aa-a057a0b8038c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.529 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.529 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.531 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1c6810e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.534 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 NetworkManager[49080]: <info>  [1769847013.5353] manager: (tapc1c6810e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Jan 31 03:10:13 np0005603622 kernel: tapc1c6810e-e0: entered promiscuous mode
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:13.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.549 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1c6810e-e0, col_values=(('external_ids', {'iface-id': '937542c1-ab1e-4312-ab3a-ee4483fcdf7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.551 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:13Z|00274|binding|INFO|Releasing lport 937542c1-ab1e-4312-ab3a-ee4483fcdf7b from this chassis (sb_readonly=0)
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.553 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.554 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.561 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6224f44a-ff65-4d77-8143-5190b42bcbc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.563 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.pid.haproxy
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c1c6810e-ec8f-43f3-a3c6-22606d9416b6
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:10:13 np0005603622 nova_compute[221301]: 2026-01-31 08:10:13.563 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:13.564 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'env', 'PROCESS_TAG=haproxy-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1c6810e-ec8f-43f3-a3c6-22606d9416b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:10:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:13.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:13 np0005603622 podman[250438]: 2026-01-31 08:10:13.975329532 +0000 UTC m=+0.053437358 container create c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:10:14 np0005603622 systemd[1]: Started libpod-conmon-c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb.scope.
Jan 31 03:10:14 np0005603622 podman[250438]: 2026-01-31 08:10:13.943861746 +0000 UTC m=+0.021969552 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:10:14 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:10:14 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b81f3df788bce893d1fee4dc9dfc85a6c3c5a873d2d8f5b26fe84b71977e556/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:10:14 np0005603622 podman[250438]: 2026-01-31 08:10:14.078896877 +0000 UTC m=+0.157004703 container init c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:10:14 np0005603622 podman[250438]: 2026-01-31 08:10:14.084492958 +0000 UTC m=+0.162600744 container start c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 03:10:14 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[250453]: [NOTICE]   (250457) : New worker (250459) forked
Jan 31 03:10:14 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[250453]: [NOTICE]   (250457) : Loading success.
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.603 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.858 221324 DEBUG nova.compute.manager [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.859 221324 DEBUG oslo_concurrency.lockutils [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.859 221324 DEBUG oslo_concurrency.lockutils [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.859 221324 DEBUG oslo_concurrency.lockutils [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.860 221324 DEBUG nova.compute.manager [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] No waiting events found dispatching network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.860 221324 WARNING nova.compute.manager [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received unexpected event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f for instance with vm_state active and task_state resize_finish.#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.860 221324 DEBUG nova.compute.manager [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.860 221324 DEBUG oslo_concurrency.lockutils [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.861 221324 DEBUG oslo_concurrency.lockutils [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.861 221324 DEBUG oslo_concurrency.lockutils [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.861 221324 DEBUG nova.compute.manager [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] No waiting events found dispatching network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.861 221324 WARNING nova.compute.manager [req-1e7e3caf-1eea-467d-b4cc-36ad88f31d0f req-ccd3db2c-6818-4c36-9a06-c5dfc739dbb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received unexpected event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f for instance with vm_state active and task_state resize_finish.#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.878 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847014.8781018, e528a53a-8ada-4966-912c-1f15ed61e649 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.879 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.881 221324 DEBUG nova.compute.manager [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.886 221324 INFO nova.virt.libvirt.driver [-] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Instance running successfully.#033[00m
Jan 31 03:10:14 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.891 221324 DEBUG nova.virt.libvirt.guest [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:10:14 np0005603622 nova_compute[221301]: 2026-01-31 08:10:14.892 221324 DEBUG nova.virt.libvirt.driver [None req-024ec3a6-b778-4b7d-80fd-39f80b5c3bb6 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 03:10:15 np0005603622 nova_compute[221301]: 2026-01-31 08:10:15.031 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:15 np0005603622 nova_compute[221301]: 2026-01-31 08:10:15.036 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:10:15 np0005603622 nova_compute[221301]: 2026-01-31 08:10:15.077 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 31 03:10:15 np0005603622 nova_compute[221301]: 2026-01-31 08:10:15.077 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847014.8793657, e528a53a-8ada-4966-912c-1f15ed61e649 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:15 np0005603622 nova_compute[221301]: 2026-01-31 08:10:15.077 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] VM Started (Lifecycle Event)#033[00m
Jan 31 03:10:15 np0005603622 nova_compute[221301]: 2026-01-31 08:10:15.287 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:15 np0005603622 nova_compute[221301]: 2026-01-31 08:10:15.298 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:10:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:15.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:15.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:16 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:16Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:ec:3b 10.100.0.6
Jan 31 03:10:16 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:16Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:ec:3b 10.100.0.6
Jan 31 03:10:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:17.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:17.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:17 np0005603622 nova_compute[221301]: 2026-01-31 08:10:17.983 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:19.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:19 np0005603622 nova_compute[221301]: 2026-01-31 08:10:19.655 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:19.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:21.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:21.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e255 e255: 3 total, 3 up, 3 in
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.022 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.028 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:23.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:23.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.790 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.791 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.791 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.792 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.792 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.794 221324 INFO nova.compute.manager [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Terminating instance#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.796 221324 DEBUG nova.compute.manager [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:10:23 np0005603622 kernel: tap22e87c0f-c5 (unregistering): left promiscuous mode
Jan 31 03:10:23 np0005603622 NetworkManager[49080]: <info>  [1769847023.8534] device (tap22e87c0f-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:10:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:23Z|00275|binding|INFO|Releasing lport 22e87c0f-c5ce-4700-94a2-0d70f1b9655f from this chassis (sb_readonly=0)
Jan 31 03:10:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:23Z|00276|binding|INFO|Setting lport 22e87c0f-c5ce-4700-94a2-0d70f1b9655f down in Southbound
Jan 31 03:10:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:23Z|00277|binding|INFO|Removing iface tap22e87c0f-c5 ovn-installed in OVS
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.860 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:23 np0005603622 nova_compute[221301]: 2026-01-31 08:10:23.869 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:23.891 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:4e:08 10.100.0.7'], port_security=['fa:16:3e:1c:4e:08 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e528a53a-8ada-4966-912c-1f15ed61e649', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3469c253459e40e39dcf5bcb6a32008f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e42c06e8-2644-4a21-adfb-06ef74de77bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=298bbe2a-1faa-4c77-b3c3-4633e58f5921, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=22e87c0f-c5ce-4700-94a2-0d70f1b9655f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:23.893 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 22e87c0f-c5ce-4700-94a2-0d70f1b9655f in datapath c1c6810e-ec8f-43f3-a3c6-22606d9416b6 unbound from our chassis#033[00m
Jan 31 03:10:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:23.895 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1c6810e-ec8f-43f3-a3c6-22606d9416b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:10:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:23.898 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2cd618-10f5-4435-bffb-28feedd7aff3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:23.900 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 namespace which is not needed anymore#033[00m
Jan 31 03:10:23 np0005603622 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Jan 31 03:10:23 np0005603622 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004f.scope: Consumed 10.660s CPU time.
Jan 31 03:10:23 np0005603622 systemd-machined[190406]: Machine qemu-34-instance-0000004f terminated.
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.036 221324 INFO nova.virt.libvirt.driver [-] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Instance destroyed successfully.#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.038 221324 DEBUG nova.objects.instance [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lazy-loading 'resources' on Instance uuid e528a53a-8ada-4966-912c-1f15ed61e649 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:24 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[250453]: [NOTICE]   (250457) : haproxy version is 2.8.14-c23fe91
Jan 31 03:10:24 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[250453]: [NOTICE]   (250457) : path to executable is /usr/sbin/haproxy
Jan 31 03:10:24 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[250453]: [WARNING]  (250457) : Exiting Master process...
Jan 31 03:10:24 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[250453]: [ALERT]    (250457) : Current worker (250459) exited with code 143 (Terminated)
Jan 31 03:10:24 np0005603622 neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6[250453]: [WARNING]  (250457) : All workers exited. Exiting... (0)
Jan 31 03:10:24 np0005603622 systemd[1]: libpod-c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb.scope: Deactivated successfully.
Jan 31 03:10:24 np0005603622 podman[250535]: 2026-01-31 08:10:24.056220924 +0000 UTC m=+0.049416410 container died c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.056 221324 DEBUG nova.virt.libvirt.vif [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1368556693',display_name='tempest-DeleteServersTestJSON-server-1368556693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1368556693',id=79,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:10:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3469c253459e40e39dcf5bcb6a32008f',ramdisk_id='',reservation_id='r-n1ah7sd5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-808715310',owner_user_name='tempest-DeleteServersTestJSON-808715310-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:10:15Z,user_data=None,user_id='16d731f5875748ca9b8036b2ba061042',uuid=e528a53a-8ada-4966-912c-1f15ed61e649,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.057 221324 DEBUG nova.network.os_vif_util [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converting VIF {"id": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "address": "fa:16:3e:1c:4e:08", "network": {"id": "c1c6810e-ec8f-43f3-a3c6-22606d9416b6", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-676112264-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3469c253459e40e39dcf5bcb6a32008f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22e87c0f-c5", "ovs_interfaceid": "22e87c0f-c5ce-4700-94a2-0d70f1b9655f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.059 221324 DEBUG nova.network.os_vif_util [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:4e:08,bridge_name='br-int',has_traffic_filtering=True,id=22e87c0f-c5ce-4700-94a2-0d70f1b9655f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e87c0f-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.059 221324 DEBUG os_vif [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:4e:08,bridge_name='br-int',has_traffic_filtering=True,id=22e87c0f-c5ce-4700-94a2-0d70f1b9655f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e87c0f-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.063 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.064 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e87c0f-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.111 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.114 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.118 221324 INFO os_vif [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:4e:08,bridge_name='br-int',has_traffic_filtering=True,id=22e87c0f-c5ce-4700-94a2-0d70f1b9655f,network=Network(c1c6810e-ec8f-43f3-a3c6-22606d9416b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22e87c0f-c5')#033[00m
Jan 31 03:10:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb-userdata-shm.mount: Deactivated successfully.
Jan 31 03:10:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay-2b81f3df788bce893d1fee4dc9dfc85a6c3c5a873d2d8f5b26fe84b71977e556-merged.mount: Deactivated successfully.
Jan 31 03:10:24 np0005603622 podman[250535]: 2026-01-31 08:10:24.142539935 +0000 UTC m=+0.135735421 container cleanup c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:10:24 np0005603622 systemd[1]: libpod-conmon-c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb.scope: Deactivated successfully.
Jan 31 03:10:24 np0005603622 podman[250588]: 2026-01-31 08:10:24.219529236 +0000 UTC m=+0.055006211 container remove c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.225 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d42c7814-442a-45a7-a2e4-feb6c6f994b3]: (4, ('Sat Jan 31 08:10:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb)\nc8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb\nSat Jan 31 08:10:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 (c8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb)\nc8da08dc8bd2e0b0d1095c805684a1b66aed363a1c19586e33e31d3d0b9462fb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.228 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aa423d00-cbc0-4b7d-8407-584dd940b3b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.229 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1c6810e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:24 np0005603622 kernel: tapc1c6810e-e0: left promiscuous mode
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.231 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.243 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4dd3f8-1bdc-482e-9795-fcfdd8d9b155]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.258 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[962f9332-79a0-459f-8d3c-d9ef9eb3a359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.259 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[79d42167-567a-4f1a-a710-892f427437cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.278 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[62c203e4-81e1-4633-a046-c7fac011063c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626462, 'reachable_time': 23000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250615, 'error': None, 'target': 'ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.282 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1c6810e-ec8f-43f3-a3c6-22606d9416b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:10:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:24.283 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[ef22538a-8f34-4124-8585-834184e39fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:10:24 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc1c6810e\x2dec8f\x2d43f3\x2da3c6\x2d22606d9416b6.mount: Deactivated successfully.
Jan 31 03:10:24 np0005603622 podman[250607]: 2026-01-31 08:10:24.374054061 +0000 UTC m=+0.113555984 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.558 221324 DEBUG nova.compute.manager [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-vif-unplugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.559 221324 DEBUG oslo_concurrency.lockutils [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.559 221324 DEBUG oslo_concurrency.lockutils [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.559 221324 DEBUG oslo_concurrency.lockutils [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.559 221324 DEBUG nova.compute.manager [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] No waiting events found dispatching network-vif-unplugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.559 221324 WARNING nova.compute.manager [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received unexpected event network-vif-unplugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f for instance with vm_state active and task_state None.#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.560 221324 DEBUG nova.compute.manager [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.560 221324 DEBUG oslo_concurrency.lockutils [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.560 221324 DEBUG oslo_concurrency.lockutils [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.560 221324 DEBUG oslo_concurrency.lockutils [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.560 221324 DEBUG nova.compute.manager [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] No waiting events found dispatching network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.560 221324 WARNING nova.compute.manager [req-51ac44a8-8e69-4a54-a9a5-90bbc37ea86e req-bdd6497a-dba1-4d62-849e-5693edfdb71c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received unexpected event network-vif-plugged-22e87c0f-c5ce-4700-94a2-0d70f1b9655f for instance with vm_state active and task_state None.#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.602 221324 INFO nova.virt.libvirt.driver [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Deleting instance files /var/lib/nova/instances/e528a53a-8ada-4966-912c-1f15ed61e649_del#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.602 221324 INFO nova.virt.libvirt.driver [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Deletion of /var/lib/nova/instances/e528a53a-8ada-4966-912c-1f15ed61e649_del complete#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.659 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.665 221324 INFO nova.compute.manager [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.665 221324 DEBUG oslo.service.loopingcall [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.665 221324 DEBUG nova.compute.manager [-] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:10:24 np0005603622 nova_compute[221301]: 2026-01-31 08:10:24.666 221324 DEBUG nova.network.neutron [-] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:10:25 np0005603622 nova_compute[221301]: 2026-01-31 08:10:25.438 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:25 np0005603622 nova_compute[221301]: 2026-01-31 08:10:25.438 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:25 np0005603622 nova_compute[221301]: 2026-01-31 08:10:25.439 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:10:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:25.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:25.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:25 np0005603622 nova_compute[221301]: 2026-01-31 08:10:25.908 221324 DEBUG nova.network.neutron [-] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:25 np0005603622 nova_compute[221301]: 2026-01-31 08:10:25.928 221324 INFO nova.compute.manager [-] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Took 1.26 seconds to deallocate network for instance.#033[00m
Jan 31 03:10:26 np0005603622 nova_compute[221301]: 2026-01-31 08:10:26.001 221324 DEBUG nova.compute.manager [req-5600ac28-3878-4395-8610-1de3a48f2cbe req-9591fb7a-0384-4924-8641-2cf6366f1b36 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Received event network-vif-deleted-22e87c0f-c5ce-4700-94a2-0d70f1b9655f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:10:26 np0005603622 nova_compute[221301]: 2026-01-31 08:10:26.033 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:26 np0005603622 nova_compute[221301]: 2026-01-31 08:10:26.034 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:26 np0005603622 nova_compute[221301]: 2026-01-31 08:10:26.040 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:26 np0005603622 nova_compute[221301]: 2026-01-31 08:10:26.080 221324 INFO nova.scheduler.client.report [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Deleted allocations for instance e528a53a-8ada-4966-912c-1f15ed61e649#033[00m
Jan 31 03:10:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:26 np0005603622 nova_compute[221301]: 2026-01-31 08:10:26.155 221324 DEBUG oslo_concurrency.lockutils [None req-c20ad082-5775-4f52-8ea8-ec777f79ef06 16d731f5875748ca9b8036b2ba061042 3469c253459e40e39dcf5bcb6a32008f - - default default] Lock "e528a53a-8ada-4966-912c-1f15ed61e649" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:27 np0005603622 podman[250638]: 2026-01-31 08:10:27.132425283 +0000 UTC m=+0.060094338 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 31 03:10:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:27.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:27.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:10:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2302215876' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:10:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:10:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2302215876' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:10:29 np0005603622 nova_compute[221301]: 2026-01-31 08:10:29.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:29 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:29Z|00278|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:10:29 np0005603622 nova_compute[221301]: 2026-01-31 08:10:29.334 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:29 np0005603622 nova_compute[221301]: 2026-01-31 08:10:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:29.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:29 np0005603622 nova_compute[221301]: 2026-01-31 08:10:29.710 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:29.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:30 np0005603622 nova_compute[221301]: 2026-01-31 08:10:30.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/651360895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:30.864 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:30.865 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:30.866 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:31.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:31.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.453 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.454 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.454 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.454 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.455 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/718843882' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:32 np0005603622 nova_compute[221301]: 2026-01-31 08:10:32.918 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.109 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.110 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.301 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.302 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4441MB free_disk=20.921764373779297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.303 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.303 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.538 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 183fba4a-610d-4b96-a12f-996f9f2dc30b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.538 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.539 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.560 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:10:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:33.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.580 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.580 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.599 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.636 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:10:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 e256: 3 total, 3 up, 3 in
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.671 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:33Z|00279|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:10:33 np0005603622 nova_compute[221301]: 2026-01-31 08:10:33.797 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:33.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.116 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2937903151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.158 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.164 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.197 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.284 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.284 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.285 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.285 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:10:34 np0005603622 nova_compute[221301]: 2026-01-31 08:10:34.711 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:35.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:35.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:37 np0005603622 nova_compute[221301]: 2026-01-31 08:10:37.303 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:37 np0005603622 nova_compute[221301]: 2026-01-31 08:10:37.304 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:10:37 np0005603622 nova_compute[221301]: 2026-01-31 08:10:37.304 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:10:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:37.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:37 np0005603622 nova_compute[221301]: 2026-01-31 08:10:37.706 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:10:37 np0005603622 nova_compute[221301]: 2026-01-31 08:10:37.707 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:10:37 np0005603622 nova_compute[221301]: 2026-01-31 08:10:37.707 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:10:37 np0005603622 nova_compute[221301]: 2026-01-31 08:10:37.708 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:10:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:37.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:39 np0005603622 nova_compute[221301]: 2026-01-31 08:10:39.035 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847024.0328314, e528a53a-8ada-4966-912c-1f15ed61e649 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:10:39 np0005603622 nova_compute[221301]: 2026-01-31 08:10:39.035 221324 INFO nova.compute.manager [-] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:10:39 np0005603622 nova_compute[221301]: 2026-01-31 08:10:39.081 221324 DEBUG nova.compute.manager [None req-b0096e8f-d847-4c89-b705-260a8d881c62 - - - - - -] [instance: e528a53a-8ada-4966-912c-1f15ed61e649] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:10:39 np0005603622 nova_compute[221301]: 2026-01-31 08:10:39.152 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:39.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:39 np0005603622 nova_compute[221301]: 2026-01-31 08:10:39.715 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:39.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:40 np0005603622 nova_compute[221301]: 2026-01-31 08:10:40.026 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:40 np0005603622 nova_compute[221301]: 2026-01-31 08:10:40.627 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:10:40 np0005603622 nova_compute[221301]: 2026-01-31 08:10:40.699 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:10:40 np0005603622 nova_compute[221301]: 2026-01-31 08:10:40.700 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:10:40 np0005603622 nova_compute[221301]: 2026-01-31 08:10:40.789 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:41.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:41.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:43.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:43.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:44 np0005603622 nova_compute[221301]: 2026-01-31 08:10:44.191 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:44 np0005603622 nova_compute[221301]: 2026-01-31 08:10:44.717 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:45.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:45.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:46 np0005603622 nova_compute[221301]: 2026-01-31 08:10:46.272 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:47.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:47.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:48 np0005603622 nova_compute[221301]: 2026-01-31 08:10:48.320 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:49 np0005603622 nova_compute[221301]: 2026-01-31 08:10:49.195 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:49 np0005603622 nova_compute[221301]: 2026-01-31 08:10:49.261 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:49.262 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:10:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:49.263 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:10:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:49.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:49 np0005603622 nova_compute[221301]: 2026-01-31 08:10:49.720 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:49.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:10:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:10:50 np0005603622 nova_compute[221301]: 2026-01-31 08:10:50.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:50 np0005603622 nova_compute[221301]: 2026-01-31 08:10:50.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:10:50 np0005603622 nova_compute[221301]: 2026-01-31 08:10:50.441 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:10:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:10:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:51.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:10:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:51.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:10:52Z|00280|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:10:52 np0005603622 nova_compute[221301]: 2026-01-31 08:10:52.948 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:53.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:10:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:53.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:10:54 np0005603622 nova_compute[221301]: 2026-01-31 08:10:54.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:54 np0005603622 nova_compute[221301]: 2026-01-31 08:10:54.197 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:54 np0005603622 nova_compute[221301]: 2026-01-31 08:10:54.760 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:54 np0005603622 nova_compute[221301]: 2026-01-31 08:10:54.836 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:55 np0005603622 podman[250835]: 2026-01-31 08:10:55.223463655 +0000 UTC m=+0.142586546 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:10:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:55.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:55.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:56 np0005603622 nova_compute[221301]: 2026-01-31 08:10:56.007 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:10:56 np0005603622 nova_compute[221301]: 2026-01-31 08:10:56.041 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:10:56 np0005603622 nova_compute[221301]: 2026-01-31 08:10:56.042 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:56 np0005603622 nova_compute[221301]: 2026-01-31 08:10:56.043 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:10:56 np0005603622 nova_compute[221301]: 2026-01-31 08:10:56.090 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:10:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:10:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:57.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:57.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:58 np0005603622 podman[250912]: 2026-01-31 08:10:58.12322123 +0000 UTC m=+0.052287248 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.441 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.442 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.465 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.632 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.633 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.640 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.640 221324 INFO nova.compute.claims [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.689 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:58 np0005603622 nova_compute[221301]: 2026-01-31 08:10:58.893 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.200 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:10:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:10:59.264 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:10:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:10:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3826185153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.365 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.373 221324 DEBUG nova.compute.provider_tree [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.405 221324 DEBUG nova.scheduler.client.report [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.433 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.434 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.489 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.490 221324 DEBUG nova.network.neutron [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.525 221324 INFO nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.598 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:10:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:10:59.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.804 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:10:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:10:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:10:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:10:59.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.896 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.898 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.899 221324 INFO nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Creating image(s)
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.939 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:10:59 np0005603622 nova_compute[221301]: 2026-01-31 08:10:59.982 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.023 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.028 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.055 221324 DEBUG nova.policy [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '111fdaf79c084a91902fe37a7a502020', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58e900992be7400fb940ca20f13e12d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.089 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.090 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.090 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.091 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.127 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.131 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:11:00 np0005603622 nova_compute[221301]: 2026-01-31 08:11:00.961 221324 DEBUG nova.network.neutron [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Successfully created port: d4dade51-eb59-41d5-9971-370ad716e47b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:11:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:01.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:01.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.162 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:11:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.306 221324 DEBUG nova.network.neutron [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Successfully updated port: d4dade51-eb59-41d5-9971-370ad716e47b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.332 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.333 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.333 221324 DEBUG nova.network.neutron [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.392 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.261s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.486 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] resizing rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.533 221324 DEBUG nova.compute.manager [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-changed-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.534 221324 DEBUG nova.compute.manager [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Refreshing instance network info cache due to event network-changed-d4dade51-eb59-41d5-9971-370ad716e47b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.534 221324 DEBUG oslo_concurrency.lockutils [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.616 221324 DEBUG nova.objects.instance [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.643 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.644 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Ensure instance console log exists: /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.645 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.645 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:11:02 np0005603622 nova_compute[221301]: 2026-01-31 08:11:02.645 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:11:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:03.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:03.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:04 np0005603622 nova_compute[221301]: 2026-01-31 08:11:04.060 221324 DEBUG nova.network.neutron [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:11:04 np0005603622 nova_compute[221301]: 2026-01-31 08:11:04.205 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:11:04 np0005603622 nova_compute[221301]: 2026-01-31 08:11:04.806 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.458 221324 DEBUG nova.network.neutron [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Updating instance_info_cache with network_info: [{"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.554 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.554 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance network_info: |[{"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.555 221324 DEBUG oslo_concurrency.lockutils [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.556 221324 DEBUG nova.network.neutron [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Refreshing network info cache for port d4dade51-eb59-41d5-9971-370ad716e47b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.560 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Start _get_guest_xml network_info=[{"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.566 221324 WARNING nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.571 221324 DEBUG nova.virt.libvirt.host [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.572 221324 DEBUG nova.virt.libvirt.host [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.575 221324 DEBUG nova.virt.libvirt.host [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.576 221324 DEBUG nova.virt.libvirt.host [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.577 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.578 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.578 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.579 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.579 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.579 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.580 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.580 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.580 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.581 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.581 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.581 221324 DEBUG nova.virt.hardware [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
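The debug lines above show nova.virt.hardware enumerating the sockets:cores:threads factorizations that exactly cover the flavor's 1 vCPU (limits default to 65536 each, so only 1:1:1 survives). A minimal sketch of that enumeration — hypothetical helper names, not Nova's actual implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Topology:
    sockets: int
    cores: int
    threads: int

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate sockets*cores*threads factorizations that exactly equal vcpus."""
    found = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        for c in range(1, min(vcpus // s, max_cores) + 1):
            if (vcpus // s) % c:
                continue
            t = vcpus // (s * c)          # threads is whatever remains
            if t <= max_threads:
                found.append(Topology(s, c, t))
    return found
```

For `vcpus=1` this yields the single `Topology(sockets=1, cores=1, threads=1)` the log reports as "Got 1 possible topologies".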
Jan 31 03:11:05 np0005603622 nova_compute[221301]: 2026-01-31 08:11:05.585 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:05.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:05.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3544229572' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.057 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.103 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.110 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2888722518' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.539 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
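The two `ceph mon dump --format=json` subprocesses above are how nova.storage.rbd_utils discovers monitor addresses for the RBD `<host>` elements in the guest XML. A hedged sketch of that parsing step — the JSON shape follows the `mon dump` output of recent Ceph releases, but treat the exact field names as an assumption:

```python
import json

def monitor_hosts(mon_dump_json):
    """Extract (host, port) pairs for the v1 endpoints from `ceph mon dump --format=json`."""
    doc = json.loads(mon_dump_json)
    hosts = []
    for mon in doc["mons"]:
        for ep in mon["public_addrs"]["addrvec"]:
            if ep["type"] == "v1":        # libvirt's rbd driver uses the v1 port (6789)
                host, _, port = ep["addr"].rpartition(":")
                hosts.append((host, int(port)))
    return hosts

# Trimmed sample in the assumed format, using an address from this log:
sample = '''{"epoch": 3, "mons": [
  {"name": "a", "public_addrs": {"addrvec": [
    {"type": "v2", "addr": "192.168.122.100:3300", "nonce": 0},
    {"type": "v1", "addr": "192.168.122.100:6789", "nonce": 0}]}}]}'''
```

The three monitors extracted this way (192.168.122.100/.101/.102, port 6789) are exactly what appears in the `<source protocol="rbd">` stanzas below.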
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.542 221324 DEBUG nova.virt.libvirt.vif [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-996778047',display_name='tempest-ServerDiskConfigTestJSON-server-996778047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-996778047',id=84,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-slqfoa0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskC
onfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:59Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.543 221324 DEBUG nova.network.os_vif_util [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.545 221324 DEBUG nova.network.os_vif_util [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.546 221324 DEBUG nova.objects.instance [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.601 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <uuid>8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9</uuid>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <name>instance-00000054</name>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-996778047</nova:name>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:11:05</nova:creationTime>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:user uuid="111fdaf79c084a91902fe37a7a502020">tempest-ServerDiskConfigTestJSON-855158150-project-member</nova:user>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:project uuid="58e900992be7400fb940ca20f13e12d1">tempest-ServerDiskConfigTestJSON-855158150</nova:project>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <nova:port uuid="d4dade51-eb59-41d5-9971-370ad716e47b">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <entry name="serial">8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9</entry>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <entry name="uuid">8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9</entry>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:56:26:96"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <target dev="tapd4dade51-eb"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/console.log" append="off"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:11:06 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:11:06 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:11:06 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:11:06 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
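The generated domain XML above can be inspected programmatically, e.g. to pull the RBD monitor endpoints back out of a disk element. A sketch over a trimmed copy of the XML using the stdlib ElementTree (not a Nova API):

```python
import xml.etree.ElementTree as ET

# Trimmed from the _get_guest_xml output in this log:
domain_xml = """<domain type="kvm">
  <devices>
    <disk type="network" device="disk">
      <source protocol="rbd" name="vms/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk">
        <host name="192.168.122.100" port="6789"/>
        <host name="192.168.122.102" port="6789"/>
        <host name="192.168.122.101" port="6789"/>
      </source>
      <target dev="vda" bus="virtio"/>
    </disk>
  </devices>
</domain>"""

def rbd_hosts(xml_text):
    """Return (name, port) for every <host> under an rbd <source>, in document order."""
    root = ET.fromstring(xml_text)
    return [(h.get("name"), int(h.get("port")))
            for h in root.findall("./devices/disk/source[@protocol='rbd']/host")]
```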
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.603 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Preparing to wait for external event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.604 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.605 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.606 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
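The lock acquire/release pair above is prepare_for_instance_event registering, under a per-instance "<uuid>-events" lock, an event the spawn path will later wait on until Neutron delivers network-vif-plugged. A minimal analogue of that pattern with plain threading — class and method names here are hypothetical, not Nova's:

```python
import threading

class InstanceEvents:
    """Hand out one waitable event per (instance, event-name) pair."""

    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # (instance_uuid, name) -> threading.Event

    def prepare(self, instance_uuid, name):
        # Mirrors the "<uuid>-events" lock seen in the log: create-or-get atomically.
        with self._lock:
            return self._events.setdefault((instance_uuid, name), threading.Event())

    def pop_and_fire(self, instance_uuid, name):
        # Called when the external event (e.g. network-vif-plugged-<port>) arrives.
        with self._lock:
            ev = self._events.pop((instance_uuid, name), None)
        if ev:
            ev.set()
        return ev is not None
```

A waiter calls `prepare(...)` then blocks on the returned event's `wait(timeout)`; the external-event handler calls `pop_and_fire(...)` to release it.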
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.607 221324 DEBUG nova.virt.libvirt.vif [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-996778047',display_name='tempest-ServerDiskConfigTestJSON-server-996778047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-996778047',id=84,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-slqfoa0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-S
erverDiskConfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:10:59Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.608 221324 DEBUG nova.network.os_vif_util [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.609 221324 DEBUG nova.network.os_vif_util [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.610 221324 DEBUG os_vif [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.611 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.613 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.614 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.621 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.622 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4dade51-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.623 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4dade51-eb, col_values=(('external_ids', {'iface-id': 'd4dade51-eb59-41d5-9971-370ad716e47b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:26:96', 'vm-uuid': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:06 np0005603622 NetworkManager[49080]: <info>  [1769847066.6272] manager: (tapd4dade51-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.632 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.636 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.637 221324 INFO os_vif [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb')#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.750 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.751 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.751 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No VIF found with MAC fa:16:3e:56:26:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.752 221324 INFO nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Using config drive#033[00m
Jan 31 03:11:06 np0005603622 nova_compute[221301]: 2026-01-31 08:11:06.788 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:07.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:07 np0005603622 nova_compute[221301]: 2026-01-31 08:11:07.685 221324 INFO nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Creating config drive at /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config#033[00m
Jan 31 03:11:07 np0005603622 nova_compute[221301]: 2026-01-31 08:11:07.694 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphq4qu1jp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:07 np0005603622 nova_compute[221301]: 2026-01-31 08:11:07.837 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphq4qu1jp" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:07.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:07 np0005603622 nova_compute[221301]: 2026-01-31 08:11:07.884 221324 DEBUG nova.storage.rbd_utils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:07 np0005603622 nova_compute[221301]: 2026-01-31 08:11:07.891 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.703 221324 DEBUG oslo_concurrency.processutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.812s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.704 221324 INFO nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Deleting local config drive /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config because it was imported into RBD.#033[00m
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.718 221324 DEBUG nova.network.neutron [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Updated VIF entry in instance network info cache for port d4dade51-eb59-41d5-9971-370ad716e47b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.720 221324 DEBUG nova.network.neutron [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Updating instance_info_cache with network_info: [{"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.749 221324 DEBUG oslo_concurrency.lockutils [req-76bd9428-e32a-4825-bcdf-4f9c12cc043a req-7f1cbf3f-0d66-44c2-b406-89689089a327 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:11:08 np0005603622 kernel: tapd4dade51-eb: entered promiscuous mode
Jan 31 03:11:08 np0005603622 NetworkManager[49080]: <info>  [1769847068.7826] manager: (tapd4dade51-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Jan 31 03:11:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:08Z|00281|binding|INFO|Claiming lport d4dade51-eb59-41d5-9971-370ad716e47b for this chassis.
Jan 31 03:11:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:08Z|00282|binding|INFO|d4dade51-eb59-41d5-9971-370ad716e47b: Claiming fa:16:3e:56:26:96 10.100.0.6
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.782 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.794 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:26:96 10.100.0.6'], port_security=['fa:16:3e:56:26:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d4dade51-eb59-41d5-9971-370ad716e47b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.796 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d4dade51-eb59-41d5-9971-370ad716e47b in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 bound to our chassis#033[00m
Jan 31 03:11:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:08Z|00283|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b ovn-installed in OVS
Jan 31 03:11:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:08Z|00284|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b up in Southbound
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.799 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f218695f-c744-4bd8-b2d8-122a920c7ca0#033[00m
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.799 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:08 np0005603622 nova_compute[221301]: 2026-01-31 08:11:08.800 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.819 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39f071a5-e367-4dbe-9469-a5f8d6429bc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.820 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf218695f-c1 in ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.823 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf218695f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.823 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bf6779dd-b1f5-4bf4-95c9-526b07d1563a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.824 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[53a1f364-bce0-49be-8f72-84f403c124d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:08 np0005603622 systemd-machined[190406]: New machine qemu-35-instance-00000054.
Jan 31 03:11:08 np0005603622 systemd-udevd[251257]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.897 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[13c28f31-578e-42e0-b002-915f236a8fff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:08 np0005603622 systemd[1]: Started Virtual Machine qemu-35-instance-00000054.
Jan 31 03:11:08 np0005603622 NetworkManager[49080]: <info>  [1769847068.8997] device (tapd4dade51-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:11:08 np0005603622 NetworkManager[49080]: <info>  [1769847068.9011] device (tapd4dade51-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.921 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e6aad331-567a-42e6-bab0-605079f88aa7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.962 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[832821db-2838-4839-a635-b5882c6d72b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:08.968 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3e1fff-edce-4308-9d01-5da16d161f82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:08 np0005603622 NetworkManager[49080]: <info>  [1769847068.9704] manager: (tapf218695f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/143)
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.007 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d2806826-cf5f-4dc1-8bac-454567028148]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.012 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1cabe3ea-f48d-47eb-84c4-961c55437c75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 NetworkManager[49080]: <info>  [1769847069.0432] device (tapf218695f-c0): carrier: link connected
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.050 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ab3344-130e-4566-bc05-86eb1efd2b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.071 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5a9a1b-51c0-463b-8403-9070cc98122b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632038, 'reachable_time': 41904, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251288, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.088 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b202a7d2-5641-4a27-ae93-6ff19df35bb9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:830'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 632038, 'tstamp': 632038}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251289, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.112 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dce93736-2d2b-405a-adb3-1abbce993ceb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632038, 'reachable_time': 41904, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251290, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.151 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15024407-021d-4f40-9e08-4d29fe9cfa15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.221 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[da40a9ec-b54d-4a15-8197-fe5bdbc6bffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.223 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.223 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.223 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf218695f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.226 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:09 np0005603622 kernel: tapf218695f-c0: entered promiscuous mode
Jan 31 03:11:09 np0005603622 NetworkManager[49080]: <info>  [1769847069.2290] manager: (tapf218695f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.229 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.230 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf218695f-c0, col_values=(('external_ids', {'iface-id': 'd3a551a2-38e3-48d3-bdee-f2493a79eca0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:09Z|00285|binding|INFO|Releasing lport d3a551a2-38e3-48d3-bdee-f2493a79eca0 from this chassis (sb_readonly=0)
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.241 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.242 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.243 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[edb63370-0fe2-44dd-aad2-fe92a0ad1e3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.244 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:11:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:09.246 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'env', 'PROCESS_TAG=haproxy-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f218695f-c744-4bd8-b2d8-122a920c7ca0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.597 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847069.5962, 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.598 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] VM Started (Lifecycle Event)#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.624 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.630 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847069.596594, 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.630 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:11:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:09.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.652 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.656 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:09 np0005603622 podman[251364]: 2026-01-31 08:11:09.603364191 +0000 UTC m=+0.027600664 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.776 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:11:09 np0005603622 nova_compute[221301]: 2026-01-31 08:11:09.832 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:09.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.155 221324 DEBUG nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.156 221324 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.157 221324 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.157 221324 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.158 221324 DEBUG nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Processing event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.158 221324 DEBUG nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.158 221324 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.159 221324 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.159 221324 DEBUG oslo_concurrency.lockutils [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.160 221324 DEBUG nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.160 221324 WARNING nova.compute.manager [req-afc76cc0-527b-446c-b4ad-e19821ab2e9d req-7af7b72c-f5e4-427a-b866-06ce8521459b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.161 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.167 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847070.1675823, 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.168 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.171 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.176 221324 INFO nova.virt.libvirt.driver [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance spawned successfully.#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.177 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:11:10 np0005603622 podman[251364]: 2026-01-31 08:11:10.223251402 +0000 UTC m=+0.647487845 container create d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.336 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.337 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.338 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.339 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.340 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.340 221324 DEBUG nova.virt.libvirt.driver [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.384 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.393 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:10 np0005603622 systemd[1]: Started libpod-conmon-d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c.scope.
Jan 31 03:11:10 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:11:10 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b3aa7649e60dd40b0c7ebf62cb957d2aea00155bfd695bc1c79280b03a51502/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.479 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.509 221324 INFO nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Took 10.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.510 221324 DEBUG nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:10 np0005603622 podman[251364]: 2026-01-31 08:11:10.562211598 +0000 UTC m=+0.986448051 container init d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:11:10 np0005603622 podman[251364]: 2026-01-31 08:11:10.570945933 +0000 UTC m=+0.995182376 container start d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.586 221324 INFO nova.compute.manager [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Took 12.08 seconds to build instance.#033[00m
Jan 31 03:11:10 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251379]: [NOTICE]   (251383) : New worker (251385) forked
Jan 31 03:11:10 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251379]: [NOTICE]   (251383) : Loading success.
Jan 31 03:11:10 np0005603622 nova_compute[221301]: 2026-01-31 08:11:10.617 221324 DEBUG oslo_concurrency.lockutils [None req-3850aaab-6e80-4483-920c-cf27d8043a5f 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:11 np0005603622 nova_compute[221301]: 2026-01-31 08:11:11.630 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:11.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:11.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:13.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:11:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:13.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:11:14 np0005603622 nova_compute[221301]: 2026-01-31 08:11:14.824 221324 INFO nova.compute.manager [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Rebuilding instance#033[00m
Jan 31 03:11:14 np0005603622 nova_compute[221301]: 2026-01-31 08:11:14.840 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.190 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.229 221324 DEBUG nova.compute.manager [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.336 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.378 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.412 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'resources' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.445 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.532 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:11:15 np0005603622 nova_compute[221301]: 2026-01-31 08:11:15.538 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:11:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:15.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:15.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:16 np0005603622 nova_compute[221301]: 2026-01-31 08:11:16.634 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:17.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:17.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:19.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:19 np0005603622 nova_compute[221301]: 2026-01-31 08:11:19.842 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:19.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:21 np0005603622 nova_compute[221301]: 2026-01-31 08:11:21.637 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:21.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:21.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:23.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:23.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:24 np0005603622 nova_compute[221301]: 2026-01-31 08:11:24.841 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:25 np0005603622 nova_compute[221301]: 2026-01-31 08:11:25.432 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:25 np0005603622 nova_compute[221301]: 2026-01-31 08:11:25.595 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:11:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:25.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:25.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:26 np0005603622 podman[251394]: 2026-01-31 08:11:26.224436263 +0000 UTC m=+0.138394303 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 31 03:11:26 np0005603622 nova_compute[221301]: 2026-01-31 08:11:26.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:26 np0005603622 nova_compute[221301]: 2026-01-31 08:11:26.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:11:26 np0005603622 nova_compute[221301]: 2026-01-31 08:11:26.639 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:27.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:27.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:28 np0005603622 kernel: tapd4dade51-eb (unregistering): left promiscuous mode
Jan 31 03:11:28 np0005603622 NetworkManager[49080]: <info>  [1769847088.0367] device (tapd4dade51-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00286|binding|INFO|Releasing lport d4dade51-eb59-41d5-9971-370ad716e47b from this chassis (sb_readonly=0)
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00287|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b down in Southbound
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00288|binding|INFO|Removing iface tapd4dade51-eb ovn-installed in OVS
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.050 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.062 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.066 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:26:96 10.100.0.6'], port_security=['fa:16:3e:56:26:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d4dade51-eb59-41d5-9971-370ad716e47b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.069 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d4dade51-eb59-41d5-9971-370ad716e47b in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.072 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.074 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b26344-6bbd-4fde-b984-73c5a100d5b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.075 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace which is not needed anymore#033[00m
Jan 31 03:11:28 np0005603622 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 31 03:11:28 np0005603622 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000054.scope: Consumed 14.676s CPU time.
Jan 31 03:11:28 np0005603622 systemd-machined[190406]: Machine qemu-35-instance-00000054 terminated.
Jan 31 03:11:28 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251379]: [NOTICE]   (251383) : haproxy version is 2.8.14-c23fe91
Jan 31 03:11:28 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251379]: [NOTICE]   (251383) : path to executable is /usr/sbin/haproxy
Jan 31 03:11:28 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251379]: [WARNING]  (251383) : Exiting Master process...
Jan 31 03:11:28 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251379]: [ALERT]    (251383) : Current worker (251385) exited with code 143 (Terminated)
Jan 31 03:11:28 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251379]: [WARNING]  (251383) : All workers exited. Exiting... (0)
Jan 31 03:11:28 np0005603622 systemd[1]: libpod-d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c.scope: Deactivated successfully.
Jan 31 03:11:28 np0005603622 podman[251445]: 2026-01-31 08:11:28.24155666 +0000 UTC m=+0.056296015 container died d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:11:28 np0005603622 kernel: tapd4dade51-eb: entered promiscuous mode
Jan 31 03:11:28 np0005603622 systemd-udevd[251424]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:28 np0005603622 NetworkManager[49080]: <info>  [1769847088.2669] manager: (tapd4dade51-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Jan 31 03:11:28 np0005603622 kernel: tapd4dade51-eb (unregistering): left promiscuous mode
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.269 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00289|binding|INFO|Claiming lport d4dade51-eb59-41d5-9971-370ad716e47b for this chassis.
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00290|binding|INFO|d4dade51-eb59-41d5-9971-370ad716e47b: Claiming fa:16:3e:56:26:96 10.100.0.6
Jan 31 03:11:28 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c-userdata-shm.mount: Deactivated successfully.
Jan 31 03:11:28 np0005603622 systemd[1]: var-lib-containers-storage-overlay-4b3aa7649e60dd40b0c7ebf62cb957d2aea00155bfd695bc1c79280b03a51502-merged.mount: Deactivated successfully.
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.287 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:26:96 10.100.0.6'], port_security=['fa:16:3e:56:26:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d4dade51-eb59-41d5-9971-370ad716e47b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:28 np0005603622 podman[251445]: 2026-01-31 08:11:28.295555412 +0000 UTC m=+0.110294727 container cleanup d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00291|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b ovn-installed in OVS
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00292|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b up in Southbound
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00293|binding|INFO|Releasing lport d4dade51-eb59-41d5-9971-370ad716e47b from this chassis (sb_readonly=1)
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00294|binding|INFO|Removing iface tapd4dade51-eb ovn-installed in OVS
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00295|if_status|INFO|Not setting lport d4dade51-eb59-41d5-9971-370ad716e47b down as sb is readonly
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.301 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00296|binding|INFO|Releasing lport d4dade51-eb59-41d5-9971-370ad716e47b from this chassis (sb_readonly=0)
Jan 31 03:11:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:28Z|00297|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b down in Southbound
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.305 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.310 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.313 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:26:96 10.100.0.6'], port_security=['fa:16:3e:56:26:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d4dade51-eb59-41d5-9971-370ad716e47b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:28 np0005603622 systemd[1]: libpod-conmon-d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c.scope: Deactivated successfully.
Jan 31 03:11:28 np0005603622 podman[251461]: 2026-01-31 08:11:28.343144272 +0000 UTC m=+0.075589954 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 31 03:11:28 np0005603622 podman[251483]: 2026-01-31 08:11:28.378503603 +0000 UTC m=+0.054498957 container remove d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.382 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6192609b-f826-48bc-862e-bb9a77256553]: (4, ('Sat Jan 31 08:11:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c)\nd99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c\nSat Jan 31 08:11:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (d99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c)\nd99312db8dafa58eb2369d2befde7d23a32359765e280817709799b3c33ce02c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.384 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d3474a-6e4e-456d-ac55-cfaa72b294dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.386 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.389 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 kernel: tapf218695f-c0: left promiscuous mode
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.400 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.403 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9abedb37-8c88-434e-82a2-169e90722e82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.416 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b283a1bc-a7b7-46d0-97c0-726b9e13c830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.417 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[82656db5-ac11-4459-b8be-3904a0f563bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.435 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8e94f047-0392-4346-9e36-2c0ed72a7a77]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632029, 'reachable_time': 29245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251513, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.439 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:11:28 np0005603622 systemd[1]: run-netns-ovnmeta\x2df218695f\x2dc744\x2d4bd8\x2db2d8\x2d122a920c7ca0.mount: Deactivated successfully.
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.439 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[0e97db61-caa8-4f4a-96f2-2a542b6526d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.442 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d4dade51-eb59-41d5-9971-370ad716e47b in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.446 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.447 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bcda368c-799f-4108-a67b-2280c926ec75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.448 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d4dade51-eb59-41d5-9971-370ad716e47b in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.451 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:11:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:28.452 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe72fff9-ae17-43e3-9e0b-6fbbf1737f5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.613 221324 INFO nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance shutdown successfully after 13 seconds.#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.622 221324 INFO nova.virt.libvirt.driver [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance destroyed successfully.#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.629 221324 INFO nova.virt.libvirt.driver [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance destroyed successfully.#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.630 221324 DEBUG nova.virt.libvirt.vif [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-996778047',display_name='tempest-ServerDiskConfigTestJSON-server-996778047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-996778047',id=84,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-slqfoa0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-memb
er'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:14Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.631 221324 DEBUG nova.network.os_vif_util [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.632 221324 DEBUG nova.network.os_vif_util [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.633 221324 DEBUG os_vif [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.637 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.638 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4dade51-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.640 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.642 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.647 221324 INFO os_vif [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb')#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.958 221324 DEBUG nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.959 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.960 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.960 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.961 221324 DEBUG nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.961 221324 WARNING nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state rebuilding.#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.962 221324 DEBUG nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.962 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.962 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.963 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.963 221324 DEBUG nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.964 221324 WARNING nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state rebuilding.#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.964 221324 DEBUG nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.964 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:28 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.965 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:28 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.965 221324 DEBUG oslo_concurrency.lockutils [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.966 221324 DEBUG nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:28 np0005603622 nova_compute[221301]: 2026-01-31 08:11:28.966 221324 WARNING nova.compute.manager [req-3329c378-a6b2-45ca-bd5e-aa334d43560f req-c299fb53-672d-4ff3-8968-c1658173f67f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state rebuilding.#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.124 221324 INFO nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Deleting instance files /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_del#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.126 221324 INFO nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Deletion of /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_del complete#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.319 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.320 221324 INFO nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Creating image(s)#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.362 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.410 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.479 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.484 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.486 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:29.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.846 221324 DEBUG nova.virt.libvirt.imagebackend [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0864ca59-9877-4e6d-adfc-f0a3204ed8f8/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0864ca59-9877-4e6d-adfc-f0a3204ed8f8/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:11:29 np0005603622 nova_compute[221301]: 2026-01-31 08:11:29.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:29.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:30.865 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:30.866 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:30.866 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.151 221324 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.151 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.152 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.152 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.153 221324 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.153 221324 WARNING nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.154 221324 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.154 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.155 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.155 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.155 221324 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.156 221324 WARNING nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.156 221324 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.157 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.157 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.157 221324 DEBUG oslo_concurrency.lockutils [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.158 221324 DEBUG nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.158 221324 WARNING nova.compute.manager [req-9314ed88-e56b-435d-9e13-f0b627a50a74 req-0bc4a5d1-3948-4e7d-b3a8-7b85385d40dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.247 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.324 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.325 221324 DEBUG nova.virt.images [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] 0864ca59-9877-4e6d-adfc-f0a3204ed8f8 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.327 221324 DEBUG nova.privsep.utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.328 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.554 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.part /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.559 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.650 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5.converted --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.653 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:31.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.701 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:31 np0005603622 nova_compute[221301]: 2026-01-31 08:11:31.708 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:31.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.113 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.221 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] resizing rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.917 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.918 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Ensure instance console log exists: /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.919 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.920 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.920 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.924 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Start _get_guest_xml network_info=[{"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.931 221324 WARNING nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.938 221324 DEBUG nova.virt.libvirt.host [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.939 221324 DEBUG nova.virt.libvirt.host [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.944 221324 DEBUG nova.virt.libvirt.host [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.945 221324 DEBUG nova.virt.libvirt.host [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.947 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.948 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.949 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.949 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.949 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.950 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.950 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.951 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.951 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.952 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.952 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.952 221324 DEBUG nova.virt.hardware [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.953 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:32 np0005603622 nova_compute[221301]: 2026-01-31 08:11:32.982 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/320934881' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.468 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.509 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.515 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.640 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:33.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.854314) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093854393, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2419, "num_deletes": 253, "total_data_size": 5582740, "memory_usage": 5660808, "flush_reason": "Manual Compaction"}
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093879213, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3647581, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39433, "largest_seqno": 41847, "table_properties": {"data_size": 3637977, "index_size": 5970, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20941, "raw_average_key_size": 20, "raw_value_size": 3618368, "raw_average_value_size": 3578, "num_data_blocks": 259, "num_entries": 1011, "num_filter_entries": 1011, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769846893, "oldest_key_time": 1769846893, "file_creation_time": 1769847093, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 25024 microseconds, and 11833 cpu microseconds.
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.879338) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3647581 bytes OK
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.879369) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.881124) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.881147) EVENT_LOG_v1 {"time_micros": 1769847093881139, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.881177) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 5572106, prev total WAL file size 5572106, number of live WAL files 2.
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.882547) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3562KB)], [75(10161KB)]
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093882624, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 14053446, "oldest_snapshot_seqno": -1}
Jan 31 03:11:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:33.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2002759257' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.954 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.957 221324 DEBUG nova.virt.libvirt.vif [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-996778047',display_name='tempest-ServerDiskConfigTestJSON-server-996778047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-996778047',id=84,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-slqfoa0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:29Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.958 221324 DEBUG nova.network.os_vif_util [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.959 221324 DEBUG nova.network.os_vif_util [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.963 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <uuid>8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9</uuid>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <name>instance-00000054</name>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-996778047</nova:name>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:11:32</nova:creationTime>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:user uuid="111fdaf79c084a91902fe37a7a502020">tempest-ServerDiskConfigTestJSON-855158150-project-member</nova:user>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:project uuid="58e900992be7400fb940ca20f13e12d1">tempest-ServerDiskConfigTestJSON-855158150</nova:project>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <nova:port uuid="d4dade51-eb59-41d5-9971-370ad716e47b">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <entry name="serial">8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9</entry>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <entry name="uuid">8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9</entry>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:56:26:96"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <target dev="tapd4dade51-eb"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/console.log" append="off"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:11:33 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:11:33 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:11:33 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:11:33 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.965 221324 DEBUG nova.compute.manager [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Preparing to wait for external event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.966 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.967 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.967 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.969 221324 DEBUG nova.virt.libvirt.vif [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-996778047',display_name='tempest-ServerDiskConfigTestJSON-server-996778047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-996778047',id=84,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-slqfoa0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:11:29Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.969 221324 DEBUG nova.network.os_vif_util [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.971 221324 DEBUG nova.network.os_vif_util [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.971 221324 DEBUG os_vif [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.972 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.973 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.974 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.977 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.978 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4dade51-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.979 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4dade51-eb, col_values=(('external_ids', {'iface-id': 'd4dade51-eb59-41d5-9971-370ad716e47b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:26:96', 'vm-uuid': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.981 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603622 NetworkManager[49080]: <info>  [1769847093.9838] manager: (tapd4dade51-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.984 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6783 keys, 12004121 bytes, temperature: kUnknown
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093986192, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 12004121, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11956427, "index_size": 29662, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 173702, "raw_average_key_size": 25, "raw_value_size": 11832809, "raw_average_value_size": 1744, "num_data_blocks": 1182, "num_entries": 6783, "num_filter_entries": 6783, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847093, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.986570) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 12004121 bytes
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.988145) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.5 rd, 115.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 9.9 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(7.1) write-amplify(3.3) OK, records in: 7309, records dropped: 526 output_compression: NoCompression
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.988177) EVENT_LOG_v1 {"time_micros": 1769847093988161, "job": 46, "event": "compaction_finished", "compaction_time_micros": 103702, "compaction_time_cpu_micros": 49150, "output_level": 6, "num_output_files": 1, "total_output_size": 12004121, "num_input_records": 7309, "num_output_records": 6783, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093988746, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847093990228, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.882399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.990333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.990344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.990347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.990350) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:11:33.990352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:11:33 np0005603622 nova_compute[221301]: 2026-01-31 08:11:33.991 221324 INFO os_vif [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb')#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.054 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.055 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.056 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No VIF found with MAC fa:16:3e:56:26:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.057 221324 INFO nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Using config drive#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.099 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.122 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.161 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'keypairs' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.416 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.417 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.417 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.417 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.418 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.848 221324 INFO nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Creating config drive at /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.855 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpplz6tdxl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/342360510' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.887 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.891 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.988 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.989 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.992 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000054 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.993 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000054 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:11:34 np0005603622 nova_compute[221301]: 2026-01-31 08:11:34.997 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpplz6tdxl" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.043 221324 DEBUG nova.storage.rbd_utils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.048 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.252 221324 DEBUG oslo_concurrency.processutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.203s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.254 221324 INFO nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Deleting local config drive /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9/disk.config because it was imported into RBD.#033[00m
Jan 31 03:11:35 np0005603622 kernel: tapd4dade51-eb: entered promiscuous mode
Jan 31 03:11:35 np0005603622 NetworkManager[49080]: <info>  [1769847095.3211] manager: (tapd4dade51-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/147)
Jan 31 03:11:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:35Z|00298|binding|INFO|Claiming lport d4dade51-eb59-41d5-9971-370ad716e47b for this chassis.
Jan 31 03:11:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:35Z|00299|binding|INFO|d4dade51-eb59-41d5-9971-370ad716e47b: Claiming fa:16:3e:56:26:96 10.100.0.6
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.320 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.330 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:26:96 10.100.0.6'], port_security=['fa:16:3e:56:26:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '7', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d4dade51-eb59-41d5-9971-370ad716e47b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.332 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d4dade51-eb59-41d5-9971-370ad716e47b in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 bound to our chassis#033[00m
Jan 31 03:11:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:35Z|00300|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b ovn-installed in OVS
Jan 31 03:11:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:35Z|00301|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b up in Southbound
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.336 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f218695f-c744-4bd8-b2d8-122a920c7ca0#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.336 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.348 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a6ba91-fc01-44fd-8cda-210e6268881c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.349 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf218695f-c1 in ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:11:35 np0005603622 systemd-udevd[251870]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.358 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf218695f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.359 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6130df51-0c0c-4a6f-82f0-0f688193f7fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.360 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1640bc80-b280-4d61-887a-72c0f91c1665]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 NetworkManager[49080]: <info>  [1769847095.3698] device (tapd4dade51-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:11:35 np0005603622 NetworkManager[49080]: <info>  [1769847095.3720] device (tapd4dade51-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:11:35 np0005603622 systemd-machined[190406]: New machine qemu-36-instance-00000054.
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.373 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[55fb1f79-1345-49fb-b3cf-d1380cb8dac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 systemd[1]: Started Virtual Machine qemu-36-instance-00000054.
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.390 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[316c2904-af43-4f42-8e72-332fb2103477]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.407 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.409 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4406MB free_disk=20.86536407470703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.410 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.410 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.426 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4b55426b-07a0-47d3-9f07-bbfe9954bb0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 systemd-udevd[251874]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:11:35 np0005603622 NetworkManager[49080]: <info>  [1769847095.4368] manager: (tapf218695f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/148)
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.435 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c01fc3b6-1261-4dbd-92c7-9598ee1176ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.476 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[168bac6d-4a38-47b4-a458-eda810389806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.481 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[97639fbf-2978-47da-b644-56fa6eb34d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.491 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 183fba4a-610d-4b96-a12f-996f9f2dc30b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.492 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.492 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.493 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:11:35 np0005603622 NetworkManager[49080]: <info>  [1769847095.5155] device (tapf218695f-c0): carrier: link connected
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.519 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5da602-88fe-4025-8340-576ff76f525b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.537 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5061fd-e963-41b1-9c4c-bc181f7add82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634685, 'reachable_time': 28646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251903, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.544 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.558 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5a0ad6-f482-4cd6-b16e-3301935d71d1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:830'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 634685, 'tstamp': 634685}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251904, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.572 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15a056cc-adef-4e30-8a6c-0f6039bba195]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634685, 'reachable_time': 28646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251906, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.611 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[52f9fc6e-eb29-4e25-8c02-3f9c109278e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:35.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.681 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f30c22f3-091b-44f4-9c38-b0b75dd8de30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.683 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.683 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.684 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf218695f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.723 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603622 NetworkManager[49080]: <info>  [1769847095.7239] manager: (tapf218695f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Jan 31 03:11:35 np0005603622 kernel: tapf218695f-c0: entered promiscuous mode
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.729 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf218695f-c0, col_values=(('external_ids', {'iface-id': 'd3a551a2-38e3-48d3-bdee-f2493a79eca0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:35Z|00302|binding|INFO|Releasing lport d3a551a2-38e3-48d3-bdee-f2493a79eca0 from this chassis (sb_readonly=0)
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.742 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.744 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.745 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[230823b4-4bdd-4a69-9848-e38e217eaefc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.746 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:11:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:35.747 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'env', 'PROCESS_TAG=haproxy-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f218695f-c744-4bd8-b2d8-122a920c7ca0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:11:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:35.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2522985369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.976 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:35 np0005603622 nova_compute[221301]: 2026-01-31 08:11:35.985 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.007 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.045 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.046 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:36 np0005603622 podman[251959]: 2026-01-31 08:11:36.167074754 +0000 UTC m=+0.064652070 container create 3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:11:36 np0005603622 podman[251959]: 2026-01-31 08:11:36.132319079 +0000 UTC m=+0.029896465 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:11:36 np0005603622 systemd[1]: Started libpod-conmon-3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4.scope.
Jan 31 03:11:36 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:11:36 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/858d2a5428484fdc4da87809e331455e89d74935adcb5a7851a7b64162580d35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:11:36 np0005603622 podman[251959]: 2026-01-31 08:11:36.283945067 +0000 UTC m=+0.181522443 container init 3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:11:36 np0005603622 podman[251959]: 2026-01-31 08:11:36.294392988 +0000 UTC m=+0.191970304 container start 3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:11:36 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251993]: [NOTICE]   (252020) : New worker (252022) forked
Jan 31 03:11:36 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251993]: [NOTICE]   (252020) : Loading success.
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.384 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.385 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847096.3828, 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.385 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] VM Started (Lifecycle Event)#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.409 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.414 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847096.3837352, 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.415 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.453 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.458 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:36 np0005603622 nova_compute[221301]: 2026-01-31 08:11:36.477 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:11:37 np0005603622 nova_compute[221301]: 2026-01-31 08:11:37.047 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:11:37 np0005603622 nova_compute[221301]: 2026-01-31 08:11:37.049 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:11:37 np0005603622 nova_compute[221301]: 2026-01-31 08:11:37.049 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:11:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:37.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:37 np0005603622 nova_compute[221301]: 2026-01-31 08:11:37.783 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:11:37 np0005603622 nova_compute[221301]: 2026-01-31 08:11:37.784 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:11:37 np0005603622 nova_compute[221301]: 2026-01-31 08:11:37.784 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:11:37 np0005603622 nova_compute[221301]: 2026-01-31 08:11:37.785 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:37.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.973 221324 DEBUG nova.compute.manager [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.975 221324 DEBUG oslo_concurrency.lockutils [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.976 221324 DEBUG oslo_concurrency.lockutils [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.976 221324 DEBUG oslo_concurrency.lockutils [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.976 221324 DEBUG nova.compute.manager [req-989b2742-353a-46d0-a0a9-d1dbfc8e43fc req-af4c9140-665d-46d8-8c33-3497b4d2bd6f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Processing event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.978 221324 DEBUG nova.compute.manager [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.983 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847098.9826913, 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.983 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.986 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.994 221324 INFO nova.virt.libvirt.driver [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance spawned successfully.#033[00m
Jan 31 03:11:38 np0005603622 nova_compute[221301]: 2026-01-31 08:11:38.994 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.007 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.012 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.025 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.026 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.028 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.028 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.029 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.030 221324 DEBUG nova.virt.libvirt.driver [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.036 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.085 221324 DEBUG nova.compute.manager [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.147 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.148 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.148 221324 DEBUG nova.objects.instance [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.215 221324 DEBUG oslo_concurrency.lockutils [None req-39a56a84-67c9-49ae-a5f5-fb42c9951887 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:39.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:39 np0005603622 nova_compute[221301]: 2026-01-31 08:11:39.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:39.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:40 np0005603622 nova_compute[221301]: 2026-01-31 08:11:40.664 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:40 np0005603622 nova_compute[221301]: 2026-01-31 08:11:40.686 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:11:40 np0005603622 nova_compute[221301]: 2026-01-31 08:11:40.686 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:11:41 np0005603622 nova_compute[221301]: 2026-01-31 08:11:41.137 221324 DEBUG nova.compute.manager [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:41 np0005603622 nova_compute[221301]: 2026-01-31 08:11:41.138 221324 DEBUG oslo_concurrency.lockutils [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:41 np0005603622 nova_compute[221301]: 2026-01-31 08:11:41.138 221324 DEBUG oslo_concurrency.lockutils [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:41 np0005603622 nova_compute[221301]: 2026-01-31 08:11:41.139 221324 DEBUG oslo_concurrency.lockutils [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:41 np0005603622 nova_compute[221301]: 2026-01-31 08:11:41.139 221324 DEBUG nova.compute.manager [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:41 np0005603622 nova_compute[221301]: 2026-01-31 08:11:41.140 221324 WARNING nova.compute.manager [req-11c3c2d9-1bfc-4fb9-8ea4-8b70af085545 req-eb4df482-ad89-46f2-aa16-42ad1b40a2eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state None.#033[00m
Jan 31 03:11:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:41.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:41.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.075 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.076 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.077 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.077 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.077 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.080 221324 INFO nova.compute.manager [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Terminating instance#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.081 221324 DEBUG nova.compute.manager [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:11:42 np0005603622 kernel: tapd4dade51-eb (unregistering): left promiscuous mode
Jan 31 03:11:42 np0005603622 NetworkManager[49080]: <info>  [1769847102.1305] device (tapd4dade51-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:11:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:42Z|00303|binding|INFO|Releasing lport d4dade51-eb59-41d5-9971-370ad716e47b from this chassis (sb_readonly=0)
Jan 31 03:11:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:42Z|00304|binding|INFO|Setting lport d4dade51-eb59-41d5-9971-370ad716e47b down in Southbound
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.134 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:11:42Z|00305|binding|INFO|Removing iface tapd4dade51-eb ovn-installed in OVS
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.139 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.143 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:26:96 10.100.0.6'], port_security=['fa:16:3e:56:26:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '8', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d4dade51-eb59-41d5-9971-370ad716e47b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.145 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d4dade51-eb59-41d5-9971-370ad716e47b in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.148 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.149 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca785ca5-40ee-4c2e-b137-460f92668b75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.150 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace which is not needed anymore#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.155 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 31 03:11:42 np0005603622 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000054.scope: Consumed 4.207s CPU time.
Jan 31 03:11:42 np0005603622 systemd-machined[190406]: Machine qemu-36-instance-00000054 terminated.
Jan 31 03:11:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:42 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251993]: [NOTICE]   (252020) : haproxy version is 2.8.14-c23fe91
Jan 31 03:11:42 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251993]: [NOTICE]   (252020) : path to executable is /usr/sbin/haproxy
Jan 31 03:11:42 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251993]: [WARNING]  (252020) : Exiting Master process...
Jan 31 03:11:42 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251993]: [ALERT]    (252020) : Current worker (252022) exited with code 143 (Terminated)
Jan 31 03:11:42 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[251993]: [WARNING]  (252020) : All workers exited. Exiting... (0)
Jan 31 03:11:42 np0005603622 systemd[1]: libpod-3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4.scope: Deactivated successfully.
Jan 31 03:11:42 np0005603622 podman[252057]: 2026-01-31 08:11:42.305327572 +0000 UTC m=+0.051168217 container died 3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.309 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.318 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.326 221324 INFO nova.virt.libvirt.driver [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Instance destroyed successfully.#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.327 221324 DEBUG nova.objects.instance [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'resources' on Instance uuid 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:11:42 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4-userdata-shm.mount: Deactivated successfully.
Jan 31 03:11:42 np0005603622 systemd[1]: var-lib-containers-storage-overlay-858d2a5428484fdc4da87809e331455e89d74935adcb5a7851a7b64162580d35-merged.mount: Deactivated successfully.
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.340 221324 DEBUG nova.virt.libvirt.vif [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:10:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-996778047',display_name='tempest-ServerDiskConfigTestJSON-server-996778047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-996778047',id=84,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:11:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-slqfoa0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:11:39Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.341 221324 DEBUG nova.network.os_vif_util [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "d4dade51-eb59-41d5-9971-370ad716e47b", "address": "fa:16:3e:56:26:96", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4dade51-eb", "ovs_interfaceid": "d4dade51-eb59-41d5-9971-370ad716e47b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.342 221324 DEBUG nova.network.os_vif_util [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.342 221324 DEBUG os_vif [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.345 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.346 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4dade51-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.348 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.350 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 podman[252057]: 2026-01-31 08:11:42.355878492 +0000 UTC m=+0.101719147 container cleanup 3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.356 221324 INFO os_vif [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:26:96,bridge_name='br-int',has_traffic_filtering=True,id=d4dade51-eb59-41d5-9971-370ad716e47b,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4dade51-eb')#033[00m
Jan 31 03:11:42 np0005603622 systemd[1]: libpod-conmon-3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4.scope: Deactivated successfully.
Jan 31 03:11:42 np0005603622 podman[252106]: 2026-01-31 08:11:42.433466848 +0000 UTC m=+0.054560808 container remove 3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.437 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aba10aba-f933-485a-9695-960cbe4c1bc4]: (4, ('Sat Jan 31 08:11:42 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4)\n3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4\nSat Jan 31 08:11:42 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4)\n3dc13c1256c5ecaf06b2dc6e1e169e5b34dd9ba0277be58325bcdb823ae2a2e4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.439 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[894ce91b-202a-4f41-8acc-b2a07377e729]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.441 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 kernel: tapf218695f-c0: left promiscuous mode
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.453 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[99671eb8-9b1f-4dbf-a4f1-5afd1b56bd27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.473 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba2dabc-f9f0-4dc1-9f05-a8d61372758d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.476 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9958cf31-a4c9-4e6a-b476-caf9efec4091]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.493 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1eca60e1-9c3f-495c-81f1-8f01cc375c9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634675, 'reachable_time': 38989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252133, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 systemd[1]: run-netns-ovnmeta\x2df218695f\x2dc744\x2d4bd8\x2db2d8\x2d122a920c7ca0.mount: Deactivated successfully.
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.497 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:11:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:42.498 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[27149530-438f-42f6-8f5c-3d8d2094f035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.884 221324 INFO nova.virt.libvirt.driver [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Deleting instance files /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_del#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.885 221324 INFO nova.virt.libvirt.driver [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Deletion of /var/lib/nova/instances/8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9_del complete#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.952 221324 INFO nova.compute.manager [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.953 221324 DEBUG oslo.service.loopingcall [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.954 221324 DEBUG nova.compute.manager [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:11:42 np0005603622 nova_compute[221301]: 2026-01-31 08:11:42.954 221324 DEBUG nova.network.neutron [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.252 221324 DEBUG nova.compute.manager [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.252 221324 DEBUG oslo_concurrency.lockutils [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.253 221324 DEBUG oslo_concurrency.lockutils [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.254 221324 DEBUG oslo_concurrency.lockutils [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.254 221324 DEBUG nova.compute.manager [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.255 221324 DEBUG nova.compute.manager [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-unplugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.255 221324 DEBUG nova.compute.manager [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.256 221324 DEBUG oslo_concurrency.lockutils [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.256 221324 DEBUG oslo_concurrency.lockutils [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.257 221324 DEBUG oslo_concurrency.lockutils [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.258 221324 DEBUG nova.compute.manager [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] No waiting events found dispatching network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:11:43 np0005603622 nova_compute[221301]: 2026-01-31 08:11:43.258 221324 WARNING nova.compute.manager [req-4b69cad0-071c-4a7d-9981-c8f498a0c15e req-a09a417d-c40d-4283-bf80-815c8414b946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received unexpected event network-vif-plugged-d4dade51-eb59-41d5-9971-370ad716e47b for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:11:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:43.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:43.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:44 np0005603622 nova_compute[221301]: 2026-01-31 08:11:44.630 221324 DEBUG nova.network.neutron [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:11:44 np0005603622 nova_compute[221301]: 2026-01-31 08:11:44.648 221324 INFO nova.compute.manager [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Took 1.69 seconds to deallocate network for instance.#033[00m
Jan 31 03:11:44 np0005603622 nova_compute[221301]: 2026-01-31 08:11:44.686 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:11:44 np0005603622 nova_compute[221301]: 2026-01-31 08:11:44.687 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:11:44 np0005603622 nova_compute[221301]: 2026-01-31 08:11:44.753 221324 DEBUG nova.compute.manager [req-4a461e8c-884c-4df1-8609-f8d6dfea9a11 req-3ab1411a-d803-4c0c-b545-184e82490dd2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Received event network-vif-deleted-d4dade51-eb59-41d5-9971-370ad716e47b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:11:44 np0005603622 nova_compute[221301]: 2026-01-31 08:11:44.794 221324 DEBUG oslo_concurrency.processutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:11:44 np0005603622 nova_compute[221301]: 2026-01-31 08:11:44.853 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:11:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3318990936' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:11:45 np0005603622 nova_compute[221301]: 2026-01-31 08:11:45.258 221324 DEBUG oslo_concurrency.processutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:11:45 np0005603622 nova_compute[221301]: 2026-01-31 08:11:45.264 221324 DEBUG nova.compute.provider_tree [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:11:45 np0005603622 nova_compute[221301]: 2026-01-31 08:11:45.289 221324 DEBUG nova.scheduler.client.report [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:11:45 np0005603622 nova_compute[221301]: 2026-01-31 08:11:45.320 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:45 np0005603622 nova_compute[221301]: 2026-01-31 08:11:45.363 221324 INFO nova.scheduler.client.report [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Deleted allocations for instance 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9#033[00m
Jan 31 03:11:45 np0005603622 nova_compute[221301]: 2026-01-31 08:11:45.479 221324 DEBUG oslo_concurrency.lockutils [None req-bf2a6ad0-d2cf-460d-9253-447c22f616c7 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:11:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:45.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:45.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:47 np0005603622 nova_compute[221301]: 2026-01-31 08:11:47.380 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:47.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:47.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:49.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:49 np0005603622 nova_compute[221301]: 2026-01-31 08:11:49.855 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:49.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:51.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:51.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:52 np0005603622 nova_compute[221301]: 2026-01-31 08:11:52.414 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:52.738 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:11:52 np0005603622 nova_compute[221301]: 2026-01-31 08:11:52.738 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:52.740 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:11:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:11:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:53.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:11:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:53.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:54 np0005603622 nova_compute[221301]: 2026-01-31 08:11:54.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:11:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:55.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:11:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:55.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:56 np0005603622 podman[252181]: 2026-01-31 08:11:56.779756142 +0000 UTC m=+0.113733400 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127)
Jan 31 03:11:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:11:57 np0005603622 nova_compute[221301]: 2026-01-31 08:11:57.326 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847102.3241844, 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:11:57 np0005603622 nova_compute[221301]: 2026-01-31 08:11:57.327 221324 INFO nova.compute.manager [-] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:11:57 np0005603622 nova_compute[221301]: 2026-01-31 08:11:57.367 221324 DEBUG nova.compute.manager [None req-02a9df07-cab9-4310-887a-9b0d58e8bd1e - - - - - -] [instance: 8cdfc9c6-7e9a-4a9f-8ca8-f3721e1dc0d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:11:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:11:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:11:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:11:57 np0005603622 nova_compute[221301]: 2026-01-31 08:11:57.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:57.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:57.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:59 np0005603622 podman[252318]: 2026-01-31 08:11:59.125466816 +0000 UTC m=+0.054981870 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:11:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:11:59.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:11:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:11:59.743 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:11:59 np0005603622 nova_compute[221301]: 2026-01-31 08:11:59.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:11:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:11:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:11:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:11:59.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.273 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.274 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.293 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.385 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.385 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.393 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.394 221324 INFO nova.compute.claims [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:12:00 np0005603622 nova_compute[221301]: 2026-01-31 08:12:00.527 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2795853240' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.038 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.045 221324 DEBUG nova.compute.provider_tree [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.084 221324 DEBUG nova.scheduler.client.report [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.107 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.107 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.147 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.148 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.170 221324 INFO nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.193 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.294 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.296 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.303 221324 INFO nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Creating image(s)#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.348 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.392 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.437 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.444 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.473 221324 DEBUG nova.policy [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55f81600a60b49aaae5b4c28549afdaf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88b896f61c644b6fac0351ce6828b6e1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.519 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.520 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.520 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.521 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.550 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.555 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:01.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.879 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.324s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:01.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:01 np0005603622 nova_compute[221301]: 2026-01-31 08:12:01.968 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] resizing rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.119 221324 DEBUG nova.objects.instance [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lazy-loading 'migration_context' on Instance uuid 82428132-ddb9-49b8-83a5-603ff90fb7a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.132 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.133 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Ensure instance console log exists: /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:12:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.133 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 28K writes, 113K keys, 28K commit groups, 1.0 writes per commit group, ingest: 0.11 GB, 0.04 MB/s#012Cumulative WAL: 28K writes, 9900 syncs, 2.90 writes per sync, written: 0.11 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 9657 writes, 37K keys, 9657 commit groups, 1.0 writes per commit group, ingest: 36.89 MB, 0.06 MB/s#012Interval WAL: 9656 writes, 3895 syncs, 2.48 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.134 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.134 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.506 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:02 np0005603622 nova_compute[221301]: 2026-01-31 08:12:02.859 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Successfully created port: 09451288-598f-46f6-be9e-b950f73f24f7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:12:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:03.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:12:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:12:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:03.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:04 np0005603622 nova_compute[221301]: 2026-01-31 08:12:04.267 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Successfully created port: 53194663-ae7f-4f54-9034-942c5881d66b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:12:04 np0005603622 nova_compute[221301]: 2026-01-31 08:12:04.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:05 np0005603622 nova_compute[221301]: 2026-01-31 08:12:05.338 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Successfully updated port: 09451288-598f-46f6-be9e-b950f73f24f7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:12:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:05.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:05.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:06 np0005603622 nova_compute[221301]: 2026-01-31 08:12:06.670 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Successfully updated port: 53194663-ae7f-4f54-9034-942c5881d66b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:12:06 np0005603622 nova_compute[221301]: 2026-01-31 08:12:06.733 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:06 np0005603622 nova_compute[221301]: 2026-01-31 08:12:06.733 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquired lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:06 np0005603622 nova_compute[221301]: 2026-01-31 08:12:06.734 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:12:06 np0005603622 nova_compute[221301]: 2026-01-31 08:12:06.948 221324 DEBUG nova.compute.manager [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-changed-09451288-598f-46f6-be9e-b950f73f24f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:06 np0005603622 nova_compute[221301]: 2026-01-31 08:12:06.948 221324 DEBUG nova.compute.manager [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Refreshing instance network info cache due to event network-changed-09451288-598f-46f6-be9e-b950f73f24f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:12:06 np0005603622 nova_compute[221301]: 2026-01-31 08:12:06.949 221324 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:07 np0005603622 nova_compute[221301]: 2026-01-31 08:12:07.006 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:12:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:07 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/681844042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:07 np0005603622 nova_compute[221301]: 2026-01-31 08:12:07.508 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:07.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:07.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.583 221324 DEBUG nova.network.neutron [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Updating instance_info_cache with network_info: [{"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.633 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Releasing lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.634 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Instance network_info: |[{"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.634 221324 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.634 221324 DEBUG nova.network.neutron [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Refreshing network info cache for port 09451288-598f-46f6-be9e-b950f73f24f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.639 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Start _get_guest_xml network_info=[{"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.646 221324 WARNING nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.652 221324 DEBUG nova.virt.libvirt.host [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.653 221324 DEBUG nova.virt.libvirt.host [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.656 221324 DEBUG nova.virt.libvirt.host [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.657 221324 DEBUG nova.virt.libvirt.host [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.659 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.659 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.659 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.660 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.660 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.660 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.660 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.660 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.661 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.662 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.663 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.663 221324 DEBUG nova.virt.hardware [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.667 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:09.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:09 np0005603622 nova_compute[221301]: 2026-01-31 08:12:09.904 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:09.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3828853113' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.118 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.160 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.166 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2444786878' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.655 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.657 221324 DEBUG nova.virt.libvirt.vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-619605874',display_name='tempest-ServersTestMultiNic-server-619605874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-619605874',id=89,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-icxi91ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:01Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=82428132-ddb9-49b8-83a5-603ff90fb7a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.658 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.659 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:a5:d0,bridge_name='br-int',has_traffic_filtering=True,id=09451288-598f-46f6-be9e-b950f73f24f7,network=Network(fd3db6e1-c411-452c-973b-3f63c98ddf49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09451288-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.661 221324 DEBUG nova.virt.libvirt.vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-619605874',display_name='tempest-ServersTestMultiNic-server-619605874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-619605874',id=89,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-icxi91ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:01Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=82428132-ddb9-49b8-83a5-603ff90fb7a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.662 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.663 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:36:06,bridge_name='br-int',has_traffic_filtering=True,id=53194663-ae7f-4f54-9034-942c5881d66b,network=Network(5c78a769-e218-4034-aa19-d1bf989f0728),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53194663-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.664 221324 DEBUG nova.objects.instance [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 82428132-ddb9-49b8-83a5-603ff90fb7a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.685 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <uuid>82428132-ddb9-49b8-83a5-603ff90fb7a3</uuid>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <name>instance-00000059</name>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServersTestMultiNic-server-619605874</nova:name>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:12:09</nova:creationTime>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:user uuid="55f81600a60b49aaae5b4c28549afdaf">tempest-ServersTestMultiNic-1053198737-project-member</nova:user>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:project uuid="88b896f61c644b6fac0351ce6828b6e1">tempest-ServersTestMultiNic-1053198737</nova:project>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:port uuid="09451288-598f-46f6-be9e-b950f73f24f7">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.139" ipVersion="4"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <nova:port uuid="53194663-ae7f-4f54-9034-942c5881d66b">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.1.167" ipVersion="4"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <entry name="serial">82428132-ddb9-49b8-83a5-603ff90fb7a3</entry>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <entry name="uuid">82428132-ddb9-49b8-83a5-603ff90fb7a3</entry>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/82428132-ddb9-49b8-83a5-603ff90fb7a3_disk">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/82428132-ddb9-49b8-83a5-603ff90fb7a3_disk.config">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:b9:a5:d0"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <target dev="tap09451288-59"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:6b:36:06"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <target dev="tap53194663-ae"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/console.log" append="off"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:12:10 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:12:10 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:12:10 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:12:10 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.688 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Preparing to wait for external event network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.689 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.690 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.690 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.691 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Preparing to wait for external event network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.691 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.692 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.692 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.693 221324 DEBUG nova.virt.libvirt.vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-619605874',display_name='tempest-ServersTestMultiNic-server-619605874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-619605874',id=89,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-icxi91ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-
1053198737-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:01Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=82428132-ddb9-49b8-83a5-603ff90fb7a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.694 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.695 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:a5:d0,bridge_name='br-int',has_traffic_filtering=True,id=09451288-598f-46f6-be9e-b950f73f24f7,network=Network(fd3db6e1-c411-452c-973b-3f63c98ddf49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09451288-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.696 221324 DEBUG os_vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:a5:d0,bridge_name='br-int',has_traffic_filtering=True,id=09451288-598f-46f6-be9e-b950f73f24f7,network=Network(fd3db6e1-c411-452c-973b-3f63c98ddf49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09451288-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.697 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.699 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.700 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.707 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.707 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09451288-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.708 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09451288-59, col_values=(('external_ids', {'iface-id': '09451288-598f-46f6-be9e-b950f73f24f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:a5:d0', 'vm-uuid': '82428132-ddb9-49b8-83a5-603ff90fb7a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.712 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 NetworkManager[49080]: <info>  [1769847130.7175] manager: (tap09451288-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.717 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.723 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.724 221324 INFO os_vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:a5:d0,bridge_name='br-int',has_traffic_filtering=True,id=09451288-598f-46f6-be9e-b950f73f24f7,network=Network(fd3db6e1-c411-452c-973b-3f63c98ddf49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09451288-59')#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.726 221324 DEBUG nova.virt.libvirt.vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:11:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-619605874',display_name='tempest-ServersTestMultiNic-server-619605874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-619605874',id=89,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-icxi91ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:01Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=82428132-ddb9-49b8-83a5-603ff90fb7a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.726 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.727 221324 DEBUG nova.network.os_vif_util [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:36:06,bridge_name='br-int',has_traffic_filtering=True,id=53194663-ae7f-4f54-9034-942c5881d66b,network=Network(5c78a769-e218-4034-aa19-d1bf989f0728),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53194663-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.728 221324 DEBUG os_vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:36:06,bridge_name='br-int',has_traffic_filtering=True,id=53194663-ae7f-4f54-9034-942c5881d66b,network=Network(5c78a769-e218-4034-aa19-d1bf989f0728),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53194663-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.729 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.729 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.730 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.734 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53194663-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.734 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap53194663-ae, col_values=(('external_ids', {'iface-id': '53194663-ae7f-4f54-9034-942c5881d66b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:36:06', 'vm-uuid': '82428132-ddb9-49b8-83a5-603ff90fb7a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 NetworkManager[49080]: <info>  [1769847130.7374] manager: (tap53194663-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.739 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.744 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.745 221324 INFO os_vif [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:36:06,bridge_name='br-int',has_traffic_filtering=True,id=53194663-ae7f-4f54-9034-942c5881d66b,network=Network(5c78a769-e218-4034-aa19-d1bf989f0728),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53194663-ae')#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.823 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.824 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.824 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No VIF found with MAC fa:16:3e:b9:a5:d0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.824 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] No VIF found with MAC fa:16:3e:6b:36:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.825 221324 INFO nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Using config drive#033[00m
Jan 31 03:12:10 np0005603622 nova_compute[221301]: 2026-01-31 08:12:10.866 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.687 221324 INFO nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Creating config drive at /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/disk.config#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.694 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpm4kqev4n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.720 221324 DEBUG nova.network.neutron [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Updated VIF entry in instance network info cache for port 09451288-598f-46f6-be9e-b950f73f24f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.722 221324 DEBUG nova.network.neutron [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Updating instance_info_cache with network_info: [{"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:11.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.756 221324 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.757 221324 DEBUG nova.compute.manager [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-changed-53194663-ae7f-4f54-9034-942c5881d66b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.757 221324 DEBUG nova.compute.manager [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Refreshing instance network info cache due to event network-changed-53194663-ae7f-4f54-9034-942c5881d66b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.758 221324 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.758 221324 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.759 221324 DEBUG nova.network.neutron [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Refreshing network info cache for port 53194663-ae7f-4f54-9034-942c5881d66b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.828 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpm4kqev4n" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.859 221324 DEBUG nova.storage.rbd_utils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] rbd image 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:11 np0005603622 nova_compute[221301]: 2026-01-31 08:12:11.864 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/disk.config 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:11.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.030 221324 DEBUG oslo_concurrency.processutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/disk.config 82428132-ddb9-49b8-83a5-603ff90fb7a3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.032 221324 INFO nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Deleting local config drive /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3/disk.config because it was imported into RBD.#033[00m
Jan 31 03:12:12 np0005603622 kernel: tap09451288-59: entered promiscuous mode
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.0952] manager: (tap09451288-59): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.099 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00306|binding|INFO|Claiming lport 09451288-598f-46f6-be9e-b950f73f24f7 for this chassis.
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00307|binding|INFO|09451288-598f-46f6-be9e-b950f73f24f7: Claiming fa:16:3e:b9:a5:d0 10.100.0.139
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.1121] manager: (tap53194663-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/153)
Jan 31 03:12:12 np0005603622 kernel: tap53194663-ae: entered promiscuous mode
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.134 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 systemd-udevd[252717]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00308|if_status|INFO|Not updating pb chassis for 53194663-ae7f-4f54-9034-942c5881d66b now as sb is readonly
Jan 31 03:12:12 np0005603622 systemd-udevd[252716]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.143 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:a5:d0 10.100.0.139'], port_security=['fa:16:3e:b9:a5:d0 10.100.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.139/24', 'neutron:device_id': '82428132-ddb9-49b8-83a5-603ff90fb7a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a49c35a-d05b-4ae0-9679-52bef61cd9dc, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=09451288-598f-46f6-be9e-b950f73f24f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.145 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 09451288-598f-46f6-be9e-b950f73f24f7 in datapath fd3db6e1-c411-452c-973b-3f63c98ddf49 bound to our chassis#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.149 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd3db6e1-c411-452c-973b-3f63c98ddf49#033[00m
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00309|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00310|binding|INFO|Claiming lport 53194663-ae7f-4f54-9034-942c5881d66b for this chassis.
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00311|binding|INFO|53194663-ae7f-4f54-9034-942c5881d66b: Claiming fa:16:3e:6b:36:06 10.100.1.167
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.1575] device (tap09451288-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.1584] device (tap09451288-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.1658] device (tap53194663-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.1668] device (tap53194663-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.168 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[420c0517-9dfa-4914-b286-88adf7dd789d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.169 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd3db6e1-c1 in ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00312|binding|INFO|Setting lport 09451288-598f-46f6-be9e-b950f73f24f7 ovn-installed in OVS
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.173 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd3db6e1-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.173 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[beb564f2-8fa3-4adc-9418-7a8effb2d636]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.175 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[670f7261-ece7-43ab-97db-4e357f37d9da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 systemd-machined[190406]: New machine qemu-37-instance-00000059.
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.188 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[6a24ca08-5cca-4ff3-9a16-b23054e8c88e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00313|binding|INFO|Setting lport 53194663-ae7f-4f54-9034-942c5881d66b ovn-installed in OVS
Jan 31 03:12:12 np0005603622 systemd[1]: Started Virtual Machine qemu-37-instance-00000059.
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.195 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00314|binding|INFO|Setting lport 53194663-ae7f-4f54-9034-942c5881d66b up in Southbound
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00315|binding|INFO|Setting lport 09451288-598f-46f6-be9e-b950f73f24f7 up in Southbound
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.200 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:36:06 10.100.1.167'], port_security=['fa:16:3e:6b:36:06 10.100.1.167'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.167/24', 'neutron:device_id': '82428132-ddb9-49b8-83a5-603ff90fb7a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c78a769-e218-4034-aa19-d1bf989f0728', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bd448df-2ee0-43cd-ae7b-ed825cd5f61a, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=53194663-ae7f-4f54-9034-942c5881d66b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.203 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a7f501-d458-4f7a-a258-1183c241731f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.235 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[cdbefe6b-bd22-45c1-826d-454cf0b9864b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 systemd-udevd[252723]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.243 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9501dca2-dac0-4e13-b847-eef9f4ec9147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.2443] manager: (tapfd3db6e1-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/154)
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.270 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e5caa0-3252-4c87-906f-a818f0017eca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.274 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[990cc578-93aa-4dae-b5da-4ba686bc25bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.2946] device (tapfd3db6e1-c0): carrier: link connected
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.300 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[65f19aa9-9f28-4f38-a0b4-64a08d6e8001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.316 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf60150-cb67-4620-8da4-6575892e5543]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd3db6e1-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:a8:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638363, 'reachable_time': 29671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252751, 'error': None, 'target': 'ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.335 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec6de5a-2d2d-46af-97c8-963b31322bbe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:a8b6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 638363, 'tstamp': 638363}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252753, 'error': None, 'target': 'ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.353 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[76aa0d27-e367-4531-9ce5-e01cbc40b9ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd3db6e1-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:a8:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638363, 'reachable_time': 29671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252754, 'error': None, 'target': 'ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.386 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9c162365-c56c-4535-b28a-5a58e9dcb62f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.448 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c4194fad-c8e1-4090-b1fc-3dd81d80a5ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.450 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd3db6e1-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.451 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.452 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd3db6e1-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 NetworkManager[49080]: <info>  [1769847132.4566] manager: (tapfd3db6e1-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Jan 31 03:12:12 np0005603622 kernel: tapfd3db6e1-c0: entered promiscuous mode
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.464 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd3db6e1-c0, col_values=(('external_ids', {'iface-id': '8d98e755-d698-4951-b573-5dcccd0d4588'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:12Z|00316|binding|INFO|Releasing lport 8d98e755-d698-4951-b573-5dcccd0d4588 from this chassis (sb_readonly=0)
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.469 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd3db6e1-c411-452c-973b-3f63c98ddf49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd3db6e1-c411-452c-973b-3f63c98ddf49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.470 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb575cac-f060-4603-b6d3-47e0fda9a9c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.472 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-fd3db6e1-c411-452c-973b-3f63c98ddf49
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/fd3db6e1-c411-452c-973b-3f63c98ddf49.pid.haproxy
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID fd3db6e1-c411-452c-973b-3f63c98ddf49
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:12:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:12.473 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'env', 'PROCESS_TAG=haproxy-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd3db6e1-c411-452c-973b-3f63c98ddf49.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 nova_compute[221301]: 2026-01-31 08:12:12.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:12 np0005603622 podman[252786]: 2026-01-31 08:12:12.894318652 +0000 UTC m=+0.078726118 container create 9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:12:12 np0005603622 systemd[1]: Started libpod-conmon-9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab.scope.
Jan 31 03:12:12 np0005603622 podman[252786]: 2026-01-31 08:12:12.856919786 +0000 UTC m=+0.041327302 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:12:12 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:12:12 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5cdb17d7c4725f8f7a6ef75b0a90b3df4465102c7c0e0c5bb80bbe8d5f0a011/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:12:13 np0005603622 podman[252786]: 2026-01-31 08:12:13.002225063 +0000 UTC m=+0.186632539 container init 9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:12:13 np0005603622 podman[252786]: 2026-01-31 08:12:13.007711451 +0000 UTC m=+0.192118917 container start 9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:12:13 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [NOTICE]   (252805) : New worker (252807) forked
Jan 31 03:12:13 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [NOTICE]   (252805) : Loading success.
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.085 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 53194663-ae7f-4f54-9034-942c5881d66b in datapath 5c78a769-e218-4034-aa19-d1bf989f0728 unbound from our chassis#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.088 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c78a769-e218-4034-aa19-d1bf989f0728#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.100 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb58a440-03e6-4f82-a140-1792e0498662]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.101 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5c78a769-e1 in ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.104 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5c78a769-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.104 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f24723f6-141b-4aa5-a8f3-d621d1c11d91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.106 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[63c11a14-71a7-4d63-b004-aac560829cdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.118 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb2f0b9-db25-4559-8267-2387ba1fad26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.132 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f470082f-b49b-4f84-83dc-e2de42dd285c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.159 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[30962d72-6f07-4ce1-907c-d73b006a28c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 systemd-udevd[252737]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:12:13 np0005603622 NetworkManager[49080]: <info>  [1769847133.1706] manager: (tap5c78a769-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/156)
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.167 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f31e53f9-49cb-4b44-b1d6-4da01a1a2767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.211 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[53d9bf4d-8fe5-4338-8763-6a0d2ab44a97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.216 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6c184383-a2d7-4d77-af9e-a7b18024ca6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 NetworkManager[49080]: <info>  [1769847133.2353] device (tap5c78a769-e0): carrier: link connected
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.241 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff88129-c0c4-4d1c-8a3a-5873eed953fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.266 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c568de0-fec2-42bb-b637-72143b7dacea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c78a769-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:85:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638457, 'reachable_time': 36382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252844, 'error': None, 'target': 'ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.283 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a55efb-7a26-4935-aac5-903967f06f69]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:85f3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 638457, 'tstamp': 638457}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252860, 'error': None, 'target': 'ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.302 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4736cb35-5cc3-4263-9bf9-aa9be89a8a8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c78a769-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:85:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638457, 'reachable_time': 36382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252864, 'error': None, 'target': 'ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.333 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[49ee2e10-a550-4216-9dc6-46957b75d687]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.394 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[299cd367-27d7-4c7e-be4d-ead77c139b91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.396 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c78a769-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.397 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.397 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c78a769-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.399 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:13 np0005603622 NetworkManager[49080]: <info>  [1769847133.4004] manager: (tap5c78a769-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 31 03:12:13 np0005603622 kernel: tap5c78a769-e0: entered promiscuous mode
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.402 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.403 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c78a769-e0, col_values=(('external_ids', {'iface-id': 'fd55ea8f-feba-4022-8669-53d1aa387663'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:13Z|00317|binding|INFO|Releasing lport fd55ea8f-feba-4022-8669-53d1aa387663 from this chassis (sb_readonly=0)
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.405 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.406 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847133.4057865, 82428132-ddb9-49b8-83a5-603ff90fb7a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.406 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] VM Started (Lifecycle Event)#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.408 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5c78a769-e218-4034-aa19-d1bf989f0728.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5c78a769-e218-4034-aa19-d1bf989f0728.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.408 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.409 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[11d844d4-d6b1-4058-9104-351a1ac9d894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.410 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-5c78a769-e218-4034-aa19-d1bf989f0728
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/5c78a769-e218-4034-aa19-d1bf989f0728.pid.haproxy
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 5c78a769-e218-4034-aa19-d1bf989f0728
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:12:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:13.411 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728', 'env', 'PROCESS_TAG=haproxy-5c78a769-e218-4034-aa19-d1bf989f0728', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5c78a769-e218-4034-aa19-d1bf989f0728.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.417 221324 DEBUG nova.compute.manager [req-ac65465f-d753-4313-bf69-baf43864b2b5 req-8e02b31f-9c96-48ad-91b2-9084edcca319 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.418 221324 DEBUG oslo_concurrency.lockutils [req-ac65465f-d753-4313-bf69-baf43864b2b5 req-8e02b31f-9c96-48ad-91b2-9084edcca319 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.418 221324 DEBUG oslo_concurrency.lockutils [req-ac65465f-d753-4313-bf69-baf43864b2b5 req-8e02b31f-9c96-48ad-91b2-9084edcca319 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.418 221324 DEBUG oslo_concurrency.lockutils [req-ac65465f-d753-4313-bf69-baf43864b2b5 req-8e02b31f-9c96-48ad-91b2-9084edcca319 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.419 221324 DEBUG nova.compute.manager [req-ac65465f-d753-4313-bf69-baf43864b2b5 req-8e02b31f-9c96-48ad-91b2-9084edcca319 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Processing event network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.420 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.658 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.664 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847133.4058788, 82428132-ddb9-49b8-83a5-603ff90fb7a3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.665 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.692 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.697 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:12:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:13.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:12:13 np0005603622 nova_compute[221301]: 2026-01-31 08:12:13.733 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:12:13 np0005603622 podman[252903]: 2026-01-31 08:12:13.791807108 +0000 UTC m=+0.045074323 container create f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:12:13 np0005603622 systemd[1]: Started libpod-conmon-f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519.scope.
Jan 31 03:12:13 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:12:13 np0005603622 podman[252903]: 2026-01-31 08:12:13.770029722 +0000 UTC m=+0.023296907 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:12:13 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9879150543edecf75dede2c22d2ecf81d24da60bf35b9c24c7f07547c28993aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:12:13 np0005603622 podman[252903]: 2026-01-31 08:12:13.881347326 +0000 UTC m=+0.134614521 container init f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:12:13 np0005603622 podman[252903]: 2026-01-31 08:12:13.888510169 +0000 UTC m=+0.141777344 container start f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Jan 31 03:12:13 np0005603622 neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728[252918]: [NOTICE]   (252922) : New worker (252924) forked
Jan 31 03:12:13 np0005603622 neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728[252918]: [NOTICE]   (252922) : Loading success.
Jan 31 03:12:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:13.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:14 np0005603622 nova_compute[221301]: 2026-01-31 08:12:14.907 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.023 221324 DEBUG nova.compute.manager [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.024 221324 DEBUG oslo_concurrency.lockutils [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.024 221324 DEBUG oslo_concurrency.lockutils [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.024 221324 DEBUG oslo_concurrency.lockutils [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.024 221324 DEBUG nova.compute.manager [req-9bf6901b-1393-43de-9bdd-0661b4b63404 req-397e2b92-50e6-4936-a777-850e3052238c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Processing event network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.025 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.029 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847135.0296903, 82428132-ddb9-49b8-83a5-603ff90fb7a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.030 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.034 221324 DEBUG nova.network.neutron [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Updated VIF entry in instance network info cache for port 53194663-ae7f-4f54-9034-942c5881d66b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.035 221324 DEBUG nova.network.neutron [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Updating instance_info_cache with network_info: [{"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.037 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.041 221324 INFO nova.virt.libvirt.driver [-] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Instance spawned successfully.#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.041 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.073 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.082 221324 DEBUG oslo_concurrency.lockutils [req-63c56a2d-00e5-46b5-8604-1933c37d6f9e req-f15b321f-89ba-49ed-9cf1-4c550a80f3cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-82428132-ddb9-49b8-83a5-603ff90fb7a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.097 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.105 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.106 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.107 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.108 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.109 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.110 221324 DEBUG nova.virt.libvirt.driver [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.125 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.179 221324 INFO nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Took 13.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.179 221324 DEBUG nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.260 221324 INFO nova.compute.manager [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Took 14.90 seconds to build instance.#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.297 221324 DEBUG oslo_concurrency.lockutils [None req-647219a6-2f39-4650-aa6a-843ef43c35af 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.540 221324 DEBUG nova.compute.manager [req-9bf9636b-30d6-427d-a788-b970f265731e req-639610c0-3f5d-4689-8402-5d0d5fcf29fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.541 221324 DEBUG oslo_concurrency.lockutils [req-9bf9636b-30d6-427d-a788-b970f265731e req-639610c0-3f5d-4689-8402-5d0d5fcf29fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.541 221324 DEBUG oslo_concurrency.lockutils [req-9bf9636b-30d6-427d-a788-b970f265731e req-639610c0-3f5d-4689-8402-5d0d5fcf29fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.542 221324 DEBUG oslo_concurrency.lockutils [req-9bf9636b-30d6-427d-a788-b970f265731e req-639610c0-3f5d-4689-8402-5d0d5fcf29fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.542 221324 DEBUG nova.compute.manager [req-9bf9636b-30d6-427d-a788-b970f265731e req-639610c0-3f5d-4689-8402-5d0d5fcf29fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] No waiting events found dispatching network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.543 221324 WARNING nova.compute.manager [req-9bf9636b-30d6-427d-a788-b970f265731e req-639610c0-3f5d-4689-8402-5d0d5fcf29fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received unexpected event network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:12:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:15.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:15 np0005603622 nova_compute[221301]: 2026-01-31 08:12:15.737 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:15.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.084 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.084 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.085 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.085 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.085 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.086 221324 INFO nova.compute.manager [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Terminating instance#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.088 221324 DEBUG nova.compute.manager [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:12:17 np0005603622 kernel: tap09451288-59 (unregistering): left promiscuous mode
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.136 221324 DEBUG nova.compute.manager [req-62518d10-2ccc-42ea-a366-d4c4bf1d3ff6 req-77911061-461c-45d0-a142-15f883f2200b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.136 221324 DEBUG oslo_concurrency.lockutils [req-62518d10-2ccc-42ea-a366-d4c4bf1d3ff6 req-77911061-461c-45d0-a142-15f883f2200b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.136 221324 DEBUG oslo_concurrency.lockutils [req-62518d10-2ccc-42ea-a366-d4c4bf1d3ff6 req-77911061-461c-45d0-a142-15f883f2200b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.137 221324 DEBUG oslo_concurrency.lockutils [req-62518d10-2ccc-42ea-a366-d4c4bf1d3ff6 req-77911061-461c-45d0-a142-15f883f2200b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.137 221324 DEBUG nova.compute.manager [req-62518d10-2ccc-42ea-a366-d4c4bf1d3ff6 req-77911061-461c-45d0-a142-15f883f2200b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] No waiting events found dispatching network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.137 221324 WARNING nova.compute.manager [req-62518d10-2ccc-42ea-a366-d4c4bf1d3ff6 req-77911061-461c-45d0-a142-15f883f2200b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received unexpected event network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:12:17 np0005603622 NetworkManager[49080]: <info>  [1769847137.1379] device (tap09451288-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.149 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00318|binding|INFO|Releasing lport 09451288-598f-46f6-be9e-b950f73f24f7 from this chassis (sb_readonly=0)
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00319|binding|INFO|Setting lport 09451288-598f-46f6-be9e-b950f73f24f7 down in Southbound
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00320|binding|INFO|Removing iface tap09451288-59 ovn-installed in OVS
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.159 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:a5:d0 10.100.0.139'], port_security=['fa:16:3e:b9:a5:d0 10.100.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.139/24', 'neutron:device_id': '82428132-ddb9-49b8-83a5-603ff90fb7a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a49c35a-d05b-4ae0-9679-52bef61cd9dc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=09451288-598f-46f6-be9e-b950f73f24f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:17 np0005603622 kernel: tap53194663-ae (unregistering): left promiscuous mode
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.162 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 09451288-598f-46f6-be9e-b950f73f24f7 in datapath fd3db6e1-c411-452c-973b-3f63c98ddf49 unbound from our chassis#033[00m
Jan 31 03:12:17 np0005603622 NetworkManager[49080]: <info>  [1769847137.1640] device (tap53194663-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.165 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd3db6e1-c411-452c-973b-3f63c98ddf49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.167 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b521f64e-a9d3-46cb-8be0-180483b9a5e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.169 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49 namespace which is not needed anymore#033[00m
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00321|binding|INFO|Releasing lport 53194663-ae7f-4f54-9034-942c5881d66b from this chassis (sb_readonly=0)
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.172 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00322|binding|INFO|Setting lport 53194663-ae7f-4f54-9034-942c5881d66b down in Southbound
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00323|binding|INFO|Removing iface tap53194663-ae ovn-installed in OVS
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.175 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.183 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.189 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:36:06 10.100.1.167'], port_security=['fa:16:3e:6b:36:06 10.100.1.167'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.167/24', 'neutron:device_id': '82428132-ddb9-49b8-83a5-603ff90fb7a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c78a769-e218-4034-aa19-d1bf989f0728', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bd448df-2ee0-43cd-ae7b-ed825cd5f61a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=53194663-ae7f-4f54-9034-942c5881d66b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:17 np0005603622 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000059.scope: Deactivated successfully.
Jan 31 03:12:17 np0005603622 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000059.scope: Consumed 3.323s CPU time.
Jan 31 03:12:17 np0005603622 systemd-machined[190406]: Machine qemu-37-instance-00000059 terminated.
Jan 31 03:12:17 np0005603622 kernel: tap09451288-59: entered promiscuous mode
Jan 31 03:12:17 np0005603622 kernel: tap09451288-59 (unregistering): left promiscuous mode
Jan 31 03:12:17 np0005603622 NetworkManager[49080]: <info>  [1769847137.3105] manager: (tap09451288-59): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00324|binding|INFO|Claiming lport 09451288-598f-46f6-be9e-b950f73f24f7 for this chassis.
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00325|binding|INFO|09451288-598f-46f6-be9e-b950f73f24f7: Claiming fa:16:3e:b9:a5:d0 10.100.0.139
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.314 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 NetworkManager[49080]: <info>  [1769847137.3198] manager: (tap53194663-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.330 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:a5:d0 10.100.0.139'], port_security=['fa:16:3e:b9:a5:d0 10.100.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.139/24', 'neutron:device_id': '82428132-ddb9-49b8-83a5-603ff90fb7a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a49c35a-d05b-4ae0-9679-52bef61cd9dc, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=09451288-598f-46f6-be9e-b950f73f24f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [NOTICE]   (252805) : haproxy version is 2.8.14-c23fe91
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [NOTICE]   (252805) : path to executable is /usr/sbin/haproxy
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [WARNING]  (252805) : Exiting Master process...
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [WARNING]  (252805) : Exiting Master process...
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [ALERT]    (252805) : Current worker (252807) exited with code 143 (Terminated)
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49[252801]: [WARNING]  (252805) : All workers exited. Exiting... (0)
Jan 31 03:12:17 np0005603622 systemd[1]: libpod-9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab.scope: Deactivated successfully.
Jan 31 03:12:17 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:17Z|00326|binding|INFO|Releasing lport 09451288-598f-46f6-be9e-b950f73f24f7 from this chassis (sb_readonly=0)
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.340 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 podman[252958]: 2026-01-31 08:12:17.345286554 +0000 UTC m=+0.073515138 container died 9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.348 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:a5:d0 10.100.0.139'], port_security=['fa:16:3e:b9:a5:d0 10.100.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.139/24', 'neutron:device_id': '82428132-ddb9-49b8-83a5-603ff90fb7a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88b896f61c644b6fac0351ce6828b6e1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52fcb5e9-44f4-488b-bcc4-743586d363ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a49c35a-d05b-4ae0-9679-52bef61cd9dc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=09451288-598f-46f6-be9e-b950f73f24f7) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.350 221324 INFO nova.virt.libvirt.driver [-] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Instance destroyed successfully.#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.351 221324 DEBUG nova.objects.instance [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lazy-loading 'resources' on Instance uuid 82428132-ddb9-49b8-83a5-603ff90fb7a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.367 221324 DEBUG nova.virt.libvirt.vif [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:11:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-619605874',display_name='tempest-ServersTestMultiNic-server-619605874',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-619605874',id=89,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:12:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-icxi91ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:12:15Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=82428132-ddb9-49b8-83a5-603ff90fb7a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.368 221324 DEBUG nova.network.os_vif_util [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.369 221324 DEBUG nova.network.os_vif_util [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:a5:d0,bridge_name='br-int',has_traffic_filtering=True,id=09451288-598f-46f6-be9e-b950f73f24f7,network=Network(fd3db6e1-c411-452c-973b-3f63c98ddf49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09451288-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.370 221324 DEBUG os_vif [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:a5:d0,bridge_name='br-int',has_traffic_filtering=True,id=09451288-598f-46f6-be9e-b950f73f24f7,network=Network(fd3db6e1-c411-452c-973b-3f63c98ddf49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09451288-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.373 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.373 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09451288-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.416 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.419 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.425 221324 INFO os_vif [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:a5:d0,bridge_name='br-int',has_traffic_filtering=True,id=09451288-598f-46f6-be9e-b950f73f24f7,network=Network(fd3db6e1-c411-452c-973b-3f63c98ddf49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09451288-59')#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.427 221324 DEBUG nova.virt.libvirt.vif [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:11:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-619605874',display_name='tempest-ServersTestMultiNic-server-619605874',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-619605874',id=89,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:12:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88b896f61c644b6fac0351ce6828b6e1',ramdisk_id='',reservation_id='r-icxi91ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1053198737',owner_user_name='tempest-ServersTestMultiNic-1053198737-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:12:15Z,user_data=None,user_id='55f81600a60b49aaae5b4c28549afdaf',uuid=82428132-ddb9-49b8-83a5-603ff90fb7a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.427 221324 DEBUG nova.network.os_vif_util [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converting VIF {"id": "53194663-ae7f-4f54-9034-942c5881d66b", "address": "fa:16:3e:6b:36:06", "network": {"id": "5c78a769-e218-4034-aa19-d1bf989f0728", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2081669385", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53194663-ae", "ovs_interfaceid": "53194663-ae7f-4f54-9034-942c5881d66b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.428 221324 DEBUG nova.network.os_vif_util [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:36:06,bridge_name='br-int',has_traffic_filtering=True,id=53194663-ae7f-4f54-9034-942c5881d66b,network=Network(5c78a769-e218-4034-aa19-d1bf989f0728),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53194663-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.428 221324 DEBUG os_vif [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:36:06,bridge_name='br-int',has_traffic_filtering=True,id=53194663-ae7f-4f54-9034-942c5881d66b,network=Network(5c78a769-e218-4034-aa19-d1bf989f0728),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53194663-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:12:17 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab-userdata-shm.mount: Deactivated successfully.
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.430 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53194663-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 systemd[1]: var-lib-containers-storage-overlay-e5cdb17d7c4725f8f7a6ef75b0a90b3df4465102c7c0e0c5bb80bbe8d5f0a011-merged.mount: Deactivated successfully.
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.434 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.439 221324 INFO os_vif [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:36:06,bridge_name='br-int',has_traffic_filtering=True,id=53194663-ae7f-4f54-9034-942c5881d66b,network=Network(5c78a769-e218-4034-aa19-d1bf989f0728),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53194663-ae')#033[00m
Jan 31 03:12:17 np0005603622 podman[252958]: 2026-01-31 08:12:17.446838755 +0000 UTC m=+0.175067329 container cleanup 9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:12:17 np0005603622 systemd[1]: libpod-conmon-9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab.scope: Deactivated successfully.
Jan 31 03:12:17 np0005603622 podman[253012]: 2026-01-31 08:12:17.532564831 +0000 UTC m=+0.057977820 container remove 9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.537 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[36b56ee7-6588-4335-badf-c269b39af192]: (4, ('Sat Jan 31 08:12:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49 (9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab)\n9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab\nSat Jan 31 08:12:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49 (9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab)\n9e464b88555396cf10560b9aeb67c2335a129bcbb41e0724676d1ce0d3764bab\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.540 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd3bd0e-d5b9-4fa2-b455-ffd0e45c7b06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.541 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd3db6e1-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.543 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 kernel: tapfd3db6e1-c0: left promiscuous mode
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.547 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.550 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.551 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[135e6d19-2a2c-4310-9291-e0f5275cfd3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.569 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b95013e7-b072-4da1-85c5-a552c453aba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.572 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[30c0129f-db0a-4d16-9966-59000250d5a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.591 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6f445ba7-613d-46b1-bcdf-c6f2a14df984]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638356, 'reachable_time': 27263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253036, 'error': None, 'target': 'ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 systemd[1]: run-netns-ovnmeta\x2dfd3db6e1\x2dc411\x2d452c\x2d973b\x2d3f63c98ddf49.mount: Deactivated successfully.
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.597 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd3db6e1-c411-452c-973b-3f63c98ddf49 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.598 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[355884b6-a89e-4e66-9ba6-720bae697c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.600 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 53194663-ae7f-4f54-9034-942c5881d66b in datapath 5c78a769-e218-4034-aa19-d1bf989f0728 unbound from our chassis#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.604 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c78a769-e218-4034-aa19-d1bf989f0728, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.605 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[19a4a81e-97e0-4665-8006-f925a1e6718f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.606 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728 namespace which is not needed anymore#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.643 221324 DEBUG nova.compute.manager [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-unplugged-09451288-598f-46f6-be9e-b950f73f24f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.645 221324 DEBUG oslo_concurrency.lockutils [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.645 221324 DEBUG oslo_concurrency.lockutils [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.645 221324 DEBUG oslo_concurrency.lockutils [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.646 221324 DEBUG nova.compute.manager [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] No waiting events found dispatching network-vif-unplugged-09451288-598f-46f6-be9e-b950f73f24f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.646 221324 DEBUG nova.compute.manager [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-unplugged-09451288-598f-46f6-be9e-b950f73f24f7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.647 221324 DEBUG nova.compute.manager [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.647 221324 DEBUG oslo_concurrency.lockutils [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.647 221324 DEBUG oslo_concurrency.lockutils [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.648 221324 DEBUG oslo_concurrency.lockutils [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.648 221324 DEBUG nova.compute.manager [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] No waiting events found dispatching network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.649 221324 WARNING nova.compute.manager [req-fee1fa1b-bafb-4a4e-9924-55880db1729e req-08b9687b-bd8f-4b1f-8e73-c029addf3237 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received unexpected event network-vif-plugged-09451288-598f-46f6-be9e-b950f73f24f7 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:12:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:17.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728[252918]: [NOTICE]   (252922) : haproxy version is 2.8.14-c23fe91
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728[252918]: [NOTICE]   (252922) : path to executable is /usr/sbin/haproxy
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728[252918]: [WARNING]  (252922) : Exiting Master process...
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728[252918]: [ALERT]    (252922) : Current worker (252924) exited with code 143 (Terminated)
Jan 31 03:12:17 np0005603622 neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728[252918]: [WARNING]  (252922) : All workers exited. Exiting... (0)
Jan 31 03:12:17 np0005603622 systemd[1]: libpod-f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519.scope: Deactivated successfully.
Jan 31 03:12:17 np0005603622 podman[253055]: 2026-01-31 08:12:17.780367785 +0000 UTC m=+0.068832422 container died f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:12:17 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519-userdata-shm.mount: Deactivated successfully.
Jan 31 03:12:17 np0005603622 systemd[1]: var-lib-containers-storage-overlay-9879150543edecf75dede2c22d2ecf81d24da60bf35b9c24c7f07547c28993aa-merged.mount: Deactivated successfully.
Jan 31 03:12:17 np0005603622 podman[253055]: 2026-01-31 08:12:17.820210007 +0000 UTC m=+0.108674684 container cleanup f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:12:17 np0005603622 systemd[1]: libpod-conmon-f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519.scope: Deactivated successfully.
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.885 221324 INFO nova.virt.libvirt.driver [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Deleting instance files /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3_del#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.886 221324 INFO nova.virt.libvirt.driver [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Deletion of /var/lib/nova/instances/82428132-ddb9-49b8-83a5-603ff90fb7a3_del complete#033[00m
Jan 31 03:12:17 np0005603622 podman[253084]: 2026-01-31 08:12:17.899017256 +0000 UTC m=+0.052186524 container remove f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.906 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf9baf9-4bbc-4b65-a284-93842f461079]: (4, ('Sat Jan 31 08:12:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728 (f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519)\nf118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519\nSat Jan 31 08:12:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728 (f118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519)\nf118fc6e5f756b450f0a12bea63fe12da773cbb92c1a5ee1b62914359768b519\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.908 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[03f879f2-709c-46d6-b379-869c5cfc9b9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.910 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c78a769-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.911 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 kernel: tap5c78a769-e0: left promiscuous mode
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.922 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.926 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5c9e1ff0-d538-4b5f-b8c3-c8932c7f2b7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.940 221324 INFO nova.compute.manager [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Took 0.85 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.941 221324 DEBUG oslo.service.loopingcall [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.942 221324 DEBUG nova.compute.manager [-] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:12:17 np0005603622 nova_compute[221301]: 2026-01-31 08:12:17.942 221324 DEBUG nova.network.neutron [-] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.943 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2388fdb3-17a9-4ad5-bcb3-c09a9db68b91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.945 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bec98f5a-dea9-4671-881e-cfc25cc5acaf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.962 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0adf0b4e-3a83-4444-8faf-ff1f4ed68c68]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638449, 'reachable_time': 24901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253099, 'error': None, 'target': 'ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.964 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5c78a769-e218-4034-aa19-d1bf989f0728 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.964 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[fb504d79-279c-43ce-bd3f-6f459fd1fecd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.965 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 09451288-598f-46f6-be9e-b950f73f24f7 in datapath fd3db6e1-c411-452c-973b-3f63c98ddf49 unbound from our chassis#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.966 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd3db6e1-c411-452c-973b-3f63c98ddf49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.967 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fc464086-8014-41dc-8558-ab4b9643097e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.967 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 09451288-598f-46f6-be9e-b950f73f24f7 in datapath fd3db6e1-c411-452c-973b-3f63c98ddf49 unbound from our chassis#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.969 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd3db6e1-c411-452c-973b-3f63c98ddf49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:17.969 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f8187464-e748-4009-95fa-46984c215e95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:17.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:18 np0005603622 systemd[1]: run-netns-ovnmeta\x2d5c78a769\x2de218\x2d4034\x2daa19\x2dd1bf989f0728.mount: Deactivated successfully.
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.291 221324 DEBUG nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-unplugged-53194663-ae7f-4f54-9034-942c5881d66b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.291 221324 DEBUG oslo_concurrency.lockutils [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.292 221324 DEBUG oslo_concurrency.lockutils [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.292 221324 DEBUG oslo_concurrency.lockutils [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.292 221324 DEBUG nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] No waiting events found dispatching network-vif-unplugged-53194663-ae7f-4f54-9034-942c5881d66b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.292 221324 DEBUG nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-unplugged-53194663-ae7f-4f54-9034-942c5881d66b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.293 221324 DEBUG nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.293 221324 DEBUG oslo_concurrency.lockutils [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.293 221324 DEBUG oslo_concurrency.lockutils [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.293 221324 DEBUG oslo_concurrency.lockutils [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.293 221324 DEBUG nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] No waiting events found dispatching network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.294 221324 WARNING nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received unexpected event network-vif-plugged-53194663-ae7f-4f54-9034-942c5881d66b for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.294 221324 DEBUG nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-deleted-53194663-ae7f-4f54-9034-942c5881d66b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.294 221324 INFO nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Neutron deleted interface 53194663-ae7f-4f54-9034-942c5881d66b; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.294 221324 DEBUG nova.network.neutron [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Updating instance_info_cache with network_info: [{"id": "09451288-598f-46f6-be9e-b950f73f24f7", "address": "fa:16:3e:b9:a5:d0", "network": {"id": "fd3db6e1-c411-452c-973b-3f63c98ddf49", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1618755788", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88b896f61c644b6fac0351ce6828b6e1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09451288-59", "ovs_interfaceid": "09451288-598f-46f6-be9e-b950f73f24f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.519 221324 DEBUG nova.compute.manager [req-087c60f6-c9d5-47d5-9802-51f8352f1d34 req-ab0e68cb-0cee-440a-a9be-e48d056ae58a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Detach interface failed, port_id=53194663-ae7f-4f54-9034-942c5881d66b, reason: Instance 82428132-ddb9-49b8-83a5-603ff90fb7a3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.710 221324 DEBUG nova.network.neutron [-] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:19.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.764 221324 INFO nova.compute.manager [-] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Took 1.82 seconds to deallocate network for instance.#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.895 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.896 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:19 np0005603622 nova_compute[221301]: 2026-01-31 08:12:19.913 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:19.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:20 np0005603622 nova_compute[221301]: 2026-01-31 08:12:20.010 221324 DEBUG oslo_concurrency.processutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3602157875' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:20 np0005603622 nova_compute[221301]: 2026-01-31 08:12:20.479 221324 DEBUG oslo_concurrency.processutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:20 np0005603622 nova_compute[221301]: 2026-01-31 08:12:20.488 221324 DEBUG nova.compute.provider_tree [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:12:20 np0005603622 nova_compute[221301]: 2026-01-31 08:12:20.518 221324 DEBUG nova.scheduler.client.report [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:12:20 np0005603622 nova_compute[221301]: 2026-01-31 08:12:20.551 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:20 np0005603622 nova_compute[221301]: 2026-01-31 08:12:20.585 221324 INFO nova.scheduler.client.report [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Deleted allocations for instance 82428132-ddb9-49b8-83a5-603ff90fb7a3#033[00m
Jan 31 03:12:20 np0005603622 nova_compute[221301]: 2026-01-31 08:12:20.680 221324 DEBUG oslo_concurrency.lockutils [None req-3b72074a-548e-4091-b3f2-88b7fadb2619 55f81600a60b49aaae5b4c28549afdaf 88b896f61c644b6fac0351ce6828b6e1 - - default default] Lock "82428132-ddb9-49b8-83a5-603ff90fb7a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:21 np0005603622 nova_compute[221301]: 2026-01-31 08:12:21.401 221324 DEBUG nova.compute.manager [req-2d0679f9-1e10-4cce-be85-2cf6b39a6c19 req-5cef5032-fc46-4ce9-adaa-6c1c105e0914 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Received event network-vif-deleted-09451288-598f-46f6-be9e-b950f73f24f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:21.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:21.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:22 np0005603622 nova_compute[221301]: 2026-01-31 08:12:22.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:23.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:23 np0005603622 nova_compute[221301]: 2026-01-31 08:12:23.859 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:23 np0005603622 nova_compute[221301]: 2026-01-31 08:12:23.860 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:23 np0005603622 nova_compute[221301]: 2026-01-31 08:12:23.880 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:12:23 np0005603622 nova_compute[221301]: 2026-01-31 08:12:23.968 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:23 np0005603622 nova_compute[221301]: 2026-01-31 08:12:23.969 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:23 np0005603622 nova_compute[221301]: 2026-01-31 08:12:23.978 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:12:23 np0005603622 nova_compute[221301]: 2026-01-31 08:12:23.978 221324 INFO nova.compute.claims [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:12:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:23.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:24Z|00327|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.143 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1456240204' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.626 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.636 221324 DEBUG nova.compute.provider_tree [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.675 221324 DEBUG nova.scheduler.client.report [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.759 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.760 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.819 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.819 221324 DEBUG nova.network.neutron [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.849 221324 INFO nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.889 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:12:24 np0005603622 nova_compute[221301]: 2026-01-31 08:12:24.912 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.436 221324 DEBUG nova.policy [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '111fdaf79c084a91902fe37a7a502020', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58e900992be7400fb940ca20f13e12d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.636 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.640 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.642 221324 INFO nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Creating image(s)#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.695 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.735 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:25.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.784 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.791 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.869 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.870 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.871 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.872 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.913 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:25 np0005603622 nova_compute[221301]: 2026-01-31 08:12:25.919 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9e88446e-2147-4f66-9f77-23949a27f7e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:25.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.268 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9e88446e-2147-4f66-9f77-23949a27f7e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.349s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.366 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] resizing rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.410 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.510 221324 DEBUG nova.objects.instance [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e88446e-2147-4f66-9f77-23949a27f7e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.553 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.554 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Ensure instance console log exists: /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.555 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.555 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.556 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:26 np0005603622 nova_compute[221301]: 2026-01-31 08:12:26.906 221324 DEBUG nova.network.neutron [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Successfully created port: 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:12:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:12:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8356 writes, 42K keys, 8356 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s#012Cumulative WAL: 8356 writes, 8356 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1765 writes, 8360 keys, 1765 commit groups, 1.0 writes per commit group, ingest: 16.94 MB, 0.03 MB/s#012Interval WAL: 1765 writes, 1765 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     55.5      0.93              0.14        23    0.040       0      0       0.0       0.0#012  L6      1/0   11.45 MB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   3.9     93.8     78.2      2.57              0.57        22    0.117    121K    12K       0.0       0.0#012 Sum      1/0   11.45 MB   0.0      0.2     0.1      0.2       0.2      0.1       0.0   4.9     68.9     72.2      3.50              0.71        45    0.078    121K    12K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.1    147.6    150.6      0.40              0.18        10    0.040     34K   2597       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   0.0     93.8     78.2      2.57              0.57        22    0.117    121K    12K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     55.6      0.93              0.14        22    0.042       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.050, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.25 GB write, 0.08 MB/s write, 0.24 GB read, 0.08 MB/s read, 3.5 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 28.13 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000341 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1624,27.19 MB,8.94311%) FilterBlock(45,342.61 KB,0.110059%) IndexBlock(45,620.14 KB,0.199213%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:12:27 np0005603622 podman[253310]: 2026-01-31 08:12:27.197635229 +0000 UTC m=+0.123248046 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:12:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:27 np0005603622 nova_compute[221301]: 2026-01-31 08:12:27.521 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:27.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:27.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:28 np0005603622 nova_compute[221301]: 2026-01-31 08:12:28.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:28 np0005603622 nova_compute[221301]: 2026-01-31 08:12:28.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:12:28 np0005603622 nova_compute[221301]: 2026-01-31 08:12:28.607 221324 DEBUG nova.network.neutron [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Successfully updated port: 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:12:28 np0005603622 nova_compute[221301]: 2026-01-31 08:12:28.629 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:28 np0005603622 nova_compute[221301]: 2026-01-31 08:12:28.629 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:28 np0005603622 nova_compute[221301]: 2026-01-31 08:12:28.629 221324 DEBUG nova.network.neutron [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:28.791740) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148791889, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 865, "num_deletes": 250, "total_data_size": 1464368, "memory_usage": 1480576, "flush_reason": "Manual Compaction"}
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847148988915, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 643124, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41852, "largest_seqno": 42712, "table_properties": {"data_size": 639804, "index_size": 1100, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9368, "raw_average_key_size": 20, "raw_value_size": 632583, "raw_average_value_size": 1405, "num_data_blocks": 48, "num_entries": 450, "num_filter_entries": 450, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847094, "oldest_key_time": 1769847094, "file_creation_time": 1769847148, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 197253 microseconds, and 2898 cpu microseconds.
Jan 31 03:12:28 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:12:29 np0005603622 nova_compute[221301]: 2026-01-31 08:12:29.007 221324 DEBUG nova.compute.manager [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-changed-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:29 np0005603622 nova_compute[221301]: 2026-01-31 08:12:29.008 221324 DEBUG nova.compute.manager [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Refreshing instance network info cache due to event network-changed-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:12:29 np0005603622 nova_compute[221301]: 2026-01-31 08:12:29.008 221324 DEBUG oslo_concurrency.lockutils [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:28.989001) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 643124 bytes OK
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:28.989034) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.013197) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.013234) EVENT_LOG_v1 {"time_micros": 1769847149013222, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.013265) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1459947, prev total WAL file size 1459947, number of live WAL files 2.
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.014336) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323532' seq:72057594037927935, type:22 .. '6D6772737461740031353033' seq:0, type:0; will stop at (end)
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(628KB)], [78(11MB)]
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847149014494, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 12647245, "oldest_snapshot_seqno": -1}
Jan 31 03:12:29 np0005603622 nova_compute[221301]: 2026-01-31 08:12:29.132 221324 DEBUG nova.network.neutron [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6741 keys, 9123228 bytes, temperature: kUnknown
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847149236739, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 9123228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9079990, "index_size": 25270, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 173062, "raw_average_key_size": 25, "raw_value_size": 8961362, "raw_average_value_size": 1329, "num_data_blocks": 999, "num_entries": 6741, "num_filter_entries": 6741, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.237159) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 9123228 bytes
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.239890) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 56.9 rd, 41.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 11.4 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(33.9) write-amplify(14.2) OK, records in: 7233, records dropped: 492 output_compression: NoCompression
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.239927) EVENT_LOG_v1 {"time_micros": 1769847149239910, "job": 48, "event": "compaction_finished", "compaction_time_micros": 222413, "compaction_time_cpu_micros": 41472, "output_level": 6, "num_output_files": 1, "total_output_size": 9123228, "num_input_records": 7233, "num_output_records": 6741, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847149240328, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847149242651, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.014123) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.242843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.242854) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.242857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.242860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:29 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:12:29.242863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:12:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:29.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:29 np0005603622 nova_compute[221301]: 2026-01-31 08:12:29.915 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:29.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:30 np0005603622 podman[253337]: 2026-01-31 08:12:30.178917417 +0000 UTC m=+0.101127591 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.654 221324 DEBUG nova.network.neutron [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Updating instance_info_cache with network_info: [{"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.753 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.754 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Instance network_info: |[{"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.755 221324 DEBUG oslo_concurrency.lockutils [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.755 221324 DEBUG nova.network.neutron [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Refreshing network info cache for port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.760 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Start _get_guest_xml network_info=[{"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.767 221324 WARNING nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.775 221324 DEBUG nova.virt.libvirt.host [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.776 221324 DEBUG nova.virt.libvirt.host [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.787 221324 DEBUG nova.virt.libvirt.host [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.788 221324 DEBUG nova.virt.libvirt.host [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.790 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.790 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.791 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.792 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.792 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.793 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.793 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.794 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.794 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.795 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.795 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.796 221324 DEBUG nova.virt.hardware [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:12:30 np0005603622 nova_compute[221301]: 2026-01-31 08:12:30.801 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:30.865 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:30.866 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:30.867 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4279155572' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.285 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.326 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.331 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:31.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:12:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/396282033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:12:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:31Z|00328|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.809 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.812 221324 DEBUG nova.virt.libvirt.vif [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:12:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2042269499',display_name='tempest-ServerDiskConfigTestJSON-server-2042269499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2042269499',id=90,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-619dusqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:24Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=9e88446e-2147-4f66-9f77-23949a27f7e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.813 221324 DEBUG nova.network.os_vif_util [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.814 221324 DEBUG nova.network.os_vif_util [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.816 221324 DEBUG nova.objects.instance [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e88446e-2147-4f66-9f77-23949a27f7e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.835 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.852 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <uuid>9e88446e-2147-4f66-9f77-23949a27f7e6</uuid>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <name>instance-0000005a</name>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-2042269499</nova:name>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:12:30</nova:creationTime>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:user uuid="111fdaf79c084a91902fe37a7a502020">tempest-ServerDiskConfigTestJSON-855158150-project-member</nova:user>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:project uuid="58e900992be7400fb940ca20f13e12d1">tempest-ServerDiskConfigTestJSON-855158150</nova:project>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <nova:port uuid="4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <entry name="serial">9e88446e-2147-4f66-9f77-23949a27f7e6</entry>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <entry name="uuid">9e88446e-2147-4f66-9f77-23949a27f7e6</entry>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9e88446e-2147-4f66-9f77-23949a27f7e6_disk">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9e88446e-2147-4f66-9f77-23949a27f7e6_disk.config">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:39:73:b2"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <target dev="tap4d96f0c7-92"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/console.log" append="off"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:12:31 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:12:31 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:12:31 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:12:31 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.854 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Preparing to wait for external event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.855 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.856 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.856 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.858 221324 DEBUG nova.virt.libvirt.vif [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:12:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2042269499',display_name='tempest-ServerDiskConfigTestJSON-server-2042269499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2042269499',id=90,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-619dusqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:12:24Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=9e88446e-2147-4f66-9f77-23949a27f7e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.858 221324 DEBUG nova.network.os_vif_util [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.860 221324 DEBUG nova.network.os_vif_util [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.861 221324 DEBUG os_vif [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.862 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.863 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.870 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.870 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d96f0c7-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.871 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d96f0c7-92, col_values=(('external_ids', {'iface-id': '4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:73:b2', 'vm-uuid': '9e88446e-2147-4f66-9f77-23949a27f7e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.873 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:31 np0005603622 NetworkManager[49080]: <info>  [1769847151.8743] manager: (tap4d96f0c7-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.876 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:12:31 np0005603622 nova_compute[221301]: 2026-01-31 08:12:31.881 221324 INFO os_vif [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92')
Jan 31 03:12:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:32.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.163 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.164 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.164 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No VIF found with MAC fa:16:3e:39:73:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.165 221324 INFO nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Using config drive
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.194 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:12:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.333 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847137.3328652, 82428132-ddb9-49b8-83a5-603ff90fb7a3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.334 221324 INFO nova.compute.manager [-] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.373 221324 DEBUG nova.compute.manager [None req-dce4a2f9-83d8-4bab-98fb-1c449b0900be - - - - - -] [instance: 82428132-ddb9-49b8-83a5-603ff90fb7a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.854 221324 INFO nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Creating config drive at /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/disk.config#033[00m
Jan 31 03:12:32 np0005603622 nova_compute[221301]: 2026-01-31 08:12:32.860 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpa_yu7xvg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:12:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/94253250' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:12:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:12:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/94253250' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.001 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpa_yu7xvg" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.047 221324 DEBUG nova.storage.rbd_utils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 9e88446e-2147-4f66-9f77-23949a27f7e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.054 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/disk.config 9e88446e-2147-4f66-9f77-23949a27f7e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.202 221324 DEBUG nova.network.neutron [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Updated VIF entry in instance network info cache for port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.203 221324 DEBUG nova.network.neutron [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Updating instance_info_cache with network_info: [{"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.229 221324 DEBUG oslo_concurrency.lockutils [req-26d51330-3ae1-406d-bcbb-64ed690cf262 req-1bfb21ec-d4d6-4481-a2bf-6c579fe3a326 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.252 221324 DEBUG oslo_concurrency.processutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/disk.config 9e88446e-2147-4f66-9f77-23949a27f7e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.253 221324 INFO nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Deleting local config drive /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/disk.config because it was imported into RBD.#033[00m
Jan 31 03:12:33 np0005603622 kernel: tap4d96f0c7-92: entered promiscuous mode
Jan 31 03:12:33 np0005603622 NetworkManager[49080]: <info>  [1769847153.3175] manager: (tap4d96f0c7-92): new Tun device (/org/freedesktop/NetworkManager/Devices/161)
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.317 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:33Z|00329|binding|INFO|Claiming lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for this chassis.
Jan 31 03:12:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:33Z|00330|binding|INFO|4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb: Claiming fa:16:3e:39:73:b2 10.100.0.13
Jan 31 03:12:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:33Z|00331|binding|INFO|Setting lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb ovn-installed in OVS
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.326 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:33Z|00332|binding|INFO|Setting lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb up in Southbound
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.329 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:73:b2 10.100.0.13'], port_security=['fa:16:3e:39:73:b2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9e88446e-2147-4f66-9f77-23949a27f7e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.330 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 bound to our chassis#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.332 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f218695f-c744-4bd8-b2d8-122a920c7ca0#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.345 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[da24c0bc-81e1-496c-8c7b-a780c8a9cc95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.346 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf218695f-c1 in ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.348 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf218695f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.348 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d1ace9-e856-4601-8d8e-eaec47456da3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.349 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8f397e-e923-48fe-a0a5-583147e19ad3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 systemd-machined[190406]: New machine qemu-38-instance-0000005a.
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.364 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[acf6079e-66f6-4d45-941f-38436e065f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 systemd-udevd[253495]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.380 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd00a9a-479c-4b3e-9c86-c6152f32844d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 systemd[1]: Started Virtual Machine qemu-38-instance-0000005a.
Jan 31 03:12:33 np0005603622 NetworkManager[49080]: <info>  [1769847153.3856] device (tap4d96f0c7-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:12:33 np0005603622 NetworkManager[49080]: <info>  [1769847153.3867] device (tap4d96f0c7-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.410 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[72aea27f-213d-4a7b-a50d-964c7d7a94ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.418 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[858a5384-cef6-4048-96c1-92c6078970ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 NetworkManager[49080]: <info>  [1769847153.4195] manager: (tapf218695f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/162)
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.451 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b60e6c8b-0611-447a-a4db-f1ee5d03163a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.456 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1fa3b4-411b-4371-91ac-40c9af8a750b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 NetworkManager[49080]: <info>  [1769847153.4819] device (tapf218695f-c0): carrier: link connected
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.485 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[937642bb-704d-4cf5-a2be-891eb1293c4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.502 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7975a75c-4ca6-4bef-a450-92c6dd1bd2e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640482, 'reachable_time': 38240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253526, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.522 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[68fb0371-88f7-48f8-aea2-43917b4501a2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:830'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 640482, 'tstamp': 640482}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253527, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.539 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[30eed1e8-eb2c-4f1a-940d-99ae95a78da8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640482, 'reachable_time': 38240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253528, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.573 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ce2552fe-ef0c-4002-819f-e74fc1c2113a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.649 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[05268496-dffc-45e0-b617-204029174996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.651 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.651 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.651 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf218695f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:33 np0005603622 kernel: tapf218695f-c0: entered promiscuous mode
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.655 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:33 np0005603622 NetworkManager[49080]: <info>  [1769847153.6559] manager: (tapf218695f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.659 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf218695f-c0, col_values=(('external_ids', {'iface-id': 'd3a551a2-38e3-48d3-bdee-f2493a79eca0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:33Z|00333|binding|INFO|Releasing lport d3a551a2-38e3-48d3-bdee-f2493a79eca0 from this chassis (sb_readonly=0)
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.719 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.720 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.721 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2999ee07-1b81-4df3-bd86-d0b41ec504e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.722 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:12:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:33.723 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'env', 'PROCESS_TAG=haproxy-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f218695f-c744-4bd8-b2d8-122a920c7ca0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:12:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:33.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.902 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847153.9011977, 9e88446e-2147-4f66-9f77-23949a27f7e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.902 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] VM Started (Lifecycle Event)#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.933 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.939 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847153.9015026, 9e88446e-2147-4f66-9f77-23949a27f7e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:33 np0005603622 nova_compute[221301]: 2026-01-31 08:12:33.939 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:12:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:34.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:34 np0005603622 podman[253602]: 2026-01-31 08:12:34.095458006 +0000 UTC m=+0.024428348 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.216 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.223 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:34 np0005603622 podman[253602]: 2026-01-31 08:12:34.29787457 +0000 UTC m=+0.226844862 container create a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.302 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:12:34 np0005603622 systemd[1]: Started libpod-conmon-a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5.scope.
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.379 221324 DEBUG nova.compute.manager [req-5f8964dd-fd03-44a2-afa5-048c9105a7c8 req-f46a3d8a-78c1-4729-bc71-764e43456b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.389 221324 DEBUG oslo_concurrency.lockutils [req-5f8964dd-fd03-44a2-afa5-048c9105a7c8 req-f46a3d8a-78c1-4729-bc71-764e43456b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.390 221324 DEBUG oslo_concurrency.lockutils [req-5f8964dd-fd03-44a2-afa5-048c9105a7c8 req-f46a3d8a-78c1-4729-bc71-764e43456b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.390 221324 DEBUG oslo_concurrency.lockutils [req-5f8964dd-fd03-44a2-afa5-048c9105a7c8 req-f46a3d8a-78c1-4729-bc71-764e43456b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.391 221324 DEBUG nova.compute.manager [req-5f8964dd-fd03-44a2-afa5-048c9105a7c8 req-f46a3d8a-78c1-4729-bc71-764e43456b9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Processing event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.393 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:34 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.400 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847154.3999207, 9e88446e-2147-4f66-9f77-23949a27f7e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.401 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:12:34 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d78aaa1757cc14aedabb257e7cd8832bc1571f7189eaf6403e0707b3d628424/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.407 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.417 221324 INFO nova.virt.libvirt.driver [-] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Instance spawned successfully.#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.418 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.443 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.450 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.454 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.455 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.455 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.455 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.456 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.456 221324 DEBUG nova.virt.libvirt.driver [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.507 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:12:34 np0005603622 podman[253602]: 2026-01-31 08:12:34.5444328 +0000 UTC m=+0.473403142 container init a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:12:34 np0005603622 podman[253602]: 2026-01-31 08:12:34.54965534 +0000 UTC m=+0.478625632 container start a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:12:34 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[253617]: [NOTICE]   (253621) : New worker (253623) forked
Jan 31 03:12:34 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[253617]: [NOTICE]   (253621) : Loading success.
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.575 221324 INFO nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Took 8.94 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.576 221324 DEBUG nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.736 221324 INFO nova.compute.manager [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Took 10.80 seconds to build instance.#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.791 221324 DEBUG oslo_concurrency.lockutils [None req-5b67a155-a25a-4a61-aa7f-92a214157e84 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:34 np0005603622 nova_compute[221301]: 2026-01-31 08:12:34.964 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:35 np0005603622 nova_compute[221301]: 2026-01-31 08:12:35.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:35 np0005603622 nova_compute[221301]: 2026-01-31 08:12:35.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:12:35 np0005603622 nova_compute[221301]: 2026-01-31 08:12:35.426 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:12:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:12:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:35.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:12:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:36.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.451 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.452 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.452 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.453 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.453 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.795 221324 DEBUG nova.compute.manager [req-393d657f-6a20-4be1-b284-8b0fe68e6bb3 req-ea430835-98ad-4785-9a32-abfd071bb629 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.795 221324 DEBUG oslo_concurrency.lockutils [req-393d657f-6a20-4be1-b284-8b0fe68e6bb3 req-ea430835-98ad-4785-9a32-abfd071bb629 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.796 221324 DEBUG oslo_concurrency.lockutils [req-393d657f-6a20-4be1-b284-8b0fe68e6bb3 req-ea430835-98ad-4785-9a32-abfd071bb629 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.796 221324 DEBUG oslo_concurrency.lockutils [req-393d657f-6a20-4be1-b284-8b0fe68e6bb3 req-ea430835-98ad-4785-9a32-abfd071bb629 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.796 221324 DEBUG nova.compute.manager [req-393d657f-6a20-4be1-b284-8b0fe68e6bb3 req-ea430835-98ad-4785-9a32-abfd071bb629 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.797 221324 WARNING nova.compute.manager [req-393d657f-6a20-4be1-b284-8b0fe68e6bb3 req-ea430835-98ad-4785-9a32-abfd071bb629 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state None.#033[00m
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.875 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3168890202' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:36 np0005603622 nova_compute[221301]: 2026-01-31 08:12:36.954 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.110 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000005a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.111 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000005a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.115 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.115 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:12:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.285 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.286 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4268MB free_disk=20.92190170288086GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.286 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.287 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.373 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 183fba4a-610d-4b96-a12f-996f9f2dc30b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.374 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9e88446e-2147-4f66-9f77-23949a27f7e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.374 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.375 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.454 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:12:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:37.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:12:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:12:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4259134751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.920 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:37 np0005603622 nova_compute[221301]: 2026-01-31 08:12:37.929 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:12:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:38.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:38 np0005603622 nova_compute[221301]: 2026-01-31 08:12:38.113 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:12:38 np0005603622 nova_compute[221301]: 2026-01-31 08:12:38.333 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:12:38 np0005603622 nova_compute[221301]: 2026-01-31 08:12:38.334 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:39Z|00334|binding|INFO|Releasing lport d3a551a2-38e3-48d3-bdee-f2493a79eca0 from this chassis (sb_readonly=0)
Jan 31 03:12:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:39Z|00335|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:39 np0005603622 nova_compute[221301]: 2026-01-31 08:12:39.657 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:12:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:39.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:12:39 np0005603622 nova_compute[221301]: 2026-01-31 08:12:39.968 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:40.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:41 np0005603622 nova_compute[221301]: 2026-01-31 08:12:41.331 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:12:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:12:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:41.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:12:41 np0005603622 nova_compute[221301]: 2026-01-31 08:12:41.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:42.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:43.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:44.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:44 np0005603622 nova_compute[221301]: 2026-01-31 08:12:44.973 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:45.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:46.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:46 np0005603622 nova_compute[221301]: 2026-01-31 08:12:46.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:47Z|00336|binding|INFO|Releasing lport d3a551a2-38e3-48d3-bdee-f2493a79eca0 from this chassis (sb_readonly=0)
Jan 31 03:12:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:47Z|00337|binding|INFO|Releasing lport 5bb8c1b5-edce-4f6a-8164-58b7d89a3330 from this chassis (sb_readonly=0)
Jan 31 03:12:47 np0005603622 nova_compute[221301]: 2026-01-31 08:12:47.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:47.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:48.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:48 np0005603622 nova_compute[221301]: 2026-01-31 08:12:48.337 221324 DEBUG oslo_concurrency.lockutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:12:48 np0005603622 nova_compute[221301]: 2026-01-31 08:12:48.338 221324 DEBUG oslo_concurrency.lockutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:12:48 np0005603622 nova_compute[221301]: 2026-01-31 08:12:48.338 221324 DEBUG nova.network.neutron [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:12:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:48Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:73:b2 10.100.0.13
Jan 31 03:12:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:48Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:73:b2 10.100.0.13
Jan 31 03:12:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:49.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:49 np0005603622 nova_compute[221301]: 2026-01-31 08:12:49.997 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:50.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:51.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:51 np0005603622 nova_compute[221301]: 2026-01-31 08:12:51.889 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:52.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:53.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:54.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:54 np0005603622 nova_compute[221301]: 2026-01-31 08:12:54.701 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:54.701 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:54.704 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:12:54 np0005603622 nova_compute[221301]: 2026-01-31 08:12:54.790 221324 DEBUG nova.network.neutron [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Updating instance_info_cache with network_info: [{"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:12:54 np0005603622 nova_compute[221301]: 2026-01-31 08:12:54.837 221324 DEBUG oslo_concurrency.lockutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.028 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.051 221324 DEBUG nova.virt.libvirt.driver [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.052 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Creating file /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/a6cfe8e6f99c452196b548ac0efa92d6.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.052 221324 DEBUG oslo_concurrency.processutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/a6cfe8e6f99c452196b548ac0efa92d6.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.482 221324 DEBUG oslo_concurrency.processutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/a6cfe8e6f99c452196b548ac0efa92d6.tmp" returned: 1 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.483 221324 DEBUG oslo_concurrency.processutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6/a6cfe8e6f99c452196b548ac0efa92d6.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.483 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Creating directory /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.484 221324 DEBUG oslo_concurrency.processutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.683 221324 DEBUG oslo_concurrency.processutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/9e88446e-2147-4f66-9f77-23949a27f7e6" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:12:55 np0005603622 nova_compute[221301]: 2026-01-31 08:12:55.689 221324 DEBUG nova.virt.libvirt.driver [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:12:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:12:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:55.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:12:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:56.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:56 np0005603622 nova_compute[221301]: 2026-01-31 08:12:56.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:12:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:57.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:12:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:12:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:12:58.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:12:58 np0005603622 kernel: tap4d96f0c7-92 (unregistering): left promiscuous mode
Jan 31 03:12:58 np0005603622 NetworkManager[49080]: <info>  [1769847178.2048] device (tap4d96f0c7-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00338|binding|INFO|Releasing lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb from this chassis (sb_readonly=0)
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00339|binding|INFO|Setting lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb down in Southbound
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00340|binding|INFO|Removing iface tap4d96f0c7-92 ovn-installed in OVS
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.215 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 podman[253680]: 2026-01-31 08:12:58.226785874 +0000 UTC m=+0.129058792 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.236 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:73:b2 10.100.0.13'], port_security=['fa:16:3e:39:73:b2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9e88446e-2147-4f66-9f77-23949a27f7e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.238 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.239 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.242 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dce7094c-9fc9-4ede-baf6-b75f58bfde3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.243 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace which is not needed anymore#033[00m
Jan 31 03:12:58 np0005603622 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 31 03:12:58 np0005603622 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005a.scope: Consumed 14.045s CPU time.
Jan 31 03:12:58 np0005603622 systemd-machined[190406]: Machine qemu-38-instance-0000005a terminated.
Jan 31 03:12:58 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[253617]: [NOTICE]   (253621) : haproxy version is 2.8.14-c23fe91
Jan 31 03:12:58 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[253617]: [NOTICE]   (253621) : path to executable is /usr/sbin/haproxy
Jan 31 03:12:58 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[253617]: [WARNING]  (253621) : Exiting Master process...
Jan 31 03:12:58 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[253617]: [ALERT]    (253621) : Current worker (253623) exited with code 143 (Terminated)
Jan 31 03:12:58 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[253617]: [WARNING]  (253621) : All workers exited. Exiting... (0)
Jan 31 03:12:58 np0005603622 systemd[1]: libpod-a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5.scope: Deactivated successfully.
Jan 31 03:12:58 np0005603622 podman[253731]: 2026-01-31 08:12:58.395425119 +0000 UTC m=+0.068336188 container died a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:12:58 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5-userdata-shm.mount: Deactivated successfully.
Jan 31 03:12:58 np0005603622 kernel: tap4d96f0c7-92: entered promiscuous mode
Jan 31 03:12:58 np0005603622 NetworkManager[49080]: <info>  [1769847178.4329] manager: (tap4d96f0c7-92): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Jan 31 03:12:58 np0005603622 kernel: tap4d96f0c7-92 (unregistering): left promiscuous mode
Jan 31 03:12:58 np0005603622 systemd[1]: var-lib-containers-storage-overlay-1d78aaa1757cc14aedabb257e7cd8832bc1571f7189eaf6403e0707b3d628424-merged.mount: Deactivated successfully.
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00341|binding|INFO|Claiming lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for this chassis.
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.437 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00342|binding|INFO|4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb: Claiming fa:16:3e:39:73:b2 10.100.0.13
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.451 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:73:b2 10.100.0.13'], port_security=['fa:16:3e:39:73:b2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9e88446e-2147-4f66-9f77-23949a27f7e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00343|binding|INFO|Setting lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb ovn-installed in OVS
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00344|binding|INFO|Setting lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb up in Southbound
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00345|binding|INFO|Releasing lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb from this chassis (sb_readonly=1)
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00346|if_status|INFO|Not setting lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb down as sb is readonly
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00347|binding|INFO|Removing iface tap4d96f0c7-92 ovn-installed in OVS
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00348|binding|INFO|Releasing lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb from this chassis (sb_readonly=0)
Jan 31 03:12:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:12:58Z|00349|binding|INFO|Setting lport 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb down in Southbound
Jan 31 03:12:58 np0005603622 podman[253731]: 2026-01-31 08:12:58.471755452 +0000 UTC m=+0.144666491 container cleanup a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.472 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 systemd[1]: libpod-conmon-a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5.scope: Deactivated successfully.
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.489 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:73:b2 10.100.0.13'], port_security=['fa:16:3e:39:73:b2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9e88446e-2147-4f66-9f77-23949a27f7e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:12:58 np0005603622 podman[253765]: 2026-01-31 08:12:58.578225586 +0000 UTC m=+0.079533420 container remove a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.584 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0cce22-db29-400f-9463-f4961189b909]: (4, ('Sat Jan 31 08:12:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5)\na67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5\nSat Jan 31 08:12:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (a67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5)\na67e3916e606a1979f941f3bb8c8bf4924df7399d7291c1e56111427aa238ad5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.586 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9aeedc73-e1f5-4ce6-be94-151b4cd100b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.588 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 kernel: tapf218695f-c0: left promiscuous mode
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.644 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.650 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c20447b6-7934-44e6-b49d-f5cbc80eadc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.665 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a2fcbadc-bc7d-48f7-846f-556a0abd426e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.667 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[33dbd1d2-abb4-4eea-9c17-f553f9fa64f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.686 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e4fe94-2546-4771-944a-177dc04911a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640474, 'reachable_time': 25217, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253784, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.695 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.695 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c27dc04d-d70d-4161-8be4-209d0922e08a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 systemd[1]: run-netns-ovnmeta\x2df218695f\x2dc744\x2d4bd8\x2db2d8\x2d122a920c7ca0.mount: Deactivated successfully.
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.697 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.700 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.701 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[99b81b0e-3962-438a-a539-d7bb62b19304]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.702 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.705 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:12:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:12:58.706 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b1a050-285a-495b-b0b9-052e603791c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.710 221324 INFO nova.virt.libvirt.driver [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.719 221324 INFO nova.virt.libvirt.driver [-] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Instance destroyed successfully.#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.721 221324 DEBUG nova.virt.libvirt.vif [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:12:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2042269499',display_name='tempest-ServerDiskConfigTestJSON-server-2042269499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2042269499',id=90,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:12:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-619dusqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',ima
ge_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:12:47Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=9e88446e-2147-4f66-9f77-23949a27f7e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "vif_mac": "fa:16:3e:39:73:b2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.721 221324 DEBUG nova.network.os_vif_util [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "vif_mac": "fa:16:3e:39:73:b2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.723 221324 DEBUG nova.network.os_vif_util [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.724 221324 DEBUG os_vif [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.727 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.728 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d96f0c7-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.730 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.737 221324 INFO os_vif [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92')#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.744 221324 DEBUG nova.virt.libvirt.driver [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] skipping disk for instance-0000005a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:12:58 np0005603622 nova_compute[221301]: 2026-01-31 08:12:58.745 221324 DEBUG nova.virt.libvirt.driver [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] skipping disk for instance-0000005a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.281 221324 DEBUG neutronclient.v2_0.client [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.457 221324 DEBUG nova.compute.manager [req-ef06d781-b2f9-4936-8313-0717905761a6 req-828772f8-9f0f-4bc0-b426-0fc3580fda22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-unplugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.458 221324 DEBUG oslo_concurrency.lockutils [req-ef06d781-b2f9-4936-8313-0717905761a6 req-828772f8-9f0f-4bc0-b426-0fc3580fda22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.458 221324 DEBUG oslo_concurrency.lockutils [req-ef06d781-b2f9-4936-8313-0717905761a6 req-828772f8-9f0f-4bc0-b426-0fc3580fda22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.458 221324 DEBUG oslo_concurrency.lockutils [req-ef06d781-b2f9-4936-8313-0717905761a6 req-828772f8-9f0f-4bc0-b426-0fc3580fda22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.459 221324 DEBUG nova.compute.manager [req-ef06d781-b2f9-4936-8313-0717905761a6 req-828772f8-9f0f-4bc0-b426-0fc3580fda22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-unplugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.459 221324 WARNING nova.compute.manager [req-ef06d781-b2f9-4936-8313-0717905761a6 req-828772f8-9f0f-4bc0-b426-0fc3580fda22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-unplugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.571 221324 DEBUG oslo_concurrency.lockutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.572 221324 DEBUG oslo_concurrency.lockutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:12:59 np0005603622 nova_compute[221301]: 2026-01-31 08:12:59.572 221324 DEBUG oslo_concurrency.lockutils [None req-8ee045f2-a13d-45b8-8a67-60816e85fcf6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:12:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:12:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:12:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:12:59.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:00 np0005603622 nova_compute[221301]: 2026-01-31 08:13:00.031 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:00.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:00.708 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:01 np0005603622 podman[253785]: 2026-01-31 08:13:01.150309279 +0000 UTC m=+0.066158931 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:13:01 np0005603622 nova_compute[221301]: 2026-01-31 08:13:01.700 221324 DEBUG nova.compute.manager [req-9ecb4aa9-a718-46a5-8bff-a571aec9b67d req-e6e7529a-e1d4-4e98-b294-b9ac3520db18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:01 np0005603622 nova_compute[221301]: 2026-01-31 08:13:01.701 221324 DEBUG oslo_concurrency.lockutils [req-9ecb4aa9-a718-46a5-8bff-a571aec9b67d req-e6e7529a-e1d4-4e98-b294-b9ac3520db18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:01 np0005603622 nova_compute[221301]: 2026-01-31 08:13:01.702 221324 DEBUG oslo_concurrency.lockutils [req-9ecb4aa9-a718-46a5-8bff-a571aec9b67d req-e6e7529a-e1d4-4e98-b294-b9ac3520db18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:01 np0005603622 nova_compute[221301]: 2026-01-31 08:13:01.702 221324 DEBUG oslo_concurrency.lockutils [req-9ecb4aa9-a718-46a5-8bff-a571aec9b67d req-e6e7529a-e1d4-4e98-b294-b9ac3520db18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:01 np0005603622 nova_compute[221301]: 2026-01-31 08:13:01.702 221324 DEBUG nova.compute.manager [req-9ecb4aa9-a718-46a5-8bff-a571aec9b67d req-e6e7529a-e1d4-4e98-b294-b9ac3520db18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:01 np0005603622 nova_compute[221301]: 2026-01-31 08:13:01.703 221324 WARNING nova.compute.manager [req-9ecb4aa9-a718-46a5-8bff-a571aec9b67d req-e6e7529a-e1d4-4e98-b294-b9ac3520db18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:13:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:01.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:13:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:02.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:13:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:02 np0005603622 nova_compute[221301]: 2026-01-31 08:13:02.672 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:03 np0005603622 nova_compute[221301]: 2026-01-31 08:13:03.767 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:03.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:04.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.180 221324 DEBUG nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.181 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.181 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.181 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.181 221324 DEBUG nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.182 221324 WARNING nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.182 221324 DEBUG nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.182 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.182 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.182 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.183 221324 DEBUG nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.183 221324 WARNING nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.183 221324 DEBUG nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-unplugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.183 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.183 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.184 221324 DEBUG oslo_concurrency.lockutils [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.184 221324 DEBUG nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-unplugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:04 np0005603622 nova_compute[221301]: 2026-01-31 08:13:04.184 221324 WARNING nova.compute.manager [req-0c7f2b1d-5155-4246-9991-b65ec2a4ce38 req-2ca438ba-2436-4b04-b7e6-cf5e267a652d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-unplugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:13:05 np0005603622 nova_compute[221301]: 2026-01-31 08:13:05.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:13:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:13:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:13:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:05.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:06.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.650 221324 DEBUG nova.compute.manager [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.650 221324 DEBUG oslo_concurrency.lockutils [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.651 221324 DEBUG oslo_concurrency.lockutils [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.651 221324 DEBUG oslo_concurrency.lockutils [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.652 221324 DEBUG nova.compute.manager [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.652 221324 WARNING nova.compute.manager [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.652 221324 DEBUG nova.compute.manager [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-changed-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.653 221324 DEBUG nova.compute.manager [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Refreshing instance network info cache due to event network-changed-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.653 221324 DEBUG oslo_concurrency.lockutils [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.653 221324 DEBUG oslo_concurrency.lockutils [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:06 np0005603622 nova_compute[221301]: 2026-01-31 08:13:06.654 221324 DEBUG nova.network.neutron [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Refreshing network info cache for port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:13:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:07.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:08.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:08 np0005603622 nova_compute[221301]: 2026-01-31 08:13:08.770 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:09.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:10.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:10 np0005603622 nova_compute[221301]: 2026-01-31 08:13:10.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:13:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:13:11 np0005603622 nova_compute[221301]: 2026-01-31 08:13:11.083 221324 DEBUG nova.network.neutron [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Updated VIF entry in instance network info cache for port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:13:11 np0005603622 nova_compute[221301]: 2026-01-31 08:13:11.084 221324 DEBUG nova.network.neutron [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Updating instance_info_cache with network_info: [{"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:11 np0005603622 nova_compute[221301]: 2026-01-31 08:13:11.442 221324 DEBUG oslo_concurrency.lockutils [req-8e9d0e71-3397-4d0c-a7d6-7ee985d4eca3 req-ac4aedcf-accc-4810-9fdf-c6231beb7e17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e257 e257: 3 total, 3 up, 3 in
Jan 31 03:13:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:11.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:12.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:12 np0005603622 nova_compute[221301]: 2026-01-31 08:13:12.417 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:13 np0005603622 nova_compute[221301]: 2026-01-31 08:13:13.458 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847178.4570973, 9e88446e-2147-4f66-9f77-23949a27f7e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:13:13 np0005603622 nova_compute[221301]: 2026-01-31 08:13:13.459 221324 INFO nova.compute.manager [-] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:13:13 np0005603622 nova_compute[221301]: 2026-01-31 08:13:13.511 221324 DEBUG nova.compute.manager [None req-91c7e239-fcaa-42f6-925b-83957b8fb5fc - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:13:13 np0005603622 nova_compute[221301]: 2026-01-31 08:13:13.516 221324 DEBUG nova.compute.manager [None req-91c7e239-fcaa-42f6-925b-83957b8fb5fc - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:13:13 np0005603622 nova_compute[221301]: 2026-01-31 08:13:13.545 221324 INFO nova.compute.manager [None req-91c7e239-fcaa-42f6-925b-83957b8fb5fc - - - - - -] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 03:13:13 np0005603622 nova_compute[221301]: 2026-01-31 08:13:13.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:13.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:14.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:14 np0005603622 nova_compute[221301]: 2026-01-31 08:13:14.866 221324 DEBUG nova.compute.manager [req-d9a4943f-cea2-44f1-b901-9d323a3e897b req-446d9094-312d-4ef6-a6ab-620a4a86655d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:14 np0005603622 nova_compute[221301]: 2026-01-31 08:13:14.867 221324 DEBUG oslo_concurrency.lockutils [req-d9a4943f-cea2-44f1-b901-9d323a3e897b req-446d9094-312d-4ef6-a6ab-620a4a86655d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:14 np0005603622 nova_compute[221301]: 2026-01-31 08:13:14.867 221324 DEBUG oslo_concurrency.lockutils [req-d9a4943f-cea2-44f1-b901-9d323a3e897b req-446d9094-312d-4ef6-a6ab-620a4a86655d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:14 np0005603622 nova_compute[221301]: 2026-01-31 08:13:14.867 221324 DEBUG oslo_concurrency.lockutils [req-d9a4943f-cea2-44f1-b901-9d323a3e897b req-446d9094-312d-4ef6-a6ab-620a4a86655d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:14 np0005603622 nova_compute[221301]: 2026-01-31 08:13:14.868 221324 DEBUG nova.compute.manager [req-d9a4943f-cea2-44f1-b901-9d323a3e897b req-446d9094-312d-4ef6-a6ab-620a4a86655d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:14 np0005603622 nova_compute[221301]: 2026-01-31 08:13:14.868 221324 WARNING nova.compute.manager [req-d9a4943f-cea2-44f1-b901-9d323a3e897b req-446d9094-312d-4ef6-a6ab-620a4a86655d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state active and task_state resize_finish.#033[00m
Jan 31 03:13:15 np0005603622 nova_compute[221301]: 2026-01-31 08:13:15.041 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:15 np0005603622 nova_compute[221301]: 2026-01-31 08:13:15.073 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:15.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:16.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:17 np0005603622 nova_compute[221301]: 2026-01-31 08:13:17.078 221324 DEBUG nova.compute.manager [req-b6bc0b19-8d13-433e-8349-b35acf2609bd req-8d6e0f5b-9b17-4270-8956-29dfa6332349 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:17 np0005603622 nova_compute[221301]: 2026-01-31 08:13:17.079 221324 DEBUG oslo_concurrency.lockutils [req-b6bc0b19-8d13-433e-8349-b35acf2609bd req-8d6e0f5b-9b17-4270-8956-29dfa6332349 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:17 np0005603622 nova_compute[221301]: 2026-01-31 08:13:17.079 221324 DEBUG oslo_concurrency.lockutils [req-b6bc0b19-8d13-433e-8349-b35acf2609bd req-8d6e0f5b-9b17-4270-8956-29dfa6332349 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:17 np0005603622 nova_compute[221301]: 2026-01-31 08:13:17.080 221324 DEBUG oslo_concurrency.lockutils [req-b6bc0b19-8d13-433e-8349-b35acf2609bd req-8d6e0f5b-9b17-4270-8956-29dfa6332349 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:17 np0005603622 nova_compute[221301]: 2026-01-31 08:13:17.080 221324 DEBUG nova.compute.manager [req-b6bc0b19-8d13-433e-8349-b35acf2609bd req-8d6e0f5b-9b17-4270-8956-29dfa6332349 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] No waiting events found dispatching network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:17 np0005603622 nova_compute[221301]: 2026-01-31 08:13:17.080 221324 WARNING nova.compute.manager [req-b6bc0b19-8d13-433e-8349-b35acf2609bd req-8d6e0f5b-9b17-4270-8956-29dfa6332349 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Received unexpected event network-vif-plugged-4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:13:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:17.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:18.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.348 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "9e88446e-2147-4f66-9f77-23949a27f7e6" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.349 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.349 221324 DEBUG nova.compute.manager [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.773 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.984 221324 DEBUG neutronclient.v2_0.client [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.985 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.985 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.986 221324 DEBUG nova.network.neutron [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:13:18 np0005603622 nova_compute[221301]: 2026-01-31 08:13:18.986 221324 DEBUG nova.objects.instance [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'info_cache' on Instance uuid 9e88446e-2147-4f66-9f77-23949a27f7e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:13:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:19.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:20 np0005603622 nova_compute[221301]: 2026-01-31 08:13:20.076 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:20.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:21 np0005603622 nova_compute[221301]: 2026-01-31 08:13:21.811 221324 DEBUG nova.network.neutron [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 9e88446e-2147-4f66-9f77-23949a27f7e6] Updating instance_info_cache with network_info: [{"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:21 np0005603622 nova_compute[221301]: 2026-01-31 08:13:21.824 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:13:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:21.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:13:21 np0005603622 nova_compute[221301]: 2026-01-31 08:13:21.886 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-9e88446e-2147-4f66-9f77-23949a27f7e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:21 np0005603622 nova_compute[221301]: 2026-01-31 08:13:21.887 221324 DEBUG nova.objects.instance [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e88446e-2147-4f66-9f77-23949a27f7e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:13:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:22.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.093 221324 DEBUG nova.storage.rbd_utils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] removing snapshot(nova-resize) on rbd image(9e88446e-2147-4f66-9f77-23949a27f7e6_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:13:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e258 e258: 3 total, 3 up, 3 in
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.911 221324 DEBUG nova.virt.libvirt.vif [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:12:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2042269499',display_name='tempest-ServerDiskConfigTestJSON-server-2042269499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2042269499',id=90,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:13:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-619dusqe',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:13:14Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=9e88446e-2147-4f66-9f77-23949a27f7e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.912 221324 DEBUG nova.network.os_vif_util [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "address": "fa:16:3e:39:73:b2", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d96f0c7-92", "ovs_interfaceid": "4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.914 221324 DEBUG nova.network.os_vif_util [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.914 221324 DEBUG os_vif [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.917 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.918 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d96f0c7-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.918 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.921 221324 INFO os_vif [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:73:b2,bridge_name='br-int',has_traffic_filtering=True,id=4d96f0c7-92c2-4d4d-9f1b-b26f7d707dbb,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d96f0c7-92')#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.922 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:22 np0005603622 nova_compute[221301]: 2026-01-31 08:13:22.922 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.074 221324 DEBUG oslo_concurrency.processutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.422 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:13:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3827048267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.577 221324 DEBUG oslo_concurrency.processutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.585 221324 DEBUG nova.compute.provider_tree [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.652 221324 DEBUG nova.scheduler.client.report [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.721 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.776 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:23.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:23 np0005603622 nova_compute[221301]: 2026-01-31 08:13:23.910 221324 INFO nova.scheduler.client.report [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Deleted allocation for migration 7ca3af68-bbd7-45fb-8062-e32509f344f5#033[00m
Jan 31 03:13:24 np0005603622 nova_compute[221301]: 2026-01-31 08:13:24.052 221324 DEBUG oslo_concurrency.lockutils [None req-30340908-87b8-4b19-9629-1bd60465d60b 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "9e88446e-2147-4f66-9f77-23949a27f7e6" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:24.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:25 np0005603622 nova_compute[221301]: 2026-01-31 08:13:25.078 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:25.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:26.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:27 np0005603622 nova_compute[221301]: 2026-01-31 08:13:27.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:27.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:28.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:28 np0005603622 nova_compute[221301]: 2026-01-31 08:13:28.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:28 np0005603622 nova_compute[221301]: 2026-01-31 08:13:28.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:13:28 np0005603622 nova_compute[221301]: 2026-01-31 08:13:28.779 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:29 np0005603622 podman[254045]: 2026-01-31 08:13:29.222847289 +0000 UTC m=+0.135629948 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:13:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:29.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:30.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.442 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.443 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.595 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.745 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.747 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.770 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:13:30 np0005603622 nova_compute[221301]: 2026-01-31 08:13:30.770 221324 INFO nova.compute.claims [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:30.866 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:30.867 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:30.867 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.196 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:13:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3251864214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.623 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.629 221324 DEBUG nova.compute.provider_tree [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.651 221324 DEBUG nova.scheduler.client.report [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.736 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.737 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:13:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:31.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.906 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.909 221324 DEBUG nova.network.neutron [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:13:31 np0005603622 nova_compute[221301]: 2026-01-31 08:13:31.985 221324 INFO nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.061 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:13:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:32.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:32 np0005603622 podman[254093]: 2026-01-31 08:13:32.170621463 +0000 UTC m=+0.083189229 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.180 221324 INFO nova.virt.block_device [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Booting with volume a072271f-7f43-470f-93ee-c6396eaabeba at /dev/vda#033[00m
Jan 31 03:13:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.291 221324 DEBUG nova.policy [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12a823bd7c6e4cf492ebf6c1d002a91f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c03fec1b3664105996aa979e226d8f8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.451 221324 DEBUG os_brick.utils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.455 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.469 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.470 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c30b13-f436-449f-a0e8-c429de986391]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.473 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.482 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.483 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[7ffa88bf-d22c-413d-886d-eb99a90a46c1]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.485 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.497 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.497 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[0803ec16-22cb-4eb2-813a-73ce75865dda]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.500 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[99773de0-70dc-46ec-b9d6-7cf996986932]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.501 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.533 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.537 221324 DEBUG os_brick.initiator.connectors.lightos [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.537 221324 DEBUG os_brick.initiator.connectors.lightos [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.538 221324 DEBUG os_brick.initiator.connectors.lightos [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.538 221324 DEBUG os_brick.utils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] <== get_connector_properties: return (85ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:13:32 np0005603622 nova_compute[221301]: 2026-01-31 08:13:32.539 221324 DEBUG nova.virt.block_device [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating existing volume attachment record: 8fc86e4a-4e40-4c18-a9f7-f380e5c59b91 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:13:33 np0005603622 nova_compute[221301]: 2026-01-31 08:13:33.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:33 np0005603622 nova_compute[221301]: 2026-01-31 08:13:33.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:13:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4159150135' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:13:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 e259: 3 total, 3 up, 3 in
Jan 31 03:13:33 np0005603622 nova_compute[221301]: 2026-01-31 08:13:33.780 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:33.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:34.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:34 np0005603622 nova_compute[221301]: 2026-01-31 08:13:34.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:35 np0005603622 nova_compute[221301]: 2026-01-31 08:13:35.118 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:35 np0005603622 nova_compute[221301]: 2026-01-31 08:13:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:35 np0005603622 nova_compute[221301]: 2026-01-31 08:13:35.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:35.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:36.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:36 np0005603622 nova_compute[221301]: 2026-01-31 08:13:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:36 np0005603622 nova_compute[221301]: 2026-01-31 08:13:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:37.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.101 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.101 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.102 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.102 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.102 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:38.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.233 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.238 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.239 221324 INFO nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Creating image(s)#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.240 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.240 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Ensure instance console log exists: /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.241 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.242 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.243 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.570 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.663 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.663 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.777 221324 DEBUG nova.network.neutron [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Successfully created port: f1b92dea-fcbf-4fdc-a875-d4273610d4c5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.784 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.911 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.913 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4416MB free_disk=20.875896453857422GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.913 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:38 np0005603622 nova_compute[221301]: 2026-01-31 08:13:38.914 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:39 np0005603622 nova_compute[221301]: 2026-01-31 08:13:39.209 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 183fba4a-610d-4b96-a12f-996f9f2dc30b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:13:39 np0005603622 nova_compute[221301]: 2026-01-31 08:13:39.209 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 8a8d8223-9051-487a-a4d6-a33911813797 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:13:39 np0005603622 nova_compute[221301]: 2026-01-31 08:13:39.210 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:13:39 np0005603622 nova_compute[221301]: 2026-01-31 08:13:39.210 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:13:39 np0005603622 nova_compute[221301]: 2026-01-31 08:13:39.275 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:13:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3881893028' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:13:39 np0005603622 nova_compute[221301]: 2026-01-31 08:13:39.734 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:39 np0005603622 nova_compute[221301]: 2026-01-31 08:13:39.743 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:13:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:39.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:40.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:40 np0005603622 nova_compute[221301]: 2026-01-31 08:13:40.149 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:40 np0005603622 nova_compute[221301]: 2026-01-31 08:13:40.222 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:13:40 np0005603622 nova_compute[221301]: 2026-01-31 08:13:40.376 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:13:40 np0005603622 nova_compute[221301]: 2026-01-31 08:13:40.377 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.463s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.379 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.380 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.380 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.489 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.632 221324 DEBUG nova.network.neutron [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Successfully updated port: f1b92dea-fcbf-4fdc-a875-d4273610d4c5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.742 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.743 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquired lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.743 221324 DEBUG nova.network.neutron [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:13:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:41.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:41 np0005603622 nova_compute[221301]: 2026-01-31 08:13:41.984 221324 DEBUG nova.network.neutron [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:13:42 np0005603622 nova_compute[221301]: 2026-01-31 08:13:42.002 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:42 np0005603622 nova_compute[221301]: 2026-01-31 08:13:42.003 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:42 np0005603622 nova_compute[221301]: 2026-01-31 08:13:42.003 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:13:42 np0005603622 nova_compute[221301]: 2026-01-31 08:13:42.004 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:13:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:42.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.296 221324 DEBUG nova.network.neutron [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating instance_info_cache with network_info: [{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.433 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Releasing lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.433 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Instance network_info: |[{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.440 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Start _get_guest_xml network_info=[{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'boot_index': 0, 'attachment_id': '8fc86e4a-4e40-4c18-a9f7-f380e5c59b91', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-a072271f-7f43-470f-93ee-c6396eaabeba', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'a072271f-7f43-470f-93ee-c6396eaabeba', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '8a8d8223-9051-487a-a4d6-a33911813797', 'attached_at': '', 'detached_at': '', 'volume_id': 'a072271f-7f43-470f-93ee-c6396eaabeba', 'serial': 'a072271f-7f43-470f-93ee-c6396eaabeba'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.448 221324 WARNING nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.460 221324 DEBUG nova.virt.libvirt.host [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.461 221324 DEBUG nova.virt.libvirt.host [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.465 221324 DEBUG nova.virt.libvirt.host [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.467 221324 DEBUG nova.virt.libvirt.host [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.469 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.470 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.471 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.471 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.472 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.472 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.473 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.473 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.474 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.474 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.475 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.475 221324 DEBUG nova.virt.hardware [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.525 221324 DEBUG nova.storage.rbd_utils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 8a8d8223-9051-487a-a4d6-a33911813797_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.531 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.570 221324 DEBUG nova.compute.manager [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-changed-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.571 221324 DEBUG nova.compute.manager [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Refreshing instance network info cache due to event network-changed-f1b92dea-fcbf-4fdc-a875-d4273610d4c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.571 221324 DEBUG oslo_concurrency.lockutils [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.571 221324 DEBUG oslo_concurrency.lockutils [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.571 221324 DEBUG nova.network.neutron [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Refreshing network info cache for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:13:43 np0005603622 nova_compute[221301]: 2026-01-31 08:13:43.790 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:43.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:13:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4222082381' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:13:44 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.004 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:44.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:44 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.521 221324 DEBUG nova.virt.libvirt.vif [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:13:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-865768110',display_name='tempest-ServerActionsTestOtherA-server-865768110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-865768110',id=92,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBmeoiqwLriErudm3CCwwTVCZJNSn8sMBdf3DG0cLKOiUOsjd6g3ELaDiv5VtlA1MtIeSB0EtvnrgQQVESwaz68a/c+EzXdmxnZNxj//jq+4bu6dBh/9tuewDagOu34T9w==',key_name='tempest-keypair-710425515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-y7gogkds',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:13:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=8a8d8223-9051-487a-a4d6-a33911813797,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:13:44 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.522 221324 DEBUG nova.network.os_vif_util [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:13:44 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.523 221324 DEBUG nova.network.os_vif_util [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:13:44 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.524 221324 DEBUG nova.objects.instance [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a8d8223-9051-487a-a4d6-a33911813797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:13:44 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.997 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:13:44 np0005603622 nova_compute[221301]:  <uuid>8a8d8223-9051-487a-a4d6-a33911813797</uuid>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:  <name>instance-0000005c</name>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:13:44 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestOtherA-server-865768110</nova:name>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:13:43</nova:creationTime>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:13:44 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:        <nova:user uuid="12a823bd7c6e4cf492ebf6c1d002a91f">tempest-ServerActionsTestOtherA-1768827668-project-member</nova:user>
Jan 31 03:13:44 np0005603622 nova_compute[221301]:        <nova:project uuid="9c03fec1b3664105996aa979e226d8f8">tempest-ServerActionsTestOtherA-1768827668</nova:project>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <nova:port uuid="f1b92dea-fcbf-4fdc-a875-d4273610d4c5">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <entry name="serial">8a8d8223-9051-487a-a4d6-a33911813797</entry>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <entry name="uuid">8a8d8223-9051-487a-a4d6-a33911813797</entry>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8a8d8223-9051-487a-a4d6-a33911813797_disk.config">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-a072271f-7f43-470f-93ee-c6396eaabeba">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <serial>a072271f-7f43-470f-93ee-c6396eaabeba</serial>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:33:f6:b6"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <target dev="tapf1b92dea-fc"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/console.log" append="off"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:13:45 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:13:45 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:13:45 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:13:45 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.998 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Preparing to wait for external event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.998 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.999 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:44.999 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.001 221324 DEBUG nova.virt.libvirt.vif [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:13:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-865768110',display_name='tempest-ServerActionsTestOtherA-server-865768110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-865768110',id=92,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBmeoiqwLriErudm3CCwwTVCZJNSn8sMBdf3DG0cLKOiUOsjd6g3ELaDiv5VtlA1MtIeSB0EtvnrgQQVESwaz68a/c+EzXdmxnZNxj//jq+4bu6dBh/9tuewDagOu34T9w==',key_name='tempest-keypair-710425515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-y7gogkds',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:13:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=8a8d8223-9051-487a-a4d6-a33911813797,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.001 221324 DEBUG nova.network.os_vif_util [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.002 221324 DEBUG nova.network.os_vif_util [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.003 221324 DEBUG os_vif [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.004 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.005 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.005 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.011 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.011 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1b92dea-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.012 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf1b92dea-fc, col_values=(('external_ids', {'iface-id': 'f1b92dea-fcbf-4fdc-a875-d4273610d4c5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:f6:b6', 'vm-uuid': '8a8d8223-9051-487a-a4d6-a33911813797'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:45 np0005603622 NetworkManager[49080]: <info>  [1769847225.0593] manager: (tapf1b92dea-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.063 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.067 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.068 221324 INFO os_vif [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc')#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.153 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.505 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.506 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.506 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] No VIF found with MAC fa:16:3e:33:f6:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.507 221324 INFO nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Using config drive#033[00m
Jan 31 03:13:45 np0005603622 nova_compute[221301]: 2026-01-31 08:13:45.570 221324 DEBUG nova.storage.rbd_utils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 8a8d8223-9051-487a-a4d6-a33911813797_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:45.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:46.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.008 221324 INFO nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Creating config drive at /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/disk.config#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.013 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp52bbch2e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.144 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp52bbch2e" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.176 221324 DEBUG nova.storage.rbd_utils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 8a8d8223-9051-487a-a4d6-a33911813797_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.180 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/disk.config 8a8d8223-9051-487a-a4d6-a33911813797_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:13:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.388 221324 DEBUG oslo_concurrency.processutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/disk.config 8a8d8223-9051-487a-a4d6-a33911813797_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.389 221324 INFO nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Deleting local config drive /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/disk.config because it was imported into RBD.#033[00m
Jan 31 03:13:47 np0005603622 kernel: tapf1b92dea-fc: entered promiscuous mode
Jan 31 03:13:47 np0005603622 NetworkManager[49080]: <info>  [1769847227.4570] manager: (tapf1b92dea-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:13:47Z|00350|binding|INFO|Claiming lport f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for this chassis.
Jan 31 03:13:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:13:47Z|00351|binding|INFO|f1b92dea-fcbf-4fdc-a875-d4273610d4c5: Claiming fa:16:3e:33:f6:b6 10.100.0.3
Jan 31 03:13:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:13:47Z|00352|binding|INFO|Setting lport f1b92dea-fcbf-4fdc-a875-d4273610d4c5 ovn-installed in OVS
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.469 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.472 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:47 np0005603622 systemd-udevd[254274]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:13:47 np0005603622 systemd-machined[190406]: New machine qemu-39-instance-0000005c.
Jan 31 03:13:47 np0005603622 NetworkManager[49080]: <info>  [1769847227.5175] device (tapf1b92dea-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:13:47 np0005603622 NetworkManager[49080]: <info>  [1769847227.5184] device (tapf1b92dea-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:13:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:13:47Z|00353|binding|INFO|Setting lport f1b92dea-fcbf-4fdc-a875-d4273610d4c5 up in Southbound
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.525 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:f6:b6 10.100.0.3'], port_security=['fa:16:3e:33:f6:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8a8d8223-9051-487a-a4d6-a33911813797', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4d0e926d-7b9d-4115-9719-7f0d71edaace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f1b92dea-fcbf-4fdc-a875-d4273610d4c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.528 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f1b92dea-fcbf-4fdc-a875-d4273610d4c5 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 bound to our chassis#033[00m
Jan 31 03:13:47 np0005603622 systemd[1]: Started Virtual Machine qemu-39-instance-0000005c.
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.533 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.552 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0abab7f-70c4-4344-bd4a-230ef8df7b71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.600 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[49d2b60c-2a88-4315-93e8-b5f507393cfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.605 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e73469-b343-4216-be99-45f5b5a855d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.643 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[53dd9b3e-a4e3-4466-b1cc-52982a13680f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.661 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[09b34950-72cc-4a8f-8c0d-866c10acc780]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625349, 'reachable_time': 18174, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254291, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.682 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a55a859-2484-4096-b8d1-68e6a297099d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f564452-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625360, 'tstamp': 625360}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254292, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f564452-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625363, 'tstamp': 625363}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254292, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.686 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.688 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.689 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.690 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.690 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.691 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:13:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:13:47.692 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:13:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:47.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.974 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847227.9738681, 8a8d8223-9051-487a-a4d6-a33911813797 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:13:47 np0005603622 nova_compute[221301]: 2026-01-31 08:13:47.975 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] VM Started (Lifecycle Event)#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.074 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.081 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847227.9740214, 8a8d8223-9051-487a-a4d6-a33911813797 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.081 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:13:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:48.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.163 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.168 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.377 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.484 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.551 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.552 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.939 221324 DEBUG nova.network.neutron [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updated VIF entry in instance network info cache for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:13:48 np0005603622 nova_compute[221301]: 2026-01-31 08:13:48.939 221324 DEBUG nova.network.neutron [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating instance_info_cache with network_info: [{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.142 221324 DEBUG oslo_concurrency.lockutils [req-097265a3-c6ba-4eb8-9f7c-a955f35ade99 req-f88e6e28-788b-4497-9a67-a59bfd517b84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.693 221324 DEBUG nova.compute.manager [req-50693066-68d8-4b27-ba23-26ba5eeda023 req-17d301dc-1f9a-42cd-a209-c0ed348e03a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.694 221324 DEBUG oslo_concurrency.lockutils [req-50693066-68d8-4b27-ba23-26ba5eeda023 req-17d301dc-1f9a-42cd-a209-c0ed348e03a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.695 221324 DEBUG oslo_concurrency.lockutils [req-50693066-68d8-4b27-ba23-26ba5eeda023 req-17d301dc-1f9a-42cd-a209-c0ed348e03a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.696 221324 DEBUG oslo_concurrency.lockutils [req-50693066-68d8-4b27-ba23-26ba5eeda023 req-17d301dc-1f9a-42cd-a209-c0ed348e03a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.696 221324 DEBUG nova.compute.manager [req-50693066-68d8-4b27-ba23-26ba5eeda023 req-17d301dc-1f9a-42cd-a209-c0ed348e03a6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Processing event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.697 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.703 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847229.7030663, 8a8d8223-9051-487a-a4d6-a33911813797 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.704 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.707 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.712 221324 INFO nova.virt.libvirt.driver [-] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Instance spawned successfully.#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.713 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.822 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.835 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.840 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.850 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.851 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.852 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.852 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.853 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:13:49 np0005603622 nova_compute[221301]: 2026-01-31 08:13:49.853 221324 DEBUG nova.virt.libvirt.driver [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:13:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:49.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:50 np0005603622 nova_compute[221301]: 2026-01-31 08:13:50.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:50.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:50 np0005603622 nova_compute[221301]: 2026-01-31 08:13:50.153 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:50 np0005603622 nova_compute[221301]: 2026-01-31 08:13:50.265 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:13:50 np0005603622 nova_compute[221301]: 2026-01-31 08:13:50.425 221324 INFO nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Took 12.19 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:13:50 np0005603622 nova_compute[221301]: 2026-01-31 08:13:50.426 221324 DEBUG nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:13:50 np0005603622 nova_compute[221301]: 2026-01-31 08:13:50.745 221324 INFO nova.compute.manager [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Took 20.05 seconds to build instance.#033[00m
Jan 31 03:13:50 np0005603622 nova_compute[221301]: 2026-01-31 08:13:50.872 221324 DEBUG oslo_concurrency.lockutils [None req-67d386cb-6540-4588-a99f-d49c1923b274 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:51.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:51 np0005603622 nova_compute[221301]: 2026-01-31 08:13:51.967 221324 DEBUG nova.compute.manager [req-d9cab9b2-2c5f-4856-87d3-b187ead30e4f req-bbdcec08-cfe4-4d94-8b0d-c7124b2694f7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:51 np0005603622 nova_compute[221301]: 2026-01-31 08:13:51.968 221324 DEBUG oslo_concurrency.lockutils [req-d9cab9b2-2c5f-4856-87d3-b187ead30e4f req-bbdcec08-cfe4-4d94-8b0d-c7124b2694f7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:13:51 np0005603622 nova_compute[221301]: 2026-01-31 08:13:51.969 221324 DEBUG oslo_concurrency.lockutils [req-d9cab9b2-2c5f-4856-87d3-b187ead30e4f req-bbdcec08-cfe4-4d94-8b0d-c7124b2694f7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:13:51 np0005603622 nova_compute[221301]: 2026-01-31 08:13:51.969 221324 DEBUG oslo_concurrency.lockutils [req-d9cab9b2-2c5f-4856-87d3-b187ead30e4f req-bbdcec08-cfe4-4d94-8b0d-c7124b2694f7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:13:51 np0005603622 nova_compute[221301]: 2026-01-31 08:13:51.969 221324 DEBUG nova.compute.manager [req-d9cab9b2-2c5f-4856-87d3-b187ead30e4f req-bbdcec08-cfe4-4d94-8b0d-c7124b2694f7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] No waiting events found dispatching network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:13:51 np0005603622 nova_compute[221301]: 2026-01-31 08:13:51.970 221324 WARNING nova.compute.manager [req-d9cab9b2-2c5f-4856-87d3-b187ead30e4f req-bbdcec08-cfe4-4d94-8b0d-c7124b2694f7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received unexpected event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:13:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:52.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:53.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:54.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:55 np0005603622 nova_compute[221301]: 2026-01-31 08:13:55.101 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:55 np0005603622 nova_compute[221301]: 2026-01-31 08:13:55.134 221324 DEBUG nova.compute.manager [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-changed-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:13:55 np0005603622 nova_compute[221301]: 2026-01-31 08:13:55.134 221324 DEBUG nova.compute.manager [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Refreshing instance network info cache due to event network-changed-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:13:55 np0005603622 nova_compute[221301]: 2026-01-31 08:13:55.135 221324 DEBUG oslo_concurrency.lockutils [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:13:55 np0005603622 nova_compute[221301]: 2026-01-31 08:13:55.135 221324 DEBUG oslo_concurrency.lockutils [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:13:55 np0005603622 nova_compute[221301]: 2026-01-31 08:13:55.135 221324 DEBUG nova.network.neutron [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Refreshing network info cache for port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:13:55 np0005603622 nova_compute[221301]: 2026-01-31 08:13:55.163 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:13:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:55.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:13:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:56.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:13:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:13:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:57.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:13:58.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.738034) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238738163, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1189, "num_deletes": 257, "total_data_size": 2412117, "memory_usage": 2451680, "flush_reason": "Manual Compaction"}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238748913, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 1590236, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42717, "largest_seqno": 43901, "table_properties": {"data_size": 1585069, "index_size": 2627, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11503, "raw_average_key_size": 19, "raw_value_size": 1574440, "raw_average_value_size": 2695, "num_data_blocks": 116, "num_entries": 584, "num_filter_entries": 584, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847149, "oldest_key_time": 1769847149, "file_creation_time": 1769847238, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 11022 microseconds, and 6837 cpu microseconds.
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.749067) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 1590236 bytes OK
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.749127) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.750519) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.750542) EVENT_LOG_v1 {"time_micros": 1769847238750534, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.750568) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 2406405, prev total WAL file size 2406405, number of live WAL files 2.
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.751956) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323535' seq:72057594037927935, type:22 .. '6C6F676D0031353037' seq:0, type:0; will stop at (end)
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(1552KB)], [81(8909KB)]
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238752028, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 10713464, "oldest_snapshot_seqno": -1}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6794 keys, 10575714 bytes, temperature: kUnknown
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238818990, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 10575714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10530274, "index_size": 27344, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17029, "raw_key_size": 175184, "raw_average_key_size": 25, "raw_value_size": 10408861, "raw_average_value_size": 1532, "num_data_blocks": 1085, "num_entries": 6794, "num_filter_entries": 6794, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847238, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.819675) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 10575714 bytes
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.821147) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.9 rd, 156.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 8.7 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(13.4) write-amplify(6.7) OK, records in: 7325, records dropped: 531 output_compression: NoCompression
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.821180) EVENT_LOG_v1 {"time_micros": 1769847238821164, "job": 50, "event": "compaction_finished", "compaction_time_micros": 67441, "compaction_time_cpu_micros": 45241, "output_level": 6, "num_output_files": 1, "total_output_size": 10575714, "num_input_records": 7325, "num_output_records": 6794, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238822145, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847238823845, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.751808) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.823986) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.823990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.823992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.823993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:13:58.823995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:13:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:13:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:13:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:13:59.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:00 np0005603622 nova_compute[221301]: 2026-01-31 08:14:00.104 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:00.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:00 np0005603622 nova_compute[221301]: 2026-01-31 08:14:00.165 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:00 np0005603622 podman[254336]: 2026-01-31 08:14:00.215928672 +0000 UTC m=+0.129881244 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:14:01 np0005603622 nova_compute[221301]: 2026-01-31 08:14:01.432 221324 DEBUG nova.compute.manager [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-changed-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:01 np0005603622 nova_compute[221301]: 2026-01-31 08:14:01.433 221324 DEBUG nova.compute.manager [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Refreshing instance network info cache due to event network-changed-f1b92dea-fcbf-4fdc-a875-d4273610d4c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:14:01 np0005603622 nova_compute[221301]: 2026-01-31 08:14:01.434 221324 DEBUG oslo_concurrency.lockutils [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:01 np0005603622 nova_compute[221301]: 2026-01-31 08:14:01.434 221324 DEBUG oslo_concurrency.lockutils [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:01 np0005603622 nova_compute[221301]: 2026-01-31 08:14:01.434 221324 DEBUG nova.network.neutron [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Refreshing network info cache for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:14:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:01.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:02.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:02 np0005603622 nova_compute[221301]: 2026-01-31 08:14:02.313 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:02.315 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:14:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:02.317 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:14:03 np0005603622 podman[254363]: 2026-01-31 08:14:03.129750325 +0000 UTC m=+0.053969932 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Jan 31 03:14:03 np0005603622 nova_compute[221301]: 2026-01-31 08:14:03.231 221324 DEBUG nova.network.neutron [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updated VIF entry in instance network info cache for port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:14:03 np0005603622 nova_compute[221301]: 2026-01-31 08:14:03.232 221324 DEBUG nova.network.neutron [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:03.320 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:03 np0005603622 nova_compute[221301]: 2026-01-31 08:14:03.516 221324 DEBUG oslo_concurrency.lockutils [req-b73af6fa-ca7b-4c60-a69a-74f627117e0e req-27f5e898-2d5b-40a0-8de0-c5ae96c7bc71 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:03.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:04.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:04 np0005603622 nova_compute[221301]: 2026-01-31 08:14:04.297 221324 DEBUG nova.network.neutron [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updated VIF entry in instance network info cache for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:14:04 np0005603622 nova_compute[221301]: 2026-01-31 08:14:04.298 221324 DEBUG nova.network.neutron [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating instance_info_cache with network_info: [{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:04 np0005603622 nova_compute[221301]: 2026-01-31 08:14:04.530 221324 DEBUG oslo_concurrency.lockutils [req-c2037279-94fc-44a9-ab63-04d0bf1a1c75 req-58505e5d-b04b-489a-96c4-56c56039e321 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:05 np0005603622 nova_compute[221301]: 2026-01-31 08:14:05.149 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:05 np0005603622 nova_compute[221301]: 2026-01-31 08:14:05.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:14:05Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:f6:b6 10.100.0.3
Jan 31 03:14:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:14:05Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:f6:b6 10.100.0.3
Jan 31 03:14:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:05.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:06.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:07.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:14:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:08.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:14:08 np0005603622 nova_compute[221301]: 2026-01-31 08:14:08.475 221324 DEBUG oslo_concurrency.lockutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:08 np0005603622 nova_compute[221301]: 2026-01-31 08:14:08.476 221324 DEBUG oslo_concurrency.lockutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquired lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:08 np0005603622 nova_compute[221301]: 2026-01-31 08:14:08.477 221324 DEBUG nova.network.neutron [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:14:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:09.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:10 np0005603622 nova_compute[221301]: 2026-01-31 08:14:10.152 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:10.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:10 np0005603622 nova_compute[221301]: 2026-01-31 08:14:10.168 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:11.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:14:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:12.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:12 np0005603622 nova_compute[221301]: 2026-01-31 08:14:12.227 221324 DEBUG nova.network.neutron [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating instance_info_cache with network_info: [{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:12 np0005603622 nova_compute[221301]: 2026-01-31 08:14:12.513 221324 DEBUG oslo_concurrency.lockutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Releasing lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.199 221324 DEBUG nova.virt.libvirt.driver [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.199 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Creating file /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/24dd566ca2b8485690a99872e69c68d1.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.200 221324 DEBUG oslo_concurrency.processutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/24dd566ca2b8485690a99872e69c68d1.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.621 221324 DEBUG oslo_concurrency.processutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/24dd566ca2b8485690a99872e69c68d1.tmp" returned: 1 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.623 221324 DEBUG oslo_concurrency.processutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797/24dd566ca2b8485690a99872e69c68d1.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.624 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Creating directory /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.624 221324 DEBUG oslo_concurrency.processutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.836 221324 DEBUG oslo_concurrency.processutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/8a8d8223-9051-487a-a4d6-a33911813797" returned: 0 in 0.211s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:13 np0005603622 nova_compute[221301]: 2026-01-31 08:14:13.843 221324 DEBUG nova.virt.libvirt.driver [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:14:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:13.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:14.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:15 np0005603622 nova_compute[221301]: 2026-01-31 08:14:15.156 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:15 np0005603622 nova_compute[221301]: 2026-01-31 08:14:15.170 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:14:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:15.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:14:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:16.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:16 np0005603622 kernel: tapf1b92dea-fc (unregistering): left promiscuous mode
Jan 31 03:14:16 np0005603622 NetworkManager[49080]: <info>  [1769847256.1900] device (tapf1b92dea-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.197 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 ovn_controller[130109]: 2026-01-31T08:14:16Z|00354|binding|INFO|Releasing lport f1b92dea-fcbf-4fdc-a875-d4273610d4c5 from this chassis (sb_readonly=0)
Jan 31 03:14:16 np0005603622 ovn_controller[130109]: 2026-01-31T08:14:16Z|00355|binding|INFO|Setting lport f1b92dea-fcbf-4fdc-a875-d4273610d4c5 down in Southbound
Jan 31 03:14:16 np0005603622 ovn_controller[130109]: 2026-01-31T08:14:16Z|00356|binding|INFO|Removing iface tapf1b92dea-fc ovn-installed in OVS
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.200 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.203 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Jan 31 03:14:16 np0005603622 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005c.scope: Consumed 14.774s CPU time.
Jan 31 03:14:16 np0005603622 systemd-machined[190406]: Machine qemu-39-instance-0000005c terminated.
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.383 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:f6:b6 10.100.0.3'], port_security=['fa:16:3e:33:f6:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8a8d8223-9051-487a-a4d6-a33911813797', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4d0e926d-7b9d-4115-9719-7f0d71edaace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f1b92dea-fcbf-4fdc-a875-d4273610d4c5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.386 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f1b92dea-fcbf-4fdc-a875-d4273610d4c5 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 unbound from our chassis#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.389 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f564452-5f08-4a1c-921e-f2daee9ec936#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.403 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c94f1445-71d9-4dd6-8f45-bcf626ff8ae2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.442 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[68e78c30-bc76-43c9-b226-b05e61782bee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.446 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5c14dbbf-eff2-49f0-9b80-c498e88b3026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.465 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7deeeafc-d1bf-4c42-8954-b8a6aebd20a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.476 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d725ae70-a533-49ac-ba54-1488798fa9ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f564452-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:23:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625349, 'reachable_time': 18174, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254659, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.491 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7826bd-e0ab-47c0-9722-7b98c777f129]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f564452-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625360, 'tstamp': 625360}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254660, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f564452-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625363, 'tstamp': 625363}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254660, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.493 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.532 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.538 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f564452-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.538 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.539 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f564452-50, col_values=(('external_ids', {'iface-id': '5bb8c1b5-edce-4f6a-8164-58b7d89a3330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:16.540 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.864 221324 INFO nova.virt.libvirt.driver [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.870 221324 INFO nova.virt.libvirt.driver [-] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Instance destroyed successfully.#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.872 221324 DEBUG nova.virt.libvirt.vif [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:13:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-865768110',display_name='tempest-ServerActionsTestOtherA-server-865768110',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-865768110',id=92,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBmeoiqwLriErudm3CCwwTVCZJNSn8sMBdf3DG0cLKOiUOsjd6g3ELaDiv5VtlA1MtIeSB0EtvnrgQQVESwaz68a/c+EzXdmxnZNxj//jq+4bu6dBh/9tuewDagOu34T9w==',key_name='tempest-keypair-710425515',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:13:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-y7gogkds',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:14:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=8a8d8223-9051-487a-a4d6-a33911813797,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-2006849245-network", "vif_mac": "fa:16:3e:33:f6:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.872 221324 DEBUG nova.network.os_vif_util [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-2006849245-network", "vif_mac": "fa:16:3e:33:f6:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.874 221324 DEBUG nova.network.os_vif_util [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.874 221324 DEBUG os_vif [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.876 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.877 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1b92dea-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.885 221324 INFO os_vif [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc')#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.892 221324 DEBUG nova.virt.libvirt.driver [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] skipping disk for instance-0000005c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:16 np0005603622 nova_compute[221301]: 2026-01-31 08:14:16.892 221324 DEBUG nova.virt.libvirt.driver [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] skipping disk for instance-0000005c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:17 np0005603622 nova_compute[221301]: 2026-01-31 08:14:17.634 221324 DEBUG nova.compute.manager [req-b8c142cd-7f56-4e1a-a12c-40ca6b02efe7 req-91c9f679-b17a-4028-9735-3be4f57fc29a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-vif-unplugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:17 np0005603622 nova_compute[221301]: 2026-01-31 08:14:17.634 221324 DEBUG oslo_concurrency.lockutils [req-b8c142cd-7f56-4e1a-a12c-40ca6b02efe7 req-91c9f679-b17a-4028-9735-3be4f57fc29a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:17 np0005603622 nova_compute[221301]: 2026-01-31 08:14:17.635 221324 DEBUG oslo_concurrency.lockutils [req-b8c142cd-7f56-4e1a-a12c-40ca6b02efe7 req-91c9f679-b17a-4028-9735-3be4f57fc29a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:17 np0005603622 nova_compute[221301]: 2026-01-31 08:14:17.636 221324 DEBUG oslo_concurrency.lockutils [req-b8c142cd-7f56-4e1a-a12c-40ca6b02efe7 req-91c9f679-b17a-4028-9735-3be4f57fc29a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:17 np0005603622 nova_compute[221301]: 2026-01-31 08:14:17.636 221324 DEBUG nova.compute.manager [req-b8c142cd-7f56-4e1a-a12c-40ca6b02efe7 req-91c9f679-b17a-4028-9735-3be4f57fc29a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] No waiting events found dispatching network-vif-unplugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:17 np0005603622 nova_compute[221301]: 2026-01-31 08:14:17.637 221324 WARNING nova.compute.manager [req-b8c142cd-7f56-4e1a-a12c-40ca6b02efe7 req-91c9f679-b17a-4028-9735-3be4f57fc29a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received unexpected event network-vif-unplugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:14:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:14:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:17.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:14:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:18.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:14:18 np0005603622 nova_compute[221301]: 2026-01-31 08:14:18.201 221324 DEBUG neutronclient.v2_0.client [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:14:18 np0005603622 nova_compute[221301]: 2026-01-31 08:14:18.960 221324 DEBUG oslo_concurrency.lockutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:18 np0005603622 nova_compute[221301]: 2026-01-31 08:14:18.961 221324 DEBUG oslo_concurrency.lockutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:18 np0005603622 nova_compute[221301]: 2026-01-31 08:14:18.961 221324 DEBUG oslo_concurrency.lockutils [None req-660a93ce-32a9-4519-a769-89f3f0907077 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:19 np0005603622 nova_compute[221301]: 2026-01-31 08:14:19.823 221324 DEBUG nova.compute.manager [req-24beae67-d7a8-4637-b383-ce0eed0eac73 req-3be78623-ab1e-4f5c-af60-f815084b43bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:19 np0005603622 nova_compute[221301]: 2026-01-31 08:14:19.824 221324 DEBUG oslo_concurrency.lockutils [req-24beae67-d7a8-4637-b383-ce0eed0eac73 req-3be78623-ab1e-4f5c-af60-f815084b43bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:19 np0005603622 nova_compute[221301]: 2026-01-31 08:14:19.825 221324 DEBUG oslo_concurrency.lockutils [req-24beae67-d7a8-4637-b383-ce0eed0eac73 req-3be78623-ab1e-4f5c-af60-f815084b43bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:19 np0005603622 nova_compute[221301]: 2026-01-31 08:14:19.825 221324 DEBUG oslo_concurrency.lockutils [req-24beae67-d7a8-4637-b383-ce0eed0eac73 req-3be78623-ab1e-4f5c-af60-f815084b43bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:19 np0005603622 nova_compute[221301]: 2026-01-31 08:14:19.826 221324 DEBUG nova.compute.manager [req-24beae67-d7a8-4637-b383-ce0eed0eac73 req-3be78623-ab1e-4f5c-af60-f815084b43bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] No waiting events found dispatching network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:19 np0005603622 nova_compute[221301]: 2026-01-31 08:14:19.826 221324 WARNING nova.compute.manager [req-24beae67-d7a8-4637-b383-ce0eed0eac73 req-3be78623-ab1e-4f5c-af60-f815084b43bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received unexpected event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:14:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:19.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:20.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:20 np0005603622 nova_compute[221301]: 2026-01-31 08:14:20.203 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:21 np0005603622 nova_compute[221301]: 2026-01-31 08:14:21.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:21.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:22.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:22 np0005603622 nova_compute[221301]: 2026-01-31 08:14:22.595 221324 DEBUG nova.compute.manager [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-changed-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:22 np0005603622 nova_compute[221301]: 2026-01-31 08:14:22.596 221324 DEBUG nova.compute.manager [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Refreshing instance network info cache due to event network-changed-f1b92dea-fcbf-4fdc-a875-d4273610d4c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:14:22 np0005603622 nova_compute[221301]: 2026-01-31 08:14:22.596 221324 DEBUG oslo_concurrency.lockutils [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:22 np0005603622 nova_compute[221301]: 2026-01-31 08:14:22.597 221324 DEBUG oslo_concurrency.lockutils [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:22 np0005603622 nova_compute[221301]: 2026-01-31 08:14:22.597 221324 DEBUG nova.network.neutron [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Refreshing network info cache for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:14:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:23.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:24.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:25 np0005603622 nova_compute[221301]: 2026-01-31 08:14:25.205 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:25 np0005603622 nova_compute[221301]: 2026-01-31 08:14:25.837 221324 DEBUG nova.network.neutron [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updated VIF entry in instance network info cache for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:14:25 np0005603622 nova_compute[221301]: 2026-01-31 08:14:25.838 221324 DEBUG nova.network.neutron [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating instance_info_cache with network_info: [{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:25.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:25 np0005603622 nova_compute[221301]: 2026-01-31 08:14:25.980 221324 DEBUG oslo_concurrency.lockutils [req-30966721-4382-4f62-8300-367b0ad30715 req-deb04a2a-0fee-43bd-ac14-13651554520a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:26.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:26 np0005603622 nova_compute[221301]: 2026-01-31 08:14:26.882 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:27 np0005603622 nova_compute[221301]: 2026-01-31 08:14:27.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:27.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:28.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:29 np0005603622 nova_compute[221301]: 2026-01-31 08:14:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:29 np0005603622 nova_compute[221301]: 2026-01-31 08:14:29.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:14:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:29.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:30.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:30 np0005603622 nova_compute[221301]: 2026-01-31 08:14:30.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:30.867 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:30.869 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:14:30.870 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:31 np0005603622 podman[254712]: 2026-01-31 08:14:31.215883053 +0000 UTC m=+0.121336698 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127)
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.327 221324 DEBUG nova.compute.manager [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.328 221324 DEBUG oslo_concurrency.lockutils [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.328 221324 DEBUG oslo_concurrency.lockutils [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.328 221324 DEBUG oslo_concurrency.lockutils [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.329 221324 DEBUG nova.compute.manager [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] No waiting events found dispatching network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.329 221324 WARNING nova.compute.manager [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received unexpected event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.329 221324 DEBUG nova.compute.manager [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.329 221324 DEBUG oslo_concurrency.lockutils [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.330 221324 DEBUG oslo_concurrency.lockutils [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.330 221324 DEBUG oslo_concurrency.lockutils [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.330 221324 DEBUG nova.compute.manager [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] No waiting events found dispatching network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.331 221324 WARNING nova.compute.manager [req-584e6873-c3a5-4c1c-9abd-dc23819e64e7 req-00abe8c0-7a82-4de6-aa3c-ab86629a23a7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Received unexpected event network-vif-plugged-f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.432 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847256.4305525, 8a8d8223-9051-487a-a4d6-a33911813797 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.433 221324 INFO nova.compute.manager [-] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.720 221324 DEBUG nova.compute.manager [None req-2716b4cf-b7c0-410c-99d9-7e4c117c8f5d - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.724 221324 DEBUG nova.compute.manager [None req-2716b4cf-b7c0-410c-99d9-7e4c117c8f5d - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.884 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:31 np0005603622 nova_compute[221301]: 2026-01-31 08:14:31.944 221324 INFO nova.compute.manager [None req-2716b4cf-b7c0-410c-99d9-7e4c117c8f5d - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 03:14:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:31.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:32.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e260 e260: 3 total, 3 up, 3 in
Jan 31 03:14:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:33.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:33 np0005603622 nova_compute[221301]: 2026-01-31 08:14:33.992 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "8a8d8223-9051-487a-a4d6-a33911813797" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:33 np0005603622 nova_compute[221301]: 2026-01-31 08:14:33.993 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:33 np0005603622 nova_compute[221301]: 2026-01-31 08:14:33.994 221324 DEBUG nova.compute.manager [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Going to confirm migration 13 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 03:14:34 np0005603622 podman[254738]: 2026-01-31 08:14:34.161225314 +0000 UTC m=+0.077694029 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:14:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:34.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:34 np0005603622 nova_compute[221301]: 2026-01-31 08:14:34.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:34 np0005603622 nova_compute[221301]: 2026-01-31 08:14:34.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:34 np0005603622 nova_compute[221301]: 2026-01-31 08:14:34.565 221324 DEBUG neutronclient.v2_0.client [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port f1b92dea-fcbf-4fdc-a875-d4273610d4c5 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:14:34 np0005603622 nova_compute[221301]: 2026-01-31 08:14:34.566 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:34 np0005603622 nova_compute[221301]: 2026-01-31 08:14:34.567 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquired lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:34 np0005603622 nova_compute[221301]: 2026-01-31 08:14:34.567 221324 DEBUG nova.network.neutron [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:14:34 np0005603622 nova_compute[221301]: 2026-01-31 08:14:34.568 221324 DEBUG nova.objects.instance [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'info_cache' on Instance uuid 8a8d8223-9051-487a-a4d6-a33911813797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:14:35 np0005603622 nova_compute[221301]: 2026-01-31 08:14:35.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:35.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:36.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.550 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.551 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.551 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.551 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.552 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.886 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:14:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4107901374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:14:36 np0005603622 nova_compute[221301]: 2026-01-31 08:14:36.998 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.188 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000005c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.188 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000005c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.192 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.193 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:14:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.374 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.375 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4419MB free_disk=20.851505279541016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.376 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.376 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.515 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration for instance 8a8d8223-9051-487a-a4d6-a33911813797 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.553 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating resource usage from migration a5724d08-b5e3-438b-b19e-d4b60aac6c87#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.553 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Starting to track outgoing migration a5724d08-b5e3-438b-b19e-d4b60aac6c87 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.672 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 183fba4a-610d-4b96-a12f-996f9f2dc30b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.673 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration a5724d08-b5e3-438b-b19e-d4b60aac6c87 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.673 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.674 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:14:37 np0005603622 nova_compute[221301]: 2026-01-31 08:14:37.742 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:37.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:14:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1715194954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.166 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.173 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:14:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:38.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.309 221324 DEBUG nova.network.neutron [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 8a8d8223-9051-487a-a4d6-a33911813797] Updating instance_info_cache with network_info: [{"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.314 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.393 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Releasing lock "refresh_cache-8a8d8223-9051-487a-a4d6-a33911813797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.395 221324 DEBUG nova.objects.instance [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'migration_context' on Instance uuid 8a8d8223-9051-487a-a4d6-a33911813797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.429 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.430 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.523 221324 DEBUG nova.storage.rbd_utils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] rbd image 8a8d8223-9051-487a-a4d6-a33911813797_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.535 221324 DEBUG nova.virt.libvirt.vif [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:13:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-865768110',display_name='tempest-ServerActionsTestOtherA-server-865768110',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-865768110',id=92,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBmeoiqwLriErudm3CCwwTVCZJNSn8sMBdf3DG0cLKOiUOsjd6g3ELaDiv5VtlA1MtIeSB0EtvnrgQQVESwaz68a/c+EzXdmxnZNxj//jq+4bu6dBh/9tuewDagOu34T9w==',key_name='tempest-keypair-710425515',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:14:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-y7gogkds',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:14:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=8a8d8223-9051-487a-a4d6-a33911813797,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.536 221324 DEBUG nova.network.os_vif_util [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "address": "fa:16:3e:33:f6:b6", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1b92dea-fc", "ovs_interfaceid": "f1b92dea-fcbf-4fdc-a875-d4273610d4c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.537 221324 DEBUG nova.network.os_vif_util [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.537 221324 DEBUG os_vif [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.540 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1b92dea-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.541 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.545 221324 INFO os_vif [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:f6:b6,bridge_name='br-int',has_traffic_filtering=True,id=f1b92dea-fcbf-4fdc-a875-d4273610d4c5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1b92dea-fc')#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.546 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.546 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:14:38 np0005603622 nova_compute[221301]: 2026-01-31 08:14:38.731 221324 DEBUG oslo_concurrency.processutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:14:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:14:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2789774918' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.185 221324 DEBUG oslo_concurrency.processutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.192 221324 DEBUG nova.compute.provider_tree [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.223 221324 DEBUG nova.scheduler.client.report [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.429 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.431 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.431 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.440 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.628 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.629 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.634 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.635 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:14:39 np0005603622 nova_compute[221301]: 2026-01-31 08:14:39.676 221324 INFO nova.scheduler.client.report [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Deleted allocation for migration a5724d08-b5e3-438b-b19e-d4b60aac6c87#033[00m
Jan 31 03:14:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:39.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:40 np0005603622 nova_compute[221301]: 2026-01-31 08:14:40.071 221324 DEBUG oslo_concurrency.lockutils [None req-59a20bc5-b0f1-4d10-9d8f-a346a5765e91 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "8a8d8223-9051-487a-a4d6-a33911813797" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:14:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:40.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:40 np0005603622 nova_compute[221301]: 2026-01-31 08:14:40.275 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:41 np0005603622 nova_compute[221301]: 2026-01-31 08:14:41.895 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:41.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:14:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:42.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:14:43 np0005603622 nova_compute[221301]: 2026-01-31 08:14:43.886 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [{"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:14:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:43.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:44 np0005603622 nova_compute[221301]: 2026-01-31 08:14:44.100 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-183fba4a-610d-4b96-a12f-996f9f2dc30b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:14:44 np0005603622 nova_compute[221301]: 2026-01-31 08:14:44.101 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:14:44 np0005603622 nova_compute[221301]: 2026-01-31 08:14:44.102 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:44.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e261 e261: 3 total, 3 up, 3 in
Jan 31 03:14:45 np0005603622 nova_compute[221301]: 2026-01-31 08:14:45.063 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:14:45 np0005603622 nova_compute[221301]: 2026-01-31 08:14:45.277 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:45.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:46.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:46 np0005603622 nova_compute[221301]: 2026-01-31 08:14:46.900 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.094153) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287094193, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 853, "num_deletes": 251, "total_data_size": 1558234, "memory_usage": 1575544, "flush_reason": "Manual Compaction"}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287108819, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 1016668, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43906, "largest_seqno": 44754, "table_properties": {"data_size": 1012588, "index_size": 1796, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9627, "raw_average_key_size": 20, "raw_value_size": 1004193, "raw_average_value_size": 2100, "num_data_blocks": 77, "num_entries": 478, "num_filter_entries": 478, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847238, "oldest_key_time": 1769847238, "file_creation_time": 1769847287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 14741 microseconds, and 4156 cpu microseconds.
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.108889) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 1016668 bytes OK
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.108918) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.110580) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.110605) EVENT_LOG_v1 {"time_micros": 1769847287110598, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.110631) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 1553812, prev total WAL file size 1553812, number of live WAL files 2.
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.111591) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(992KB)], [84(10MB)]
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287111648, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 11592382, "oldest_snapshot_seqno": -1}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6749 keys, 9709306 bytes, temperature: kUnknown
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287165449, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 9709306, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9664976, "index_size": 26362, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 175025, "raw_average_key_size": 25, "raw_value_size": 9545083, "raw_average_value_size": 1414, "num_data_blocks": 1039, "num_entries": 6749, "num_filter_entries": 6749, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.165744) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 9709306 bytes
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.167138) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 215.2 rd, 180.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 10.1 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(21.0) write-amplify(9.6) OK, records in: 7272, records dropped: 523 output_compression: NoCompression
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.167169) EVENT_LOG_v1 {"time_micros": 1769847287167155, "job": 52, "event": "compaction_finished", "compaction_time_micros": 53877, "compaction_time_cpu_micros": 35142, "output_level": 6, "num_output_files": 1, "total_output_size": 9709306, "num_input_records": 7272, "num_output_records": 6749, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287167452, "job": 52, "event": "table_file_deletion", "file_number": 86}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847287168986, "job": 52, "event": "table_file_deletion", "file_number": 84}
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.111436) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.169116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.169124) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.169127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.169130) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:14:47.169133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:14:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:47.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:48.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:49.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:50.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:50 np0005603622 nova_compute[221301]: 2026-01-31 08:14:50.279 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:51 np0005603622 nova_compute[221301]: 2026-01-31 08:14:51.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:51.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:52.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 e262: 3 total, 3 up, 3 in
Jan 31 03:14:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:53.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:54.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:55 np0005603622 nova_compute[221301]: 2026-01-31 08:14:55.320 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:56.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:56 np0005603622 nova_compute[221301]: 2026-01-31 08:14:56.904 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:14:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:14:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:57.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:14:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:14:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:14:58.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:14:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:14:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:14:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:14:59.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:00.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:00 np0005603622 nova_compute[221301]: 2026-01-31 08:15:00.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:01 np0005603622 nova_compute[221301]: 2026-01-31 08:15:01.906 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:01.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:02 np0005603622 podman[254847]: 2026-01-31 08:15:02.180963858 +0000 UTC m=+0.092066127 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:15:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:02.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:03.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:04.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:05 np0005603622 podman[254873]: 2026-01-31 08:15:05.146273119 +0000 UTC m=+0.070860095 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 03:15:05 np0005603622 nova_compute[221301]: 2026-01-31 08:15:05.326 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:05 np0005603622 nova_compute[221301]: 2026-01-31 08:15:05.915 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:05 np0005603622 nova_compute[221301]: 2026-01-31 08:15:05.915 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:05.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.079 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.180 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.180 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:06.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.315 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.422 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.422 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.438 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.438 221324 INFO nova.compute.claims [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:15:06 np0005603622 nova_compute[221301]: 2026-01-31 08:15:06.909 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:08.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:08 np0005603622 nova_compute[221301]: 2026-01-31 08:15:08.014 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:08.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:09 np0005603622 nova_compute[221301]: 2026-01-31 08:15:09.040 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/826082927' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:09 np0005603622 nova_compute[221301]: 2026-01-31 08:15:09.479 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:09 np0005603622 nova_compute[221301]: 2026-01-31 08:15:09.489 221324 DEBUG nova.compute.provider_tree [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:15:09 np0005603622 nova_compute[221301]: 2026-01-31 08:15:09.893 221324 DEBUG nova.scheduler.client.report [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:15:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:10.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:10.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:10 np0005603622 nova_compute[221301]: 2026-01-31 08:15:10.369 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:10 np0005603622 nova_compute[221301]: 2026-01-31 08:15:10.681 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 4.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:10 np0005603622 nova_compute[221301]: 2026-01-31 08:15:10.683 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:15:10 np0005603622 nova_compute[221301]: 2026-01-31 08:15:10.688 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 2.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:10 np0005603622 nova_compute[221301]: 2026-01-31 08:15:10.698 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:15:10 np0005603622 nova_compute[221301]: 2026-01-31 08:15:10.698 221324 INFO nova.compute.claims [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:15:11 np0005603622 nova_compute[221301]: 2026-01-31 08:15:11.239 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:15:11 np0005603622 nova_compute[221301]: 2026-01-31 08:15:11.240 221324 DEBUG nova.network.neutron [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:15:11 np0005603622 nova_compute[221301]: 2026-01-31 08:15:11.522 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:11 np0005603622 nova_compute[221301]: 2026-01-31 08:15:11.910 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:11 np0005603622 nova_compute[221301]: 2026-01-31 08:15:11.935 221324 INFO nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:15:11 np0005603622 nova_compute[221301]: 2026-01-31 08:15:11.947 221324 DEBUG nova.policy [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '111fdaf79c084a91902fe37a7a502020', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58e900992be7400fb940ca20f13e12d1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:15:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2085050528' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:12 np0005603622 nova_compute[221301]: 2026-01-31 08:15:12.008 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:12.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:12 np0005603622 nova_compute[221301]: 2026-01-31 08:15:12.018 221324 DEBUG nova.compute.provider_tree [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:15:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:12.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:12 np0005603622 nova_compute[221301]: 2026-01-31 08:15:12.346 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:15:12 np0005603622 nova_compute[221301]: 2026-01-31 08:15:12.701 221324 DEBUG nova.scheduler.client.report [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:15:12 np0005603622 nova_compute[221301]: 2026-01-31 08:15:12.976 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:12 np0005603622 nova_compute[221301]: 2026-01-31 08:15:12.978 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.265 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.265 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.276 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.277 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.278 221324 INFO nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Creating image(s)#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.316 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.353 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.392 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.397 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.437 221324 INFO nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.491 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.492 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.493 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.494 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.526 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.531 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 21c87211-765f-42a8-b8fd-77a11942f707_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.869 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 21c87211-765f-42a8-b8fd-77a11942f707_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.338s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:13 np0005603622 nova_compute[221301]: 2026-01-31 08:15:13.996 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] resizing rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:15:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:14.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.150 221324 DEBUG nova.objects.instance [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 21c87211-765f-42a8-b8fd-77a11942f707 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.251 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:15:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:14.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.339 221324 DEBUG nova.network.neutron [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Successfully created port: 4bb2940b-9690-4921-a86c-daf5eefed7dd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.483 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.484 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Ensure instance console log exists: /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.484 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.485 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.485 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.530 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:14.531 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:15:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:14.534 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.576 221324 INFO nova.virt.block_device [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Booting with volume 20fd9e7c-9700-482e-a6ae-9d5671e6d553 at /dev/vda#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.731 221324 DEBUG os_brick.utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.734 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.749 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.750 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[64444972-a3bb-41fe-bcfe-f282a4aeeda3]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.752 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.761 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.762 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[6c75cc7b-6ae8-4c7f-81c1-936a6644869e]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.764 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.773 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.774 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[8035d6cd-e599-4425-aa53-4c830065481d]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.776 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[c318995d-e8e2-44ef-b229-45c2fc2811c2]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.776 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.805 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.808 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.808 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.808 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.809 221324 DEBUG os_brick.utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] <== get_connector_properties: return (76ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:15:14 np0005603622 nova_compute[221301]: 2026-01-31 08:15:14.809 221324 DEBUG nova.virt.block_device [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating existing volume attachment record: e6217452-8282-4e11-a078-f705b6777630 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:15:15 np0005603622 nova_compute[221301]: 2026-01-31 08:15:15.051 221324 DEBUG nova.policy [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '469a76c175b64f84b29a4a44c8eaf3c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '587e3df6fc904afc99c9c6411095e3ef', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:15:15 np0005603622 nova_compute[221301]: 2026-01-31 08:15:15.372 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:16.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:16.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:16 np0005603622 nova_compute[221301]: 2026-01-31 08:15:16.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:16 np0005603622 nova_compute[221301]: 2026-01-31 08:15:16.913 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:15:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:15:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:18.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:18.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:18 np0005603622 nova_compute[221301]: 2026-01-31 08:15:18.737 221324 INFO nova.virt.block_device [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Booting with volume e24cf252-b35c-4390-a954-7f28d6e0eaed at /dev/vdb#033[00m
Jan 31 03:15:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:15:18 np0005603622 nova_compute[221301]: 2026-01-31 08:15:18.995 221324 DEBUG os_brick.utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:15:18 np0005603622 nova_compute[221301]: 2026-01-31 08:15:18.996 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.008 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.009 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[3f78227a-061d-43c0-9131-c273da635365]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.010 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.018 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.018 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[b56576bf-656f-4385-a5e7-6c2d9dfb9d83]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.020 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.028 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.029 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[06766ff8-06ed-4009-9efa-8ab733c04ce7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.030 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[5073c374-5fa1-4737-a71c-b96788ed4ce0]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.030 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.052 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully created port: ad571df1-872d-4024-8955-346236aa8e90 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.055 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "nvme version" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.057 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.058 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.058 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.058 221324 DEBUG os_brick.utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] <== get_connector_properties: return (63ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.059 221324 DEBUG nova.virt.block_device [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating existing volume attachment record: f60e9547-c27e-4ed8-83d9-7d8315e3b075 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.535 221324 DEBUG nova.network.neutron [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Successfully updated port: 4bb2940b-9690-4921-a86c-daf5eefed7dd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:19.537 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.652 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "refresh_cache-21c87211-765f-42a8-b8fd-77a11942f707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.653 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquired lock "refresh_cache-21c87211-765f-42a8-b8fd-77a11942f707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.653 221324 DEBUG nova.network.neutron [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.744 221324 DEBUG nova.compute.manager [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received event network-changed-4bb2940b-9690-4921-a86c-daf5eefed7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.744 221324 DEBUG nova.compute.manager [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Refreshing instance network info cache due to event network-changed-4bb2940b-9690-4921-a86c-daf5eefed7dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:19 np0005603622 nova_compute[221301]: 2026-01-31 08:15:19.744 221324 DEBUG oslo_concurrency.lockutils [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-21c87211-765f-42a8-b8fd-77a11942f707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.034 221324 DEBUG nova.network.neutron [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.291 221324 INFO nova.virt.block_device [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Booting with volume 785fba22-460b-4d01-b053-ffec518020b7 at /dev/vdc#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.306 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully created port: 0df677ac-7f81-4a86-bf52-3028b2dda46e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:15:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:20.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.403 221324 DEBUG os_brick.utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.407 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.409 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.419 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.419 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[50a9c97f-da91-4a89-8e69-f8dea94a3470]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.420 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.429 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.429 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf499db-4043-42f9-beb1-1533ccfa88d0]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.432 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.440 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.440 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[90f1ce6d-4ba3-4f83-92b8-406377ff1f36]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.441 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[75693ba6-192e-4c53-8001-6f9907aaba69]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.442 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.468 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.471 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.471 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.472 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.472 221324 DEBUG os_brick.utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] <== get_connector_properties: return (68ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:15:20 np0005603622 nova_compute[221301]: 2026-01-31 08:15:20.473 221324 DEBUG nova.virt.block_device [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating existing volume attachment record: 639386f7-bb08-40e2-b25b-04efc71b9a52 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.031 221324 DEBUG nova.network.neutron [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Updating instance_info_cache with network_info: [{"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.103 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Releasing lock "refresh_cache-21c87211-765f-42a8-b8fd-77a11942f707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.103 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Instance network_info: |[{"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.104 221324 DEBUG oslo_concurrency.lockutils [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-21c87211-765f-42a8-b8fd-77a11942f707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.104 221324 DEBUG nova.network.neutron [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Refreshing network info cache for port 4bb2940b-9690-4921-a86c-daf5eefed7dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.107 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Start _get_guest_xml network_info=[{"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.113 221324 WARNING nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.120 221324 DEBUG nova.virt.libvirt.host [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.121 221324 DEBUG nova.virt.libvirt.host [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.124 221324 DEBUG nova.virt.libvirt.host [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.125 221324 DEBUG nova.virt.libvirt.host [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.126 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.127 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.127 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.127 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.128 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.128 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.128 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.128 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.128 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.129 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.129 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.129 221324 DEBUG nova.virt.hardware [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.133 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:15:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3430486554' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.468 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully created port: 2333e158-ceb0-4a1f-898b-191f9da4e5c7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:15:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:15:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3067909551' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.599 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.626 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.633 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:21 np0005603622 nova_compute[221301]: 2026-01-31 08:15:21.916 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:22.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:15:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3265761866' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.095 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.097 221324 DEBUG nova.virt.libvirt.vif [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:14:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1685354513',display_name='tempest-ServerDiskConfigTestJSON-server-1685354513',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1685354513',id=95,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-frj7dm28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDi
skConfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:12Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=21c87211-765f-42a8-b8fd-77a11942f707,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.098 221324 DEBUG nova.network.os_vif_util [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.099 221324 DEBUG nova.network.os_vif_util [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:7e:ca,bridge_name='br-int',has_traffic_filtering=True,id=4bb2940b-9690-4921-a86c-daf5eefed7dd,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bb2940b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.101 221324 DEBUG nova.objects.instance [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 21c87211-765f-42a8-b8fd-77a11942f707 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.214 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <uuid>21c87211-765f-42a8-b8fd-77a11942f707</uuid>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <name>instance-0000005f</name>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1685354513</nova:name>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:15:21</nova:creationTime>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:user uuid="111fdaf79c084a91902fe37a7a502020">tempest-ServerDiskConfigTestJSON-855158150-project-member</nova:user>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:project uuid="58e900992be7400fb940ca20f13e12d1">tempest-ServerDiskConfigTestJSON-855158150</nova:project>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <nova:port uuid="4bb2940b-9690-4921-a86c-daf5eefed7dd">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <entry name="serial">21c87211-765f-42a8-b8fd-77a11942f707</entry>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <entry name="uuid">21c87211-765f-42a8-b8fd-77a11942f707</entry>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/21c87211-765f-42a8-b8fd-77a11942f707_disk">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/21c87211-765f-42a8-b8fd-77a11942f707_disk.config">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:10:7e:ca"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <target dev="tap4bb2940b-96"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/console.log" append="off"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:15:22 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:15:22 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:15:22 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:15:22 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.216 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Preparing to wait for external event network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.216 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.217 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.217 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.219 221324 DEBUG nova.virt.libvirt.vif [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:14:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1685354513',display_name='tempest-ServerDiskConfigTestJSON-server-1685354513',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1685354513',id=95,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-frj7dm28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:12Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=21c87211-765f-42a8-b8fd-77a11942f707,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.219 221324 DEBUG nova.network.os_vif_util [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.221 221324 DEBUG nova.network.os_vif_util [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:7e:ca,bridge_name='br-int',has_traffic_filtering=True,id=4bb2940b-9690-4921-a86c-daf5eefed7dd,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bb2940b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.221 221324 DEBUG os_vif [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:7e:ca,bridge_name='br-int',has_traffic_filtering=True,id=4bb2940b-9690-4921-a86c-daf5eefed7dd,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bb2940b-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.223 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.224 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.230 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.231 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bb2940b-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.232 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4bb2940b-96, col_values=(('external_ids', {'iface-id': '4bb2940b-9690-4921-a86c-daf5eefed7dd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:7e:ca', 'vm-uuid': '21c87211-765f-42a8-b8fd-77a11942f707'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.234 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 NetworkManager[49080]: <info>  [1769847322.2383] manager: (tap4bb2940b-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.238 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.243 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.245 221324 INFO os_vif [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:7e:ca,bridge_name='br-int',has_traffic_filtering=True,id=4bb2940b-9690-4921-a86c-daf5eefed7dd,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bb2940b-96')#033[00m
Jan 31 03:15:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:22.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.455 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.458 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.459 221324 INFO nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Creating image(s)#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.459 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.460 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Ensure instance console log exists: /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.460 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.461 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.461 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.583 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.584 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.584 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] No VIF found with MAC fa:16:3e:10:7e:ca, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.585 221324 INFO nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Using config drive#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.622 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.631 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.631 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.631 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.632 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.632 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.634 221324 INFO nova.compute.manager [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Terminating instance#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.636 221324 DEBUG nova.compute.manager [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:15:22 np0005603622 kernel: tapbde2310b-21 (unregistering): left promiscuous mode
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.744 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 NetworkManager[49080]: <info>  [1769847322.7459] device (tapbde2310b-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:15:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:22Z|00357|binding|INFO|Releasing lport bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 from this chassis (sb_readonly=0)
Jan 31 03:15:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:22Z|00358|binding|INFO|Setting lport bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 down in Southbound
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.747 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:22Z|00359|binding|INFO|Removing iface tapbde2310b-21 ovn-installed in OVS
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.749 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.754 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:22.768 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:ec:3b 10.100.0.6'], port_security=['fa:16:3e:5f:ec:3b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '183fba4a-610d-4b96-a12f-996f9f2dc30b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f564452-5f08-4a1c-921e-f2daee9ec936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c03fec1b3664105996aa979e226d8f8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4d0e926d-7b9d-4115-9719-7f0d71edaace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d620dc35-e1b1-4011-a8c1-0995d2048b09, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:15:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:22.770 139793 INFO neutron.agent.ovn.metadata.agent [-] Port bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 in datapath 1f564452-5f08-4a1c-921e-f2daee9ec936 unbound from our chassis#033[00m
Jan 31 03:15:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:22.772 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f564452-5f08-4a1c-921e-f2daee9ec936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:15:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:22.774 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ab8058-b8ab-4446-83d9-a1404ab8728a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:22.776 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 namespace which is not needed anymore#033[00m
Jan 31 03:15:22 np0005603622 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000051.scope: Deactivated successfully.
Jan 31 03:15:22 np0005603622 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000051.scope: Consumed 27.726s CPU time.
Jan 31 03:15:22 np0005603622 systemd-machined[190406]: Machine qemu-33-instance-00000051 terminated.
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.877 221324 INFO nova.virt.libvirt.driver [-] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Instance destroyed successfully.#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.877 221324 DEBUG nova.objects.instance [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lazy-loading 'resources' on Instance uuid 183fba4a-610d-4b96-a12f-996f9f2dc30b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.914 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully created port: c1738855-9732-40f2-88e3-e42f04270b02 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:15:22 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [NOTICE]   (250191) : haproxy version is 2.8.14-c23fe91
Jan 31 03:15:22 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [NOTICE]   (250191) : path to executable is /usr/sbin/haproxy
Jan 31 03:15:22 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [WARNING]  (250191) : Exiting Master process...
Jan 31 03:15:22 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [WARNING]  (250191) : Exiting Master process...
Jan 31 03:15:22 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [ALERT]    (250191) : Current worker (250193) exited with code 143 (Terminated)
Jan 31 03:15:22 np0005603622 neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936[250187]: [WARNING]  (250191) : All workers exited. Exiting... (0)
Jan 31 03:15:22 np0005603622 systemd[1]: libpod-b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d.scope: Deactivated successfully.
Jan 31 03:15:22 np0005603622 podman[255371]: 2026-01-31 08:15:22.942097706 +0000 UTC m=+0.050858845 container died b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.942 221324 DEBUG nova.virt.libvirt.vif [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1838803378',display_name='tempest-ServerActionsTestOtherA-server-1838803378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1838803378',id=81,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBmeoiqwLriErudm3CCwwTVCZJNSn8sMBdf3DG0cLKOiUOsjd6g3ELaDiv5VtlA1MtIeSB0EtvnrgQQVESwaz68a/c+EzXdmxnZNxj//jq+4bu6dBh/9tuewDagOu34T9w==',key_name='tempest-keypair-710425515',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:10:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c03fec1b3664105996aa979e226d8f8',ramdisk_id='',reservation_id='r-12td3w0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1768827668',owner_user_name='tempest-ServerActionsTestOtherA-1768827668-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:10:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='12a823bd7c6e4cf492ebf6c1d002a91f',uuid=183fba4a-610d-4b96-a12f-996f9f2dc30b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.943 221324 DEBUG nova.network.os_vif_util [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converting VIF {"id": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "address": "fa:16:3e:5f:ec:3b", "network": {"id": "1f564452-5f08-4a1c-921e-f2daee9ec936", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2006849245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c03fec1b3664105996aa979e226d8f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde2310b-21", "ovs_interfaceid": "bde2310b-2169-48e8-8a6f-c3f25dcd8fd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.944 221324 DEBUG nova.network.os_vif_util [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:ec:3b,bridge_name='br-int',has_traffic_filtering=True,id=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbde2310b-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.944 221324 DEBUG os_vif [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:ec:3b,bridge_name='br-int',has_traffic_filtering=True,id=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbde2310b-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.946 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.946 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbde2310b-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.947 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.951 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.953 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:22 np0005603622 nova_compute[221301]: 2026-01-31 08:15:22.956 221324 INFO os_vif [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:ec:3b,bridge_name='br-int',has_traffic_filtering=True,id=bde2310b-2169-48e8-8a6f-c3f25dcd8fd5,network=Network(1f564452-5f08-4a1c-921e-f2daee9ec936),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbde2310b-21')#033[00m
Jan 31 03:15:22 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d-userdata-shm.mount: Deactivated successfully.
Jan 31 03:15:22 np0005603622 systemd[1]: var-lib-containers-storage-overlay-ba85ee115b71a5449b3f6210383c98442e140ebe6c4b2778465058f33109bc2a-merged.mount: Deactivated successfully.
Jan 31 03:15:22 np0005603622 podman[255371]: 2026-01-31 08:15:22.987543373 +0000 UTC m=+0.096304542 container cleanup b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:15:23 np0005603622 systemd[1]: libpod-conmon-b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d.scope: Deactivated successfully.
Jan 31 03:15:23 np0005603622 podman[255416]: 2026-01-31 08:15:23.060307288 +0000 UTC m=+0.051064990 container remove b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.066 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dd85d228-e68e-4460-8187-04c2c936484f]: (4, ('Sat Jan 31 08:15:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d)\nb650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d\nSat Jan 31 08:15:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 (b650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d)\nb650ba7b9f5bc27cd509807e5ca0ce5036a9f273d1ea8330f9d68c8922fc817d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.068 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b35557d6-4ebb-4a49-96bc-935410ead93b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.069 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f564452-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:23 np0005603622 kernel: tap1f564452-50: left promiscuous mode
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.085 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.091 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ae55bb-1c16-4eeb-9612-a3ca6aa0aae0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.110 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[95945138-41af-42bd-af92-3290250fb58e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.112 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[97854252-5797-482f-9c78-32c51048689d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.126 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[937fa83b-506b-4907-8184-22a6a38cfc58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625343, 'reachable_time': 15550, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255440, 'error': None, 'target': 'ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:23 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1f564452\x2d5f08\x2d4a1c\x2d921e\x2df2daee9ec936.mount: Deactivated successfully.
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.132 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f564452-5f08-4a1c-921e-f2daee9ec936 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:15:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:23.133 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0aba8a-c395-44b5-b360-cce17285926e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.292 221324 INFO nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Creating config drive at /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/disk.config#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.301 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4wdznvkr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.435 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4wdznvkr" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.485 221324 DEBUG nova.storage.rbd_utils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] rbd image 21c87211-765f-42a8-b8fd-77a11942f707_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.490 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/disk.config 21c87211-765f-42a8-b8fd-77a11942f707_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.523 221324 DEBUG nova.network.neutron [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Updated VIF entry in instance network info cache for port 4bb2940b-9690-4921-a86c-daf5eefed7dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.524 221324 DEBUG nova.network.neutron [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Updating instance_info_cache with network_info: [{"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.536 221324 INFO nova.virt.libvirt.driver [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Deleting instance files /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b_del#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.538 221324 INFO nova.virt.libvirt.driver [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Deletion of /var/lib/nova/instances/183fba4a-610d-4b96-a12f-996f9f2dc30b_del complete#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.600 221324 DEBUG oslo_concurrency.lockutils [req-487087d6-025b-45dc-8791-ff0c29977dfc req-af6a4486-4be3-46f0-8d90-ce91cdfcb752 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-21c87211-765f-42a8-b8fd-77a11942f707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.643 221324 DEBUG nova.compute.manager [req-2eca5ba4-9562-42c5-8a84-cf7d2de5e23e req-cc5d1c61-a3b7-47bc-a56b-731601f85025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-vif-unplugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.644 221324 DEBUG oslo_concurrency.lockutils [req-2eca5ba4-9562-42c5-8a84-cf7d2de5e23e req-cc5d1c61-a3b7-47bc-a56b-731601f85025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.645 221324 DEBUG oslo_concurrency.lockutils [req-2eca5ba4-9562-42c5-8a84-cf7d2de5e23e req-cc5d1c61-a3b7-47bc-a56b-731601f85025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.645 221324 DEBUG oslo_concurrency.lockutils [req-2eca5ba4-9562-42c5-8a84-cf7d2de5e23e req-cc5d1c61-a3b7-47bc-a56b-731601f85025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.645 221324 DEBUG nova.compute.manager [req-2eca5ba4-9562-42c5-8a84-cf7d2de5e23e req-cc5d1c61-a3b7-47bc-a56b-731601f85025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] No waiting events found dispatching network-vif-unplugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.646 221324 DEBUG nova.compute.manager [req-2eca5ba4-9562-42c5-8a84-cf7d2de5e23e req-cc5d1c61-a3b7-47bc-a56b-731601f85025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-vif-unplugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.686 221324 DEBUG oslo_concurrency.processutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/disk.config 21c87211-765f-42a8-b8fd-77a11942f707_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.196s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.687 221324 INFO nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Deleting local config drive /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707/disk.config because it was imported into RBD.#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.741 221324 INFO nova.compute.manager [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Took 1.11 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.742 221324 DEBUG oslo.service.loopingcall [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.743 221324 DEBUG nova.compute.manager [-] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.744 221324 DEBUG nova.network.neutron [-] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:15:23 np0005603622 kernel: tap4bb2940b-96: entered promiscuous mode
Jan 31 03:15:23 np0005603622 NetworkManager[49080]: <info>  [1769847323.7630] manager: (tap4bb2940b-96): new Tun device (/org/freedesktop/NetworkManager/Devices/168)
Jan 31 03:15:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:23Z|00360|binding|INFO|Claiming lport 4bb2940b-9690-4921-a86c-daf5eefed7dd for this chassis.
Jan 31 03:15:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:23Z|00361|binding|INFO|4bb2940b-9690-4921-a86c-daf5eefed7dd: Claiming fa:16:3e:10:7e:ca 10.100.0.6
Jan 31 03:15:23 np0005603622 systemd-udevd[255340]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.762 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:23Z|00362|binding|INFO|Setting lport 4bb2940b-9690-4921-a86c-daf5eefed7dd ovn-installed in OVS
Jan 31 03:15:23 np0005603622 nova_compute[221301]: 2026-01-31 08:15:23.776 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:23 np0005603622 NetworkManager[49080]: <info>  [1769847323.7782] device (tap4bb2940b-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:15:23 np0005603622 NetworkManager[49080]: <info>  [1769847323.7790] device (tap4bb2940b-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:15:23 np0005603622 systemd-machined[190406]: New machine qemu-40-instance-0000005f.
Jan 31 03:15:23 np0005603622 systemd[1]: Started Virtual Machine qemu-40-instance-0000005f.
Jan 31 03:15:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:24Z|00363|binding|INFO|Setting lport 4bb2940b-9690-4921-a86c-daf5eefed7dd up in Southbound
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.015 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:7e:ca 10.100.0.6'], port_security=['fa:16:3e:10:7e:ca 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '21c87211-765f-42a8-b8fd-77a11942f707', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=4bb2940b-9690-4921-a86c-daf5eefed7dd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.018 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 4bb2940b-9690-4921-a86c-daf5eefed7dd in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 bound to our chassis#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.025 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f218695f-c744-4bd8-b2d8-122a920c7ca0#033[00m
Jan 31 03:15:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:24.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.043 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab1bc33-f19b-4daa-97eb-e09120945e9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.045 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf218695f-c1 in ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.048 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf218695f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.049 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b9d840-ee93-4743-9120-31eda741f767]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.050 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a25ac5fa-a973-41c7-89cb-54cf3ce1c073]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.068 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[1241df6c-bd27-4fb8-8df6-1407e22af6b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.086 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2315fccc-32ce-4ef5-b6d5-5261f659f2db]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.124 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6afec2b8-5990-4818-acaa-9ae5296b2d9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.135 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8772d1-9dd7-4c0f-8f42-a6e5ea0f8a28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 NetworkManager[49080]: <info>  [1769847324.1363] manager: (tapf218695f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/169)
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.178 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe07acd-7a14-4c8f-84ff-11c64a351cdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.181 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[657f0429-91a4-4324-b9a2-72abe334e853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 NetworkManager[49080]: <info>  [1769847324.1972] device (tapf218695f-c0): carrier: link connected
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.199 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[80fc9e67-13c7-477b-9bf1-61afaaebe85a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.215 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[033bbd6c-7202-4a43-b9a4-191cce8e8c1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 657553, 'reachable_time': 23036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255575, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.228 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8b839735-c785-4453-a0f6-4f560400a45e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:830'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 657553, 'tstamp': 657553}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255576, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.244 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[59947560-923b-4230-9cfd-89cf15103ba0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf218695f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 657553, 'reachable_time': 23036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255577, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.275 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea7e43f-2e5b-4ece-b6d7-7d0a84f00d74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:15:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:15:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:24.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.343 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[996c4819-695f-482e-8310-0fea79c8bfe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.345 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.346 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.347 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf218695f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:24 np0005603622 nova_compute[221301]: 2026-01-31 08:15:24.350 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:24 np0005603622 kernel: tapf218695f-c0: entered promiscuous mode
Jan 31 03:15:24 np0005603622 NetworkManager[49080]: <info>  [1769847324.3530] manager: (tapf218695f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Jan 31 03:15:24 np0005603622 nova_compute[221301]: 2026-01-31 08:15:24.353 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.357 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf218695f-c0, col_values=(('external_ids', {'iface-id': 'd3a551a2-38e3-48d3-bdee-f2493a79eca0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:24Z|00364|binding|INFO|Releasing lport d3a551a2-38e3-48d3-bdee-f2493a79eca0 from this chassis (sb_readonly=1)
Jan 31 03:15:24 np0005603622 nova_compute[221301]: 2026-01-31 08:15:24.358 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.361 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.362 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8d813f-544d-47a7-afdf-4ac76a65ecd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.363 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/f218695f-c744-4bd8-b2d8-122a920c7ca0.pid.haproxy
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID f218695f-c744-4bd8-b2d8-122a920c7ca0
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:15:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:24.364 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'env', 'PROCESS_TAG=haproxy-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f218695f-c744-4bd8-b2d8-122a920c7ca0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:15:24 np0005603622 nova_compute[221301]: 2026-01-31 08:15:24.366 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:24 np0005603622 nova_compute[221301]: 2026-01-31 08:15:24.598 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847324.5979118, 21c87211-765f-42a8-b8fd-77a11942f707 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:15:24 np0005603622 nova_compute[221301]: 2026-01-31 08:15:24.599 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] VM Started (Lifecycle Event)#033[00m
Jan 31 03:15:24 np0005603622 podman[255651]: 2026-01-31 08:15:24.75940438 +0000 UTC m=+0.072902060 container create c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:15:24 np0005603622 podman[255651]: 2026-01-31 08:15:24.712438981 +0000 UTC m=+0.025936701 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:15:24 np0005603622 systemd[1]: Started libpod-conmon-c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc.scope.
Jan 31 03:15:24 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:15:24 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ba2f661765668262b5bc8e881335b42c38057669de492d3fe6caef7f6aee308/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:15:24 np0005603622 podman[255651]: 2026-01-31 08:15:24.867184591 +0000 UTC m=+0.180682311 container init c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:15:24 np0005603622 podman[255651]: 2026-01-31 08:15:24.874694164 +0000 UTC m=+0.188191844 container start c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:15:24 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[255667]: [NOTICE]   (255671) : New worker (255673) forked
Jan 31 03:15:24 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[255667]: [NOTICE]   (255671) : Loading success.
Jan 31 03:15:25 np0005603622 nova_compute[221301]: 2026-01-31 08:15:25.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:26.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.144 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.150 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847324.5981371, 21c87211-765f-42a8-b8fd-77a11942f707 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.150 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.177 221324 DEBUG nova.compute.manager [req-7eca85c3-fd15-4922-9bd4-89785ad614e6 req-99b5f3bb-ea6a-492a-ace7-5208c753c0fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.178 221324 DEBUG oslo_concurrency.lockutils [req-7eca85c3-fd15-4922-9bd4-89785ad614e6 req-99b5f3bb-ea6a-492a-ace7-5208c753c0fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.178 221324 DEBUG oslo_concurrency.lockutils [req-7eca85c3-fd15-4922-9bd4-89785ad614e6 req-99b5f3bb-ea6a-492a-ace7-5208c753c0fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.178 221324 DEBUG oslo_concurrency.lockutils [req-7eca85c3-fd15-4922-9bd4-89785ad614e6 req-99b5f3bb-ea6a-492a-ace7-5208c753c0fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.178 221324 DEBUG nova.compute.manager [req-7eca85c3-fd15-4922-9bd4-89785ad614e6 req-99b5f3bb-ea6a-492a-ace7-5208c753c0fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] No waiting events found dispatching network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.179 221324 WARNING nova.compute.manager [req-7eca85c3-fd15-4922-9bd4-89785ad614e6 req-99b5f3bb-ea6a-492a-ace7-5208c753c0fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received unexpected event network-vif-plugged-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.243 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.247 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:15:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:26.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.349 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully created port: df26bd91-1f94-4407-962c-1dd9ca4032d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.356 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.385 221324 DEBUG nova.network.neutron [-] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.537 221324 INFO nova.compute.manager [-] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Took 2.79 seconds to deallocate network for instance.#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.668 221324 DEBUG nova.compute.manager [req-2b5c78a2-00ea-4329-9ed0-cdcf091ea2f7 req-82bd04b3-f024-4337-a30d-40122aad523e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Received event network-vif-deleted-bde2310b-2169-48e8-8a6f-c3f25dcd8fd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.690 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.690 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.830 221324 DEBUG oslo_concurrency.processutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.997 221324 DEBUG nova.compute.manager [req-65c5cdeb-fda9-4aa8-964d-ef6c7da3a28f req-26ba1a9f-f22b-4800-8def-a2d602be83b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received event network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:26 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.998 221324 DEBUG oslo_concurrency.lockutils [req-65c5cdeb-fda9-4aa8-964d-ef6c7da3a28f req-26ba1a9f-f22b-4800-8def-a2d602be83b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:26.999 221324 DEBUG oslo_concurrency.lockutils [req-65c5cdeb-fda9-4aa8-964d-ef6c7da3a28f req-26ba1a9f-f22b-4800-8def-a2d602be83b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.000 221324 DEBUG oslo_concurrency.lockutils [req-65c5cdeb-fda9-4aa8-964d-ef6c7da3a28f req-26ba1a9f-f22b-4800-8def-a2d602be83b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.000 221324 DEBUG nova.compute.manager [req-65c5cdeb-fda9-4aa8-964d-ef6c7da3a28f req-26ba1a9f-f22b-4800-8def-a2d602be83b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Processing event network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.001 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.005 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847327.005467, 21c87211-765f-42a8-b8fd-77a11942f707 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.006 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.008 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.012 221324 INFO nova.virt.libvirt.driver [-] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Instance spawned successfully.#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.013 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.088 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.102 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.108 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.109 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.110 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.110 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.111 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.112 221324 DEBUG nova.virt.libvirt.driver [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:15:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1925582256' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.310 221324 DEBUG oslo_concurrency.processutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.317 221324 DEBUG nova.compute.provider_tree [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.337 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.400 221324 DEBUG nova.scheduler.client.report [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.407 221324 INFO nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Took 14.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.408 221324 DEBUG nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.466 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.526 221324 INFO nova.scheduler.client.report [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Deleted allocations for instance 183fba4a-610d-4b96-a12f-996f9f2dc30b#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.553 221324 INFO nova.compute.manager [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Took 21.17 seconds to build instance.#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.613 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.645 221324 DEBUG oslo_concurrency.lockutils [None req-fd8930f9-27dc-40ab-84f7-5b7dbc70b0c6 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.701 221324 DEBUG oslo_concurrency.lockutils [None req-e1007a57-3701-4852-a1f3-15f7d1f02251 12a823bd7c6e4cf492ebf6c1d002a91f 9c03fec1b3664105996aa979e226d8f8 - - default default] Lock "183fba4a-610d-4b96-a12f-996f9f2dc30b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:27 np0005603622 nova_compute[221301]: 2026-01-31 08:15:27.951 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:28.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:28.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:28 np0005603622 nova_compute[221301]: 2026-01-31 08:15:28.862 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully updated port: ad571df1-872d-4024-8955-346236aa8e90 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.049 221324 DEBUG nova.compute.manager [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-ad571df1-872d-4024-8955-346236aa8e90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.050 221324 DEBUG nova.compute.manager [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-ad571df1-872d-4024-8955-346236aa8e90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.050 221324 DEBUG oslo_concurrency.lockutils [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.050 221324 DEBUG oslo_concurrency.lockutils [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.050 221324 DEBUG nova.network.neutron [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port ad571df1-872d-4024-8955-346236aa8e90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.572 221324 DEBUG nova.network.neutron [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.670 221324 DEBUG nova.compute.manager [req-67a5ca01-a9ab-4fda-ae5f-0804b5f30ed9 req-326a33d3-961b-423c-8285-ea4797755278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received event network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.671 221324 DEBUG oslo_concurrency.lockutils [req-67a5ca01-a9ab-4fda-ae5f-0804b5f30ed9 req-326a33d3-961b-423c-8285-ea4797755278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.671 221324 DEBUG oslo_concurrency.lockutils [req-67a5ca01-a9ab-4fda-ae5f-0804b5f30ed9 req-326a33d3-961b-423c-8285-ea4797755278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.672 221324 DEBUG oslo_concurrency.lockutils [req-67a5ca01-a9ab-4fda-ae5f-0804b5f30ed9 req-326a33d3-961b-423c-8285-ea4797755278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.672 221324 DEBUG nova.compute.manager [req-67a5ca01-a9ab-4fda-ae5f-0804b5f30ed9 req-326a33d3-961b-423c-8285-ea4797755278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] No waiting events found dispatching network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:15:29 np0005603622 nova_compute[221301]: 2026-01-31 08:15:29.673 221324 WARNING nova.compute.manager [req-67a5ca01-a9ab-4fda-ae5f-0804b5f30ed9 req-326a33d3-961b-423c-8285-ea4797755278 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received unexpected event network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd for instance with vm_state active and task_state None.#033[00m
Jan 31 03:15:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:30.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:30 np0005603622 nova_compute[221301]: 2026-01-31 08:15:30.149 221324 DEBUG nova.network.neutron [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:30 np0005603622 nova_compute[221301]: 2026-01-31 08:15:30.276 221324 DEBUG oslo_concurrency.lockutils [req-4fdd5661-2d70-4d06-afae-4a47f467b57a req-346015fc-bec1-41d5-8445-595a1c70e867 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:30.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:30 np0005603622 nova_compute[221301]: 2026-01-31 08:15:30.463 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:30.868 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:30.869 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:30.870 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:32.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:32.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:32 np0005603622 nova_compute[221301]: 2026-01-31 08:15:32.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:33 np0005603622 podman[255704]: 2026-01-31 08:15:33.183543268 +0000 UTC m=+0.108042279 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:15:33 np0005603622 nova_compute[221301]: 2026-01-31 08:15:33.939 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully updated port: 9ff14940-7ad5-4fc4-852a-08f78695ba7a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:34.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:34 np0005603622 nova_compute[221301]: 2026-01-31 08:15:34.285 221324 DEBUG nova.compute.manager [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-9ff14940-7ad5-4fc4-852a-08f78695ba7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:34 np0005603622 nova_compute[221301]: 2026-01-31 08:15:34.286 221324 DEBUG nova.compute.manager [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-9ff14940-7ad5-4fc4-852a-08f78695ba7a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:34 np0005603622 nova_compute[221301]: 2026-01-31 08:15:34.287 221324 DEBUG oslo_concurrency.lockutils [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:34 np0005603622 nova_compute[221301]: 2026-01-31 08:15:34.287 221324 DEBUG oslo_concurrency.lockutils [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:34 np0005603622 nova_compute[221301]: 2026-01-31 08:15:34.288 221324 DEBUG nova.network.neutron [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port 9ff14940-7ad5-4fc4-852a-08f78695ba7a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:34.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:34 np0005603622 nova_compute[221301]: 2026-01-31 08:15:34.565 221324 DEBUG nova.network.neutron [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:34 np0005603622 nova_compute[221301]: 2026-01-31 08:15:34.973 221324 DEBUG nova.network.neutron [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:35 np0005603622 nova_compute[221301]: 2026-01-31 08:15:35.013 221324 DEBUG oslo_concurrency.lockutils [req-80375309-92a3-493f-9762-c0ddfab68bc0 req-88018f51-1801-47b9-9391-28ddd230351c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:35 np0005603622 nova_compute[221301]: 2026-01-31 08:15:35.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:35 np0005603622 nova_compute[221301]: 2026-01-31 08:15:35.467 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:35 np0005603622 nova_compute[221301]: 2026-01-31 08:15:35.877 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully updated port: ea3924fc-85a0-42f4-98f9-62f431992e9a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:36.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:36 np0005603622 podman[255730]: 2026-01-31 08:15:36.143194615 +0000 UTC m=+0.070809723 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 31 03:15:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:36.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.421 221324 DEBUG nova.compute.manager [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-ea3924fc-85a0-42f4-98f9-62f431992e9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.421 221324 DEBUG nova.compute.manager [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-ea3924fc-85a0-42f4-98f9-62f431992e9a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.422 221324 DEBUG oslo_concurrency.lockutils [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.422 221324 DEBUG oslo_concurrency.lockutils [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.422 221324 DEBUG nova.network.neutron [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port ea3924fc-85a0-42f4-98f9-62f431992e9a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:36 np0005603622 nova_compute[221301]: 2026-01-31 08:15:36.914 221324 DEBUG nova.network.neutron [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.411 221324 DEBUG nova.network.neutron [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.435 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.435 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.436 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.479 221324 DEBUG oslo_concurrency.lockutils [req-7c6d75f2-15cf-4d13-9143-053684c559a7 req-307500ee-1d1c-4b86-9586-8a38931268ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.531 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.532 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.532 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.533 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.533 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.535 221324 INFO nova.compute.manager [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Terminating instance#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.536 221324 DEBUG nova.compute.manager [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.542 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.544 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.544 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:15:37 np0005603622 kernel: tap4bb2940b-96 (unregistering): left promiscuous mode
Jan 31 03:15:37 np0005603622 NetworkManager[49080]: <info>  [1769847337.5939] device (tap4bb2940b-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:37Z|00365|binding|INFO|Releasing lport 4bb2940b-9690-4921-a86c-daf5eefed7dd from this chassis (sb_readonly=0)
Jan 31 03:15:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:37Z|00366|binding|INFO|Setting lport 4bb2940b-9690-4921-a86c-daf5eefed7dd down in Southbound
Jan 31 03:15:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:15:37Z|00367|binding|INFO|Removing iface tap4bb2940b-96 ovn-installed in OVS
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.602 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.612 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.625 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:7e:ca 10.100.0.6'], port_security=['fa:16:3e:10:7e:ca 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '21c87211-765f-42a8-b8fd-77a11942f707', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58e900992be7400fb940ca20f13e12d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '596ab0fa-9144-4a59-97b9-1afd98634ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae8797c-8cfa-434b-94e1-deeda92af05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=4bb2940b-9690-4921-a86c-daf5eefed7dd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.628 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 4bb2940b-9690-4921-a86c-daf5eefed7dd in datapath f218695f-c744-4bd8-b2d8-122a920c7ca0 unbound from our chassis#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.632 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f218695f-c744-4bd8-b2d8-122a920c7ca0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.633 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[acd894af-f120-4d82-a431-11a4c7af9bfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.633 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 namespace which is not needed anymore#033[00m
Jan 31 03:15:37 np0005603622 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Jan 31 03:15:37 np0005603622 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005f.scope: Consumed 11.582s CPU time.
Jan 31 03:15:37 np0005603622 systemd-machined[190406]: Machine qemu-40-instance-0000005f terminated.
Jan 31 03:15:37 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[255667]: [NOTICE]   (255671) : haproxy version is 2.8.14-c23fe91
Jan 31 03:15:37 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[255667]: [NOTICE]   (255671) : path to executable is /usr/sbin/haproxy
Jan 31 03:15:37 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[255667]: [WARNING]  (255671) : Exiting Master process...
Jan 31 03:15:37 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[255667]: [ALERT]    (255671) : Current worker (255673) exited with code 143 (Terminated)
Jan 31 03:15:37 np0005603622 neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0[255667]: [WARNING]  (255671) : All workers exited. Exiting... (0)
Jan 31 03:15:37 np0005603622 systemd[1]: libpod-c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc.scope: Deactivated successfully.
Jan 31 03:15:37 np0005603622 podman[255774]: 2026-01-31 08:15:37.779374136 +0000 UTC m=+0.058407468 container died c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.788 221324 INFO nova.virt.libvirt.driver [-] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Instance destroyed successfully.#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.793 221324 DEBUG nova.objects.instance [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lazy-loading 'resources' on Instance uuid 21c87211-765f-42a8-b8fd-77a11942f707 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:15:37 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc-userdata-shm.mount: Deactivated successfully.
Jan 31 03:15:37 np0005603622 systemd[1]: var-lib-containers-storage-overlay-8ba2f661765668262b5bc8e881335b42c38057669de492d3fe6caef7f6aee308-merged.mount: Deactivated successfully.
Jan 31 03:15:37 np0005603622 podman[255774]: 2026-01-31 08:15:37.836437168 +0000 UTC m=+0.115470510 container cleanup c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:15:37 np0005603622 systemd[1]: libpod-conmon-c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc.scope: Deactivated successfully.
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.850 221324 DEBUG nova.virt.libvirt.vif [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1685354513',display_name='tempest-ServerDiskConfigTestJSON-server-1685354513',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1685354513',id=95,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:15:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58e900992be7400fb940ca20f13e12d1',ramdisk_id='',reservation_id='r-frj7dm28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-855158150',owner_user_name='tempest-ServerDiskConfigTestJSON-855158150-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:15:34Z,user_data=None,user_id='111fdaf79c084a91902fe37a7a502020',uuid=21c87211-765f-42a8-b8fd-77a11942f707,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.851 221324 DEBUG nova.network.os_vif_util [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converting VIF {"id": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "address": "fa:16:3e:10:7e:ca", "network": {"id": "f218695f-c744-4bd8-b2d8-122a920c7ca0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1189208428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58e900992be7400fb940ca20f13e12d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bb2940b-96", "ovs_interfaceid": "4bb2940b-9690-4921-a86c-daf5eefed7dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.852 221324 DEBUG nova.network.os_vif_util [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:7e:ca,bridge_name='br-int',has_traffic_filtering=True,id=4bb2940b-9690-4921-a86c-daf5eefed7dd,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bb2940b-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.852 221324 DEBUG os_vif [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:7e:ca,bridge_name='br-int',has_traffic_filtering=True,id=4bb2940b-9690-4921-a86c-daf5eefed7dd,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bb2940b-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.853 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.854 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bb2940b-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.855 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.857 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.859 221324 INFO os_vif [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:7e:ca,bridge_name='br-int',has_traffic_filtering=True,id=4bb2940b-9690-4921-a86c-daf5eefed7dd,network=Network(f218695f-c744-4bd8-b2d8-122a920c7ca0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bb2940b-96')#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.880 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847322.8741333, 183fba4a-610d-4b96-a12f-996f9f2dc30b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.881 221324 INFO nova.compute.manager [-] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:15:37 np0005603622 podman[255814]: 2026-01-31 08:15:37.911684 +0000 UTC m=+0.052492369 container remove c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.918 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ee1905ce-78d4-49f0-b529-04951982d987]: (4, ('Sat Jan 31 08:15:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc)\nc687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc\nSat Jan 31 08:15:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 (c687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc)\nc687feacab9f6f995ba0698f2a393bc372cb962a89d8e852df73284493929efc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.921 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c437ac8f-91b4-40f7-b875-a36a6375c7ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.922 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf218695f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.924 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 kernel: tapf218695f-c0: left promiscuous mode
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.933 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.936 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e61076b0-282f-4c85-996c-94689796879a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.947 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8741e5b5-7bde-4731-b90b-65e56843f845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.949 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[354c47b6-b486-4db0-a732-e07b3205cb0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 nova_compute[221301]: 2026-01-31 08:15:37.956 221324 DEBUG nova.compute.manager [None req-64b7a32a-087e-4a33-b4d7-325756f43b3b - - - - - -] [instance: 183fba4a-610d-4b96-a12f-996f9f2dc30b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.970 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c7de6009-004f-4065-9a9b-1c8dc07bc627]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 657545, 'reachable_time': 34467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255847, 'error': None, 'target': 'ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.973 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f218695f-c744-4bd8-b2d8-122a920c7ca0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:15:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:15:37.973 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[ffacac11-b084-4497-b602-b730c724ebc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:15:37 np0005603622 systemd[1]: run-netns-ovnmeta\x2df218695f\x2dc744\x2d4bd8\x2db2d8\x2d122a920c7ca0.mount: Deactivated successfully.
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.026 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully updated port: 0df677ac-7f81-4a86-bf52-3028b2dda46e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:38.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:38.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.437 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.438 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.438 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.438 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.439 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.523 221324 DEBUG nova.compute.manager [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-0df677ac-7f81-4a86-bf52-3028b2dda46e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.524 221324 DEBUG nova.compute.manager [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-0df677ac-7f81-4a86-bf52-3028b2dda46e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.525 221324 DEBUG oslo_concurrency.lockutils [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.525 221324 DEBUG oslo_concurrency.lockutils [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.526 221324 DEBUG nova.network.neutron [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port 0df677ac-7f81-4a86-bf52-3028b2dda46e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1767297335' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.918 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:38 np0005603622 nova_compute[221301]: 2026-01-31 08:15:38.940 221324 DEBUG nova.network.neutron [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.017 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000005f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.018 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000005f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.172 221324 DEBUG nova.compute.manager [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received event network-vif-unplugged-4bb2940b-9690-4921-a86c-daf5eefed7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.172 221324 DEBUG oslo_concurrency.lockutils [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.173 221324 DEBUG oslo_concurrency.lockutils [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.173 221324 DEBUG oslo_concurrency.lockutils [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.174 221324 DEBUG nova.compute.manager [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] No waiting events found dispatching network-vif-unplugged-4bb2940b-9690-4921-a86c-daf5eefed7dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.174 221324 DEBUG nova.compute.manager [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received event network-vif-unplugged-4bb2940b-9690-4921-a86c-daf5eefed7dd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.174 221324 DEBUG nova.compute.manager [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received event network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.175 221324 DEBUG oslo_concurrency.lockutils [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "21c87211-765f-42a8-b8fd-77a11942f707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.175 221324 DEBUG oslo_concurrency.lockutils [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.175 221324 DEBUG oslo_concurrency.lockutils [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.176 221324 DEBUG nova.compute.manager [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] No waiting events found dispatching network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.176 221324 WARNING nova.compute.manager [req-837365c2-7d44-454f-aa56-f5b02c54970f req-f85af098-fe2c-47ed-b99d-a8f624dce7fe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received unexpected event network-vif-plugged-4bb2940b-9690-4921-a86c-daf5eefed7dd for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.235 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.237 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4621MB free_disk=20.967376708984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.237 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.238 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.425 221324 DEBUG nova.network.neutron [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.432 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 21c87211-765f-42a8-b8fd-77a11942f707 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.433 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance b49dd2dc-476a-42c6-89f3-868e238698f2 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.433 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.433 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.452 221324 DEBUG oslo_concurrency.lockutils [req-70e77953-e184-42fd-b9b6-16760e4983ad req-2c856875-637b-4edf-8740-5f4c8b1ffe66 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.467 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.494 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.495 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.511 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.536 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.595 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.612 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.676 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.756 221324 INFO nova.virt.libvirt.driver [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Deleting instance files /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707_del#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.758 221324 INFO nova.virt.libvirt.driver [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Deletion of /var/lib/nova/instances/21c87211-765f-42a8-b8fd-77a11942f707_del complete#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.861 221324 INFO nova.compute.manager [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Took 2.32 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.863 221324 DEBUG oslo.service.loopingcall [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.863 221324 DEBUG nova.compute.manager [-] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.864 221324 DEBUG nova.network.neutron [-] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:15:39 np0005603622 nova_compute[221301]: 2026-01-31 08:15:39.984 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully updated port: 2333e158-ceb0-4a1f-898b-191f9da4e5c7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/182041244' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.079 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.086 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.127 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.174 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.175 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.938s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:40.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.515 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.563 221324 DEBUG nova.network.neutron [-] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.694 221324 DEBUG nova.compute.manager [req-b7badf16-0c1e-466b-8df4-b729f40a8885 req-0c1e6f12-c706-4688-a133-cc41a875d9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Received event network-vif-deleted-4bb2940b-9690-4921-a86c-daf5eefed7dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.695 221324 INFO nova.compute.manager [req-b7badf16-0c1e-466b-8df4-b729f40a8885 req-0c1e6f12-c706-4688-a133-cc41a875d9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Neutron deleted interface 4bb2940b-9690-4921-a86c-daf5eefed7dd; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.695 221324 DEBUG nova.network.neutron [req-b7badf16-0c1e-466b-8df4-b729f40a8885 req-0c1e6f12-c706-4688-a133-cc41a875d9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.697 221324 DEBUG nova.compute.manager [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-2333e158-ceb0-4a1f-898b-191f9da4e5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.698 221324 DEBUG nova.compute.manager [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-2333e158-ceb0-4a1f-898b-191f9da4e5c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.698 221324 DEBUG oslo_concurrency.lockutils [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.699 221324 DEBUG oslo_concurrency.lockutils [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.699 221324 DEBUG nova.network.neutron [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port 2333e158-ceb0-4a1f-898b-191f9da4e5c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.703 221324 INFO nova.compute.manager [-] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Took 0.84 seconds to deallocate network for instance.#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.898 221324 DEBUG nova.compute.manager [req-b7badf16-0c1e-466b-8df4-b729f40a8885 req-0c1e6f12-c706-4688-a133-cc41a875d9c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Detach interface failed, port_id=4bb2940b-9690-4921-a86c-daf5eefed7dd, reason: Instance 21c87211-765f-42a8-b8fd-77a11942f707 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.954 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:15:40 np0005603622 nova_compute[221301]: 2026-01-31 08:15:40.955 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:15:41 np0005603622 nova_compute[221301]: 2026-01-31 08:15:41.087 221324 DEBUG oslo_concurrency.processutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:41 np0005603622 nova_compute[221301]: 2026-01-31 08:15:41.176 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:15:41 np0005603622 nova_compute[221301]: 2026-01-31 08:15:41.180 221324 DEBUG nova.network.neutron [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:15:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/940362567' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:15:41 np0005603622 nova_compute[221301]: 2026-01-31 08:15:41.536 221324 DEBUG oslo_concurrency.processutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:15:41 np0005603622 nova_compute[221301]: 2026-01-31 08:15:41.540 221324 DEBUG nova.compute.provider_tree [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:15:41 np0005603622 nova_compute[221301]: 2026-01-31 08:15:41.687 221324 DEBUG nova.scheduler.client.report [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:15:42 np0005603622 nova_compute[221301]: 2026-01-31 08:15:42.047 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully updated port: c1738855-9732-40f2-88e3-e42f04270b02 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:42.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:42 np0005603622 nova_compute[221301]: 2026-01-31 08:15:42.078 221324 DEBUG nova.network.neutron [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:42.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:42 np0005603622 nova_compute[221301]: 2026-01-31 08:15:42.492 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:42 np0005603622 nova_compute[221301]: 2026-01-31 08:15:42.498 221324 DEBUG oslo_concurrency.lockutils [req-27244a82-9281-44cb-b23d-87bdc6952d6d req-916de88f-6f60-4b42-b7ed-58348f082310 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:42 np0005603622 nova_compute[221301]: 2026-01-31 08:15:42.536 221324 INFO nova.scheduler.client.report [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Deleted allocations for instance 21c87211-765f-42a8-b8fd-77a11942f707#033[00m
Jan 31 03:15:42 np0005603622 nova_compute[221301]: 2026-01-31 08:15:42.777 221324 DEBUG oslo_concurrency.lockutils [None req-b8a3c45a-c936-4f82-ade4-01a78fce3928 111fdaf79c084a91902fe37a7a502020 58e900992be7400fb940ca20f13e12d1 - - default default] Lock "21c87211-765f-42a8-b8fd-77a11942f707" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:15:42 np0005603622 nova_compute[221301]: 2026-01-31 08:15:42.858 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:43 np0005603622 nova_compute[221301]: 2026-01-31 08:15:43.448 221324 DEBUG nova.compute.manager [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-c1738855-9732-40f2-88e3-e42f04270b02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:43 np0005603622 nova_compute[221301]: 2026-01-31 08:15:43.449 221324 DEBUG nova.compute.manager [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-c1738855-9732-40f2-88e3-e42f04270b02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:43 np0005603622 nova_compute[221301]: 2026-01-31 08:15:43.449 221324 DEBUG oslo_concurrency.lockutils [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:43 np0005603622 nova_compute[221301]: 2026-01-31 08:15:43.450 221324 DEBUG oslo_concurrency.lockutils [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:43 np0005603622 nova_compute[221301]: 2026-01-31 08:15:43.450 221324 DEBUG nova.network.neutron [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port c1738855-9732-40f2-88e3-e42f04270b02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:44.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:44.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:44 np0005603622 nova_compute[221301]: 2026-01-31 08:15:44.426 221324 DEBUG nova.network.neutron [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:44 np0005603622 nova_compute[221301]: 2026-01-31 08:15:44.900 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Successfully updated port: df26bd91-1f94-4407-962c-1dd9ca4032d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.276 221324 DEBUG nova.network.neutron [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.468 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.561 221324 DEBUG nova.compute.manager [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-df26bd91-1f94-4407-962c-1dd9ca4032d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.562 221324 DEBUG nova.compute.manager [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-df26bd91-1f94-4407-962c-1dd9ca4032d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.562 221324 DEBUG oslo_concurrency.lockutils [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.563 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.567 221324 DEBUG oslo_concurrency.lockutils [req-77ddc80a-2da1-4505-9f94-0cbaefddd758 req-18069c8c-c2fc-4044-acc0-d9a3e0460955 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.567 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:45 np0005603622 nova_compute[221301]: 2026-01-31 08:15:45.568 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:15:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:46.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:46.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:47 np0005603622 nova_compute[221301]: 2026-01-31 08:15:47.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:47 np0005603622 nova_compute[221301]: 2026-01-31 08:15:47.921 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:15:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:48.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:48.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:50.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:50.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:50 np0005603622 nova_compute[221301]: 2026-01-31 08:15:50.591 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:52.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:15:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:52.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:15:52 np0005603622 nova_compute[221301]: 2026-01-31 08:15:52.779 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847337.7778826, 21c87211-765f-42a8-b8fd-77a11942f707 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:15:52 np0005603622 nova_compute[221301]: 2026-01-31 08:15:52.780 221324 INFO nova.compute.manager [-] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:15:52 np0005603622 nova_compute[221301]: 2026-01-31 08:15:52.905 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:52 np0005603622 nova_compute[221301]: 2026-01-31 08:15:52.925 221324 DEBUG nova.compute.manager [None req-cce8b434-a4f8-4fa2-ba24-d7b8a103e86a - - - - - -] [instance: 21c87211-765f-42a8-b8fd-77a11942f707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:15:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:15:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:54.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:15:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:54.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:55 np0005603622 nova_compute[221301]: 2026-01-31 08:15:55.632 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:56.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:56.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:15:57 np0005603622 nova_compute[221301]: 2026-01-31 08:15:57.909 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:15:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:15:58.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.109 221324 DEBUG nova.network.neutron [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [{"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:15:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:15:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:15:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:15:58.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.537 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.537 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance network_info: |[{"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.539 221324 DEBUG oslo_concurrency.lockutils [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.539 221324 DEBUG nova.network.neutron [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port df26bd91-1f94-4407-962c-1dd9ca4032d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.550 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Start _get_guest_xml network_info=[{"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk', 'boot_index': '2'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk', 'boot_index': '3'}, 
'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T
Jan 31 03:15:58 np0005603622 nova_compute[221301]: min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': 0, 'attachment_id': 'e6217452-8282-4e11-a078-f705b6777630', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-20fd9e7c-9700-482e-a6ae-9d5671e6d553', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '20fd9e7c-9700-482e-a6ae-9d5671e6d553', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'attached_at': '', 'detached_at': '', 'volume_id': '20fd9e7c-9700-482e-a6ae-9d5671e6d553', 'serial': '20fd9e7c-9700-482e-a6ae-9d5671e6d553'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}, {'delete_on_termination': False, 'boot_index': 1, 'attachment_id': 'f60e9547-c27e-4ed8-83d9-7d8315e3b075', 'mount_device': '/dev/vdb', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-e24cf252-b35c-4390-a954-7f28d6e0eaed', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'e24cf252-b35c-4390-a954-7f28d6e0eaed', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 
'instance': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'attached_at': '', 'detached_at': '', 'volume_id': 'e24cf252-b35c-4390-a954-7f28d6e0eaed', 'serial': 'e24cf252-b35c-4390-a954-7f28d6e0eaed'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}, {'delete_on_termination': False, 'boot_index': 2, 'attachment_id': '639386f7-bb08-40e2-b25b-04efc71b9a52', 'mount_device': '/dev/vdc', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-785fba22-460b-4d01-b053-ffec518020b7', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '785fba22-460b-4d01-b053-ffec518020b7', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'attached_at': '', 'detached_at': '', 'volume_id': '785fba22-460b-4d01-b053-ffec518020b7', 'serial': '785fba22-460b-4d01-b053-ffec518020b7'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
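The `block_device_info` above carries three RBD volumes, each with a `connection_info` dict (monitor `hosts`/`ports`, pool-qualified image `name`, `discard` flag) that ultimately becomes a libvirt network `<disk>` element. A minimal sketch of that rendering step — illustrative only, not Nova's actual `_get_guest_xml`; element and attribute names follow the libvirt domain XML schema, the function name is hypothetical:

```python
# Hypothetical rendering of one RBD connection_info entry into a libvirt
# <disk type="network"> element, as the driver does when building guest XML.
import xml.etree.ElementTree as ET

def rbd_disk_xml(conn_data, target_dev):
    disk = ET.Element("disk", type="network", device="disk")
    # discard=True in the log maps to discard="unmap" on the qemu driver
    ET.SubElement(disk, "driver", name="qemu", type="raw",
                  discard="unmap" if conn_data.get("discard") else "ignore")
    source = ET.SubElement(disk, "source", protocol="rbd",
                           name=conn_data["name"])
    # one <host> child per Ceph monitor from the hosts/ports lists
    for host, port in zip(conn_data["hosts"], conn_data["ports"]):
        ET.SubElement(source, "host", name=host, port=port)
    ET.SubElement(disk, "target", dev=target_dev, bus="virtio")
    return ET.tostring(disk, encoding="unicode")
```

Fed the first mapping above (`volumes/volume-20fd9e7c-…` onto `/dev/vda`, `disk_bus: virtio`), this would emit a `<source protocol="rbd">` with three monitor `<host>` children on port 6789.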
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.557 221324 WARNING nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.566 221324 DEBUG nova.virt.libvirt.host [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.566 221324 DEBUG nova.virt.libvirt.host [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.570 221324 DEBUG nova.virt.libvirt.host [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.570 221324 DEBUG nova.virt.libvirt.host [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
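The four `_has_cgroupsv1_cpu_controller` / `_has_cgroupsv2_cpu_controller` lines above show the two-step probe: v1 is checked first (missing on this host), then v2 (found). A sketch of that logic under standard kernel conventions — function names are illustrative, not Nova's; v1 exposes each controller as a `cgroup` mount option, v2 lists enabled controllers space-separated in `cgroup.controllers`:

```python
# Hypothetical helpers mirroring the logged cgroup v1-then-v2 CPU probe.

def has_cgroupsv1_cpu_controller(mounts_text):
    # v1: look for a "cgroup" filesystem mounted with the "cpu" option
    # (mounts_text is the content of /proc/mounts)
    for line in mounts_text.splitlines():
        parts = line.split()
        if len(parts) >= 4 and parts[2] == "cgroup":
            if "cpu" in parts[3].split(","):
                return True
    return False

def has_cgroupsv2_cpu_controller(controllers_text):
    # v2: "cpu" appears in /sys/fs/cgroup/cgroup.controllers
    return "cpu" in controllers_text.split()
```

On a cgroups-v2-only host like this one, the v1 scan finds no `cgroup` (v1) mounts, so the driver falls through to the v2 check, matching the "CPU controller missing" then "CPU controller found" pair in the log.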
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.571 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.571 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.572 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.572 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.572 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.573 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.573 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.573 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.573 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.573 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.574 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.574 221324 DEBUG nova.virt.hardware [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
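The topology lines above (constraints 0:0:0 preferred, 65536:65536:65536 maximum, "Build topologies for 1 vcpu(s)", "Got 1 possible topologies") boil down to enumerating factorisations of the vCPU count. A simplified sketch of that enumeration — not Nova's exact `_get_possible_cpu_topologies`, which also handles preferred ordering:

```python
# Hypothetical enumeration of sockets*cores*threads == vcpus within limits.
from collections import namedtuple

Topo = namedtuple("Topo", "sockets cores threads")

def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue  # sockets must divide the vCPU count
        for c in range(1, min(vcpus // s, max_cores) + 1):
            if (vcpus // s) % c:
                continue  # cores must divide the remainder
            t = vcpus // (s * c)  # threads is forced by s and c
            if t <= max_threads:
                topos.append(Topo(s, c, t))
    return topos
```

For the `m1.nano` flavor's single vCPU the only factorisation is 1:1:1, which is why the log reports exactly one possible topology.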
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.609 221324 DEBUG nova.storage.rbd_utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] rbd image b49dd2dc-476a-42c6-89f3-868e238698f2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:15:58 np0005603622 nova_compute[221301]: 2026-01-31 08:15:58.615 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:15:58 np0005603622 rsyslogd[1003]: message too long (8192) with configured size 8096, begin of message is: 2026-01-31 08:15:58.550 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
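The rsyslogd warning above explains the truncated `_get_guest_xml` lines earlier in this section: the Nova DEBUG message exceeded the configured 8096-byte cap and was cut mid-record. If full messages are wanted, the cap can be raised with rsyslog's `global(maxMessageSize=…)` directive; the value below is an illustrative choice, not a recommendation from this log:

```
# /etc/rsyslog.conf -- hypothetical fix for "message too long (8192)";
# must appear before any input() modules are loaded.
global(maxMessageSize="64k")
```

Note the directive bounds rsyslog's own buffer; journald and the transport in between impose their own limits.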
Jan 31 03:15:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:15:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2762930043' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:15:59 np0005603622 nova_compute[221301]: 2026-01-31 08:15:59.062 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
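The `ceph mon dump --format=json` invocation above (0.447s, exit 0) is how Nova discovers the monitor addresses that later appear as the `hosts`/`ports` lists in the RBD `connection_info`. A parse-only sketch of that step, split from the subprocess call so it can run without a cluster — the `mons`/`public_addr` fields are the standard `mon dump` JSON layout, the function name is hypothetical:

```python
# Hypothetical parser for `ceph mon dump --format=json` output,
# producing the hosts/ports lists seen in connection_info above.
import json

def parse_mon_addrs(mon_dump_json):
    dump = json.loads(mon_dump_json)
    hosts, ports = [], []
    for mon in dump.get("mons", []):
        # "public_addr" looks like "192.168.122.100:6789/0"
        addr = mon["public_addr"].split("/")[0]
        host, _, port = addr.rpartition(":")
        hosts.append(host)
        ports.append(port)
    return hosts, ports
```

Run against this three-monitor cluster, it would yield the `['192.168.122.100', '192.168.122.102', '192.168.122.101']` / `['6789', '6789', '6789']` pairs visible in every `connection_info` dict above.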
Jan 31 03:16:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:00.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:00.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.647 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.648 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.650 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=ad571df1-872d-4024-8955-346236aa8e90,network=Network(c9a825ca-0a1d-4321-b7ad-502bf00f7f45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad571df1-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
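The "Converting VIF {…}" / "Converted object VIFOpenVSwitch(…)" pair above shows `nova_to_osvif_vif` projecting Nova's large VIF dict down to the handful of fields os-vif needs. A minimal sketch of that mapping — the dataclass stands in for the real `os_vif` object, and the field selection mirrors the logged output rather than the full os-vif schema:

```python
# Hypothetical projection of a Nova VIF dict onto the fields shown in the
# "Converted object VIFOpenVSwitch(...)" log line.
from dataclasses import dataclass

@dataclass
class VIFOpenVSwitch:
    id: str
    address: str
    bridge_name: str
    vif_name: str
    active: bool
    has_traffic_filtering: bool
    preserve_on_delete: bool

def nova_to_osvif_vif(vif):
    details = vif.get("details", {})
    return VIFOpenVSwitch(
        id=vif["id"],
        address=vif["address"],
        # binding details take precedence over the network's bridge
        bridge_name=details.get("bridge_name", vif["network"]["bridge"]),
        vif_name=vif["devname"],
        active=vif["active"],
        has_traffic_filtering=details.get("port_filter", False),
        preserve_on_delete=vif["preserve_on_delete"],
    )
```

Applied to the first VIF above (`port_filter: true`, `devname: tapad571df1-87`), this reproduces the `has_traffic_filtering=True, vif_name='tapad571df1-87'` fields in the converted object.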
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.651 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.652 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.653 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:8f:c0,bridge_name='br-int',has_traffic_filtering=True,id=9ff14940-7ad5-4fc4-852a-08f78695ba7a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9ff14940-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.654 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.654 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.655 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:eb:67,bridge_name='br-int',has_traffic_filtering=True,id=ea3924fc-85a0-42f4-98f9-62f431992e9a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapea3924fc-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.657 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.657 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.658 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:73:4e,bridge_name='br-int',has_traffic_filtering=True,id=0df677ac-7f81-4a86-bf52-3028b2dda46e,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0df677ac-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.659 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.660 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.661 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:24:89,bridge_name='br-int',has_traffic_filtering=True,id=2333e158-ceb0-4a1f-898b-191f9da4e5c7,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2333e158-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.662 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.663 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.664 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4d:5a,bridge_name='br-int',has_traffic_filtering=True,id=c1738855-9732-40f2-88e3-e42f04270b02,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1738855-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.665 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.666 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.666 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:e5:27,bridge_name='br-int',has_traffic_filtering=True,id=df26bd91-1f94-4407-962c-1dd9ca4032d0,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf26bd91-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.668 221324 DEBUG nova.objects.instance [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lazy-loading 'pci_devices' on Instance uuid b49dd2dc-476a-42c6-89f3-868e238698f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:16:00 np0005603622 nova_compute[221301]: 2026-01-31 08:16:00.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.197 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <uuid>b49dd2dc-476a-42c6-89f3-868e238698f2</uuid>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <name>instance-00000060</name>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <nova:name>tempest-device-tagging-server-897253457</nova:name>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:15:58</nova:creationTime>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:user uuid="469a76c175b64f84b29a4a44c8eaf3c4">tempest-TaggedBootDevicesTest-3446815-project-member</nova:user>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:project uuid="587e3df6fc904afc99c9c6411095e3ef">tempest-TaggedBootDevicesTest-3446815</nova:project>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:port uuid="ad571df1-872d-4024-8955-346236aa8e90">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:port uuid="9ff14940-7ad5-4fc4-852a-08f78695ba7a">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.1.1.54" ipVersion="4"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:port uuid="ea3924fc-85a0-42f4-98f9-62f431992e9a">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.1.1.180" ipVersion="4"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:port uuid="0df677ac-7f81-4a86-bf52-3028b2dda46e">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.1.1.70" ipVersion="4"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:port uuid="2333e158-ceb0-4a1f-898b-191f9da4e5c7">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.1.1.208" ipVersion="4"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:port uuid="c1738855-9732-40f2-88e3-e42f04270b02">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.2.2.100" ipVersion="4"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <nova:port uuid="df26bd91-1f94-4407-962c-1dd9ca4032d0">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.2.2.200" ipVersion="4"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <entry name="serial">b49dd2dc-476a-42c6-89f3-868e238698f2</entry>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <entry name="uuid">b49dd2dc-476a-42c6-89f3-868e238698f2</entry>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/b49dd2dc-476a-42c6-89f3-868e238698f2_disk.config">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-20fd9e7c-9700-482e-a6ae-9d5671e6d553">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <serial>20fd9e7c-9700-482e-a6ae-9d5671e6d553</serial>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-e24cf252-b35c-4390-a954-7f28d6e0eaed">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <serial>e24cf252-b35c-4390-a954-7f28d6e0eaed</serial>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-785fba22-460b-4d01-b053-ffec518020b7">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="vdc" bus="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <serial>785fba22-460b-4d01-b053-ffec518020b7</serial>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:06:fb:b1"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="tapad571df1-87"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:1d:8f:c0"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="tap9ff14940-7a"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:09:eb:67"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="tapea3924fc-85"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:7e:73:4e"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="tap0df677ac-7f"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:15:24:89"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="tap2333e158-ce"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:8b:4d:5a"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="tapc1738855-97"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:57:e5:27"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <target dev="tapdf26bd91-1f"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/console.log" append="off"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:16:01 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:16:01 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:16:01 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:16:01 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.200 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Preparing to wait for external event network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.200 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.200 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.201 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.201 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Preparing to wait for external event network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.201 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.202 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.202 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.202 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Preparing to wait for external event network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.203 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.203 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.203 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.203 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Preparing to wait for external event network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.204 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.204 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.204 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.205 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Preparing to wait for external event network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.205 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.205 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.205 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.206 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Preparing to wait for external event network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.206 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.206 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.207 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.207 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Preparing to wait for external event network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.207 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.207 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.208 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.209 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.209 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.210 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=ad571df1-872d-4024-8955-346236aa8e90,network=Network(c9a825ca-0a1d-4321-b7ad-502bf00f7f45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad571df1-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.211 221324 DEBUG os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=ad571df1-872d-4024-8955-346236aa8e90,network=Network(c9a825ca-0a1d-4321-b7ad-502bf00f7f45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad571df1-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.211 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.212 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.212 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.216 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.217 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad571df1-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.217 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapad571df1-87, col_values=(('external_ids', {'iface-id': 'ad571df1-872d-4024-8955-346236aa8e90', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:fb:b1', 'vm-uuid': 'b49dd2dc-476a-42c6-89f3-868e238698f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.234 221324 DEBUG nova.network.neutron [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updated VIF entry in instance network info cache for port df26bd91-1f94-4407-962c-1dd9ca4032d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.235 221324 DEBUG nova.network.neutron [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [{"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:16:01 np0005603622 NetworkManager[49080]: <info>  [1769847361.3952] manager: (tapad571df1-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.396 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.401 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.403 221324 INFO os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=ad571df1-872d-4024-8955-346236aa8e90,network=Network(c9a825ca-0a1d-4321-b7ad-502bf00f7f45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad571df1-87')#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.405 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.406 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.407 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:8f:c0,bridge_name='br-int',has_traffic_filtering=True,id=9ff14940-7ad5-4fc4-852a-08f78695ba7a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9ff14940-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.408 221324 DEBUG os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:8f:c0,bridge_name='br-int',has_traffic_filtering=True,id=9ff14940-7ad5-4fc4-852a-08f78695ba7a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9ff14940-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.409 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.409 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.410 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.413 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.414 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ff14940-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.414 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ff14940-7a, col_values=(('external_ids', {'iface-id': '9ff14940-7ad5-4fc4-852a-08f78695ba7a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:8f:c0', 'vm-uuid': 'b49dd2dc-476a-42c6-89f3-868e238698f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 NetworkManager[49080]: <info>  [1769847361.4177] manager: (tap9ff14940-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.419 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.427 221324 INFO os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:8f:c0,bridge_name='br-int',has_traffic_filtering=True,id=9ff14940-7ad5-4fc4-852a-08f78695ba7a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9ff14940-7a')#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.429 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.429 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.431 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:eb:67,bridge_name='br-int',has_traffic_filtering=True,id=ea3924fc-85a0-42f4-98f9-62f431992e9a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapea3924fc-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.431 221324 DEBUG os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:eb:67,bridge_name='br-int',has_traffic_filtering=True,id=ea3924fc-85a0-42f4-98f9-62f431992e9a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapea3924fc-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.433 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.433 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.436 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea3924fc-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.437 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea3924fc-85, col_values=(('external_ids', {'iface-id': 'ea3924fc-85a0-42f4-98f9-62f431992e9a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:eb:67', 'vm-uuid': 'b49dd2dc-476a-42c6-89f3-868e238698f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.439 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 NetworkManager[49080]: <info>  [1769847361.4405] manager: (tapea3924fc-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.442 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.449 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.450 221324 INFO os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:eb:67,bridge_name='br-int',has_traffic_filtering=True,id=ea3924fc-85a0-42f4-98f9-62f431992e9a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapea3924fc-85')#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.452 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.453 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.454 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:73:4e,bridge_name='br-int',has_traffic_filtering=True,id=0df677ac-7f81-4a86-bf52-3028b2dda46e,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0df677ac-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.454 221324 DEBUG os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:73:4e,bridge_name='br-int',has_traffic_filtering=True,id=0df677ac-7f81-4a86-bf52-3028b2dda46e,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0df677ac-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.456 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.456 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.459 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0df677ac-7f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.460 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0df677ac-7f, col_values=(('external_ids', {'iface-id': '0df677ac-7f81-4a86-bf52-3028b2dda46e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:73:4e', 'vm-uuid': 'b49dd2dc-476a-42c6-89f3-868e238698f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 NetworkManager[49080]: <info>  [1769847361.4632] manager: (tap0df677ac-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.465 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.476 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.477 221324 INFO os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:73:4e,bridge_name='br-int',has_traffic_filtering=True,id=0df677ac-7f81-4a86-bf52-3028b2dda46e,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0df677ac-7f')#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.479 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.479 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.480 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:24:89,bridge_name='br-int',has_traffic_filtering=True,id=2333e158-ceb0-4a1f-898b-191f9da4e5c7,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2333e158-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.481 221324 DEBUG os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:24:89,bridge_name='br-int',has_traffic_filtering=True,id=2333e158-ceb0-4a1f-898b-191f9da4e5c7,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2333e158-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.482 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.482 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.483 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.486 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.486 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2333e158-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.487 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2333e158-ce, col_values=(('external_ids', {'iface-id': '2333e158-ceb0-4a1f-898b-191f9da4e5c7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:24:89', 'vm-uuid': 'b49dd2dc-476a-42c6-89f3-868e238698f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.489 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 NetworkManager[49080]: <info>  [1769847361.4900] manager: (tap2333e158-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.492 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.507 221324 INFO os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:24:89,bridge_name='br-int',has_traffic_filtering=True,id=2333e158-ceb0-4a1f-898b-191f9da4e5c7,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2333e158-ce')#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.508 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.509 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.510 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4d:5a,bridge_name='br-int',has_traffic_filtering=True,id=c1738855-9732-40f2-88e3-e42f04270b02,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1738855-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.511 221324 DEBUG os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4d:5a,bridge_name='br-int',has_traffic_filtering=True,id=c1738855-9732-40f2-88e3-e42f04270b02,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1738855-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.512 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.512 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.515 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.515 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1738855-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.516 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc1738855-97, col_values=(('external_ids', {'iface-id': 'c1738855-9732-40f2-88e3-e42f04270b02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:4d:5a', 'vm-uuid': 'b49dd2dc-476a-42c6-89f3-868e238698f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 NetworkManager[49080]: <info>  [1769847361.5192] manager: (tapc1738855-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.521 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.538 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.539 221324 INFO os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4d:5a,bridge_name='br-int',has_traffic_filtering=True,id=c1738855-9732-40f2-88e3-e42f04270b02,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1738855-97')#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.540 221324 DEBUG nova.virt.libvirt.vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:15:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.541 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.541 221324 DEBUG nova.network.os_vif_util [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:e5:27,bridge_name='br-int',has_traffic_filtering=True,id=df26bd91-1f94-4407-962c-1dd9ca4032d0,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf26bd91-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.542 221324 DEBUG os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:e5:27,bridge_name='br-int',has_traffic_filtering=True,id=df26bd91-1f94-4407-962c-1dd9ca4032d0,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf26bd91-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.542 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.542 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.543 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.545 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.545 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf26bd91-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.546 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf26bd91-1f, col_values=(('external_ids', {'iface-id': 'df26bd91-1f94-4407-962c-1dd9ca4032d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:e5:27', 'vm-uuid': 'b49dd2dc-476a-42c6-89f3-868e238698f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.547 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 NetworkManager[49080]: <info>  [1769847361.5491] manager: (tapdf26bd91-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.550 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.564 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.565 221324 INFO os_vif [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:e5:27,bridge_name='br-int',has_traffic_filtering=True,id=df26bd91-1f94-4407-962c-1dd9ca4032d0,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf26bd91-1f')#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.672 221324 DEBUG oslo_concurrency.lockutils [req-1374fa75-1709-4efa-b21e-1c9dfae03fc2 req-00fbad82-344f-476e-927b-8e43f883eafb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.922 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.923 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.923 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] No VIF found with MAC fa:16:3e:06:fb:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.924 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] No VIF found with MAC fa:16:3e:15:24:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.925 221324 INFO nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Using config drive#033[00m
Jan 31 03:16:01 np0005603622 nova_compute[221301]: 2026-01-31 08:16:01.966 221324 DEBUG nova.storage.rbd_utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] rbd image b49dd2dc-476a-42c6-89f3-868e238698f2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:02.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:02.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:02 np0005603622 nova_compute[221301]: 2026-01-31 08:16:02.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:02 np0005603622 nova_compute[221301]: 2026-01-31 08:16:02.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:16:02 np0005603622 nova_compute[221301]: 2026-01-31 08:16:02.533 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:16:03 np0005603622 nova_compute[221301]: 2026-01-31 08:16:03.723 221324 INFO nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Creating config drive at /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/disk.config#033[00m
Jan 31 03:16:03 np0005603622 nova_compute[221301]: 2026-01-31 08:16:03.735 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6e40poz_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:03 np0005603622 nova_compute[221301]: 2026-01-31 08:16:03.874 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6e40poz_" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:03 np0005603622 nova_compute[221301]: 2026-01-31 08:16:03.918 221324 DEBUG nova.storage.rbd_utils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] rbd image b49dd2dc-476a-42c6-89f3-868e238698f2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:16:03 np0005603622 nova_compute[221301]: 2026-01-31 08:16:03.923 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/disk.config b49dd2dc-476a-42c6-89f3-868e238698f2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:04.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:04 np0005603622 podman[256042]: 2026-01-31 08:16:04.220158629 +0000 UTC m=+0.138137493 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:16:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:04.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.449 221324 DEBUG oslo_concurrency.processutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/disk.config b49dd2dc-476a-42c6-89f3-868e238698f2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.450 221324 INFO nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Deleting local config drive /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2/disk.config because it was imported into RBD.#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.5209] manager: (tapad571df1-87): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Jan 31 03:16:04 np0005603622 kernel: tapad571df1-87: entered promiscuous mode
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.5336] manager: (tap9ff14940-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00368|binding|INFO|Claiming lport ad571df1-872d-4024-8955-346236aa8e90 for this chassis.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00369|binding|INFO|ad571df1-872d-4024-8955-346236aa8e90: Claiming fa:16:3e:06:fb:b1 10.100.0.13
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.535 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6062] manager: (tapea3924fc-85): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Jan 31 03:16:04 np0005603622 systemd-udevd[256095]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:16:04 np0005603622 kernel: tapea3924fc-85: entered promiscuous mode
Jan 31 03:16:04 np0005603622 kernel: tap9ff14940-7a: entered promiscuous mode
Jan 31 03:16:04 np0005603622 systemd-udevd[256098]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:16:04 np0005603622 systemd-udevd[256097]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6187] device (tapea3924fc-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6191] device (tapad571df1-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6195] device (tapea3924fc-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6198] device (tapad571df1-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6229] device (tap9ff14940-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6245] manager: (tap0df677ac-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6254] device (tap9ff14940-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.630 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00370|if_status|INFO|Not updating pb chassis for ea3924fc-85a0-42f4-98f9-62f431992e9a now as sb is readonly
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6381] manager: (tap2333e158-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/182)
Jan 31 03:16:04 np0005603622 kernel: tap2333e158-ce: entered promiscuous mode
Jan 31 03:16:04 np0005603622 kernel: tap0df677ac-7f: entered promiscuous mode
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6432] device (tap0df677ac-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6439] device (tap0df677ac-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.646 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6487] device (tap2333e158-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6491] device (tap2333e158-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6508] manager: (tapc1738855-97): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Jan 31 03:16:04 np0005603622 kernel: tapc1738855-97: entered promiscuous mode
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.654 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.658 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6630] device (tapc1738855-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6637] device (tapc1738855-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6657] manager: (tapdf26bd91-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Jan 31 03:16:04 np0005603622 kernel: tapdf26bd91-1f: entered promiscuous mode
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.666 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6718] device (tapdf26bd91-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.6723] device (tapdf26bd91-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00371|binding|INFO|Claiming lport c1738855-9732-40f2-88e3-e42f04270b02 for this chassis.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00372|binding|INFO|c1738855-9732-40f2-88e3-e42f04270b02: Claiming fa:16:3e:8b:4d:5a 10.2.2.100
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00373|binding|INFO|Claiming lport 0df677ac-7f81-4a86-bf52-3028b2dda46e for this chassis.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00374|binding|INFO|0df677ac-7f81-4a86-bf52-3028b2dda46e: Claiming fa:16:3e:7e:73:4e 10.1.1.70
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00375|binding|INFO|Claiming lport 9ff14940-7ad5-4fc4-852a-08f78695ba7a for this chassis.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00376|binding|INFO|9ff14940-7ad5-4fc4-852a-08f78695ba7a: Claiming fa:16:3e:1d:8f:c0 10.1.1.54
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00377|binding|INFO|Claiming lport ea3924fc-85a0-42f4-98f9-62f431992e9a for this chassis.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00378|binding|INFO|ea3924fc-85a0-42f4-98f9-62f431992e9a: Claiming fa:16:3e:09:eb:67 10.1.1.180
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00379|binding|INFO|Claiming lport df26bd91-1f94-4407-962c-1dd9ca4032d0 for this chassis.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00380|binding|INFO|df26bd91-1f94-4407-962c-1dd9ca4032d0: Claiming fa:16:3e:57:e5:27 10.2.2.200
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00381|binding|INFO|Claiming lport 2333e158-ceb0-4a1f-898b-191f9da4e5c7 for this chassis.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00382|binding|INFO|2333e158-ceb0-4a1f-898b-191f9da4e5c7: Claiming fa:16:3e:15:24:89 10.1.1.208
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.679 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:fb:b1 10.100.0.13'], port_security=['fa:16:3e:06:fb:b1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b4cc557-dcfc-4a4a-b710-9e93bc5854ac, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ad571df1-872d-4024-8955-346236aa8e90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.680 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ad571df1-872d-4024-8955-346236aa8e90 in datapath c9a825ca-0a1d-4321-b7ad-502bf00f7f45 bound to our chassis#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.682 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c9a825ca-0a1d-4321-b7ad-502bf00f7f45#033[00m
Jan 31 03:16:04 np0005603622 systemd-machined[190406]: New machine qemu-41-instance-00000060.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00383|binding|INFO|Setting lport ad571df1-872d-4024-8955-346236aa8e90 ovn-installed in OVS
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00384|binding|INFO|Setting lport ad571df1-872d-4024-8955-346236aa8e90 up in Southbound
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.689 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.694 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[179a8fb8-fe4d-4efe-a2de-ec8f474e62b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.695 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc9a825ca-01 in ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.697 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc9a825ca-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.698 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fec55132-7e0f-4e5b-aeeb-3ccfa40c01c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.699 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[06610fd7-4a53-45ac-8379-ffc142302208]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.708 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[a98eb67e-d9db-463b-bd0e-ece72398d24e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 systemd[1]: Started Virtual Machine qemu-41-instance-00000060.
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00385|binding|INFO|Setting lport 2333e158-ceb0-4a1f-898b-191f9da4e5c7 ovn-installed in OVS
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00386|binding|INFO|Setting lport df26bd91-1f94-4407-962c-1dd9ca4032d0 ovn-installed in OVS
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00387|binding|INFO|Setting lport 0df677ac-7f81-4a86-bf52-3028b2dda46e ovn-installed in OVS
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00388|binding|INFO|Setting lport ea3924fc-85a0-42f4-98f9-62f431992e9a ovn-installed in OVS
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00389|binding|INFO|Setting lport 9ff14940-7ad5-4fc4-852a-08f78695ba7a ovn-installed in OVS
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00390|binding|INFO|Setting lport c1738855-9732-40f2-88e3-e42f04270b02 ovn-installed in OVS
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.718 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.721 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b59c7e-219d-4b5f-82be-aa7d2e4d8d08]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.746 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d7327e-badb-4d73-9b58-c047e46ce54b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.750 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3154552d-830a-4283-9bb9-ee6ee99fde25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.7518] manager: (tapc9a825ca-00): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.783 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[251998fa-328b-439d-8481-3016c0d59ee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.789 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bed3ceb9-6959-46fc-baf1-06f214b37fc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.8080] device (tapc9a825ca-00): carrier: link connected
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.814 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[16f8bd45-2268-489e-90b5-a05c4f296ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.832 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2a615a87-2081-429a-b620-d4e336af04ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc9a825ca-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:4c:7f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661614, 'reachable_time': 21513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256158, 'error': None, 'target': 'ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.845 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c0a9cd-a0bd-4134-ae2e-56b6c164284a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:4c7f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661614, 'tstamp': 661614}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256159, 'error': None, 'target': 'ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.863 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d2362d-b47f-4b0f-9c12-c8c5015364f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc9a825ca-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:4c:7f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661614, 'reachable_time': 21513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256160, 'error': None, 'target': 'ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.891 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c945234d-6ed7-456a-a18a-8c6613442b90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.956 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:24:89 10.1.1.208'], port_security=['fa:16:3e:15:24:89 10.1.1.208'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.208/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2333e158-ceb0-4a1f-898b-191f9da4e5c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00391|binding|INFO|Setting lport 2333e158-ceb0-4a1f-898b-191f9da4e5c7 up in Southbound
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00392|binding|INFO|Setting lport df26bd91-1f94-4407-962c-1dd9ca4032d0 up in Southbound
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00393|binding|INFO|Setting lport 0df677ac-7f81-4a86-bf52-3028b2dda46e up in Southbound
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00394|binding|INFO|Setting lport ea3924fc-85a0-42f4-98f9-62f431992e9a up in Southbound
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00395|binding|INFO|Setting lport 9ff14940-7ad5-4fc4-852a-08f78695ba7a up in Southbound
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00396|binding|INFO|Setting lport c1738855-9732-40f2-88e3-e42f04270b02 up in Southbound
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.959 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:4d:5a 10.2.2.100'], port_security=['fa:16:3e:8b:4d:5a 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd48c619-e7d0-4c6c-96f1-6aa28a7e2333, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=c1738855-9732-40f2-88e3-e42f04270b02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.961 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:73:4e 10.1.1.70'], port_security=['fa:16:3e:7e:73:4e 10.1.1.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.70/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=0df677ac-7f81-4a86-bf52-3028b2dda46e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.963 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:8f:c0 10.1.1.54'], port_security=['fa:16:3e:1d:8f:c0 10.1.1.54'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-254413178', 'neutron:cidrs': '10.1.1.54/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-254413178', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '2', 'neutron:security_group_ids': '621f9777-60f7-4278-a85f-2b28bb26d990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=9ff14940-7ad5-4fc4-852a-08f78695ba7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.965 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:eb:67 10.1.1.180'], port_security=['fa:16:3e:09:eb:67 10.1.1.180'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1412251406', 'neutron:cidrs': '10.1.1.180/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1412251406', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '2', 'neutron:security_group_ids': '621f9777-60f7-4278-a85f-2b28bb26d990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ea3924fc-85a0-42f4-98f9-62f431992e9a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.967 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:e5:27 10.2.2.200'], port_security=['fa:16:3e:57:e5:27 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd48c619-e7d0-4c6c-96f1-6aa28a7e2333, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=df26bd91-1f94-4407-962c-1dd9ca4032d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.961 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e22783d1-b6ea-43ca-9b1a-ee7fd7322e4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.969 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9a825ca-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.969 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.970 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9a825ca-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:04 np0005603622 kernel: tapc9a825ca-00: entered promiscuous mode
Jan 31 03:16:04 np0005603622 NetworkManager[49080]: <info>  [1769847364.9733] manager: (tapc9a825ca-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.972 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.976 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc9a825ca-00, col_values=(('external_ids', {'iface-id': '50772627-ac3b-4fc1-a904-fd6001405199'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:04 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:04Z|00397|binding|INFO|Releasing lport 50772627-ac3b-4fc1-a904-fd6001405199 from this chassis (sb_readonly=1)
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.978 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 nova_compute[221301]: 2026-01-31 08:16:04.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.987 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c9a825ca-0a1d-4321-b7ad-502bf00f7f45.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c9a825ca-0a1d-4321-b7ad-502bf00f7f45.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.988 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1c79da05-7c99-49e4-8543-41c0485c3d0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.989 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-c9a825ca-0a1d-4321-b7ad-502bf00f7f45
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/c9a825ca-0a1d-4321-b7ad-502bf00f7f45.pid.haproxy
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID c9a825ca-0a1d-4321-b7ad-502bf00f7f45
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:16:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:04.991 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'env', 'PROCESS_TAG=haproxy-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c9a825ca-0a1d-4321-b7ad-502bf00f7f45.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:16:05 np0005603622 podman[256227]: 2026-01-31 08:16:05.411947558 +0000 UTC m=+0.073054014 container create ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:16:05 np0005603622 systemd[1]: Started libpod-conmon-ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d.scope.
Jan 31 03:16:05 np0005603622 podman[256227]: 2026-01-31 08:16:05.369935033 +0000 UTC m=+0.031041519 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:16:05 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:16:05 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abcf94ee27acd80af0fc310d629aef8bc894b372721bb5f055f79cdf7995969d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:16:05 np0005603622 podman[256227]: 2026-01-31 08:16:05.572795941 +0000 UTC m=+0.233902467 container init ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:16:05 np0005603622 podman[256227]: 2026-01-31 08:16:05.582432572 +0000 UTC m=+0.243539068 container start ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:16:05 np0005603622 neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256274]: [NOTICE]   (256282) : New worker (256284) forked
Jan 31 03:16:05 np0005603622 neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256274]: [NOTICE]   (256282) : Loading success.
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.648 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2333e158-ceb0-4a1f-898b-191f9da4e5c7 in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.655 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89bb50d1-4b51-4284-b511-c21064f9c3e1#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.667 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4861a0cc-1927-4a95-934b-58ce10e01e15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.669 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89bb50d1-41 in ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.671 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89bb50d1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.672 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ffaed6b1-da97-49a6-9c67-69f6fda9c7b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.673 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dee3197c-0000-4413-a703-eec58b60ee0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 nova_compute[221301]: 2026-01-31 08:16:05.674 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.688 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[582d91b4-0de5-4252-be5c-8b1556cfdfa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.700 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca520f59-269a-47cc-9a38-64a04227c3f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.737 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d220ffea-33a8-45fe-8fb0-2a2247a73c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.742 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0d1b263e-e9dc-4c84-925e-7826c8657a69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 systemd-udevd[256141]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:16:05 np0005603622 NetworkManager[49080]: <info>  [1769847365.7456] manager: (tap89bb50d1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/187)
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.776 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c9581027-6fd5-4874-9b00-5699e8fe4b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.780 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a9028082-d580-4ee0-ae07-cf5632e64944]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 NetworkManager[49080]: <info>  [1769847365.8035] device (tap89bb50d1-40): carrier: link connected
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.810 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[98cef5f1-8103-41c9-9492-0ba0e19299c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 nova_compute[221301]: 2026-01-31 08:16:05.816 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847365.8153691, b49dd2dc-476a-42c6-89f3-868e238698f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:16:05 np0005603622 nova_compute[221301]: 2026-01-31 08:16:05.817 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] VM Started (Lifecycle Event)#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.832 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c64b0e62-afb1-44d6-a326-9902c099c3df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89bb50d1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:7b:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661714, 'reachable_time': 37138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256315, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.849 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[11d7b33b-107b-40c3-bc31-0ea2df4eda0c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea5:7b3c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661714, 'tstamp': 661714}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256316, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.868 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a04f060c-d15b-437d-9e82-7ff7a67b9cf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89bb50d1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:7b:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661714, 'reachable_time': 37138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256317, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.902 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb0eca7-afa1-4667-b8f2-198c36dfa7f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.968 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0beb2428-ff1e-4002-8025-c5baf0b6b920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.970 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89bb50d1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.971 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.972 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89bb50d1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:05 np0005603622 nova_compute[221301]: 2026-01-31 08:16:05.975 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:05 np0005603622 NetworkManager[49080]: <info>  [1769847365.9763] manager: (tap89bb50d1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Jan 31 03:16:05 np0005603622 kernel: tap89bb50d1-40: entered promiscuous mode
Jan 31 03:16:05 np0005603622 nova_compute[221301]: 2026-01-31 08:16:05.979 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.980 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89bb50d1-40, col_values=(('external_ids', {'iface-id': '89f08470-513b-43e2-b94b-37e89d331b1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:05 np0005603622 nova_compute[221301]: 2026-01-31 08:16:05.982 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:05Z|00398|binding|INFO|Releasing lport 89f08470-513b-43e2-b94b-37e89d331b1d from this chassis (sb_readonly=0)
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.983 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89bb50d1-4b51-4284-b511-c21064f9c3e1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89bb50d1-4b51-4284-b511-c21064f9c3e1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.984 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[09d4c446-4b7b-4d9e-aeaf-104727861d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.985 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-89bb50d1-4b51-4284-b511-c21064f9c3e1
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/89bb50d1-4b51-4284-b511-c21064f9c3e1.pid.haproxy
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 89bb50d1-4b51-4284-b511-c21064f9c3e1
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:16:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:05.985 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'env', 'PROCESS_TAG=haproxy-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89bb50d1-4b51-4284-b511-c21064f9c3e1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:16:05 np0005603622 nova_compute[221301]: 2026-01-31 08:16:05.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.043 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.051 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847365.8157165, b49dd2dc-476a-42c6-89f3-868e238698f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.051 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:16:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:06.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.184 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.189 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:16:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:06.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:06 np0005603622 podman[256350]: 2026-01-31 08:16:06.409836319 +0000 UTC m=+0.068994064 container create fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:16:06 np0005603622 systemd[1]: Started libpod-conmon-fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd.scope.
Jan 31 03:16:06 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:16:06 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e5ee4d372ca182c6debbe6ca4f8d6a85ca00ddd82d6b2457bc85e28cdedd956/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:16:06 np0005603622 podman[256350]: 2026-01-31 08:16:06.375164833 +0000 UTC m=+0.034322678 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:16:06 np0005603622 podman[256350]: 2026-01-31 08:16:06.478859844 +0000 UTC m=+0.138017679 container init fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 03:16:06 np0005603622 podman[256350]: 2026-01-31 08:16:06.487921768 +0000 UTC m=+0.147079543 container start fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.488 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:16:06 np0005603622 neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1[256366]: [NOTICE]   (256380) : New worker (256384) forked
Jan 31 03:16:06 np0005603622 neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1[256366]: [NOTICE]   (256380) : Loading success.
Jan 31 03:16:06 np0005603622 podman[256363]: 2026-01-31 08:16:06.532259976 +0000 UTC m=+0.078875361 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.554 139793 INFO neutron.agent.ovn.metadata.agent [-] Port c1738855-9732-40f2-88e3-e42f04270b02 in datapath fcd4b12f-9cb6-45af-9713-c30c45ef51d9 unbound from our chassis#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.558 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fcd4b12f-9cb6-45af-9713-c30c45ef51d9#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.568 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ea48637e-d664-442f-8fdc-1f310b93fb92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.569 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfcd4b12f-91 in ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.572 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfcd4b12f-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.572 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74ec79d4-0429-4821-83d7-504f57c18f6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.573 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4cbc6a99-eee2-48f4-9ec1-004b16a8020b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.583 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[701410f7-ab65-4284-b784-c521ca4f787a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.632 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.637 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5eedf0a5-c151-4d44-b6d8-d13492f1f2c7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.667 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[31fa3de8-28c8-47d8-a5eb-2716b66f58b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.672 221324 DEBUG nova.compute.manager [req-43453c84-7525-49fd-8c8e-bf8f61bff590 req-a5056225-f8dd-40b3-be66-e2bc87912f41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.673 221324 DEBUG oslo_concurrency.lockutils [req-43453c84-7525-49fd-8c8e-bf8f61bff590 req-a5056225-f8dd-40b3-be66-e2bc87912f41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.673 221324 DEBUG oslo_concurrency.lockutils [req-43453c84-7525-49fd-8c8e-bf8f61bff590 req-a5056225-f8dd-40b3-be66-e2bc87912f41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.673 221324 DEBUG oslo_concurrency.lockutils [req-43453c84-7525-49fd-8c8e-bf8f61bff590 req-a5056225-f8dd-40b3-be66-e2bc87912f41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.673 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0b469c59-79ee-4e2b-a2d1-2ab36f02b274]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 NetworkManager[49080]: <info>  [1769847366.6754] manager: (tapfcd4b12f-90): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.674 221324 DEBUG nova.compute.manager [req-43453c84-7525-49fd-8c8e-bf8f61bff590 req-a5056225-f8dd-40b3-be66-e2bc87912f41 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Processing event network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.716 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[344b2839-b0ab-4873-b8fb-b91b3e7696dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.721 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1e944f-ab26-46dc-a2e5-a9ba238ccafc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 NetworkManager[49080]: <info>  [1769847366.7489] device (tapfcd4b12f-90): carrier: link connected
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.757 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[551968b2-9da2-49bc-9f11-64f6f42a28b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.780 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1425fc-7650-4aed-83e3-e4defe5127fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfcd4b12f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:b7:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661808, 'reachable_time': 22675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256407, 'error': None, 'target': 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.797 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc3e5d3-a7f3-40ca-8d65-6156f26a830e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:b766'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661808, 'tstamp': 661808}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256408, 'error': None, 'target': 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.818 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d5430aeb-1343-4879-a583-caad6b6345a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfcd4b12f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:b7:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661808, 'reachable_time': 22675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256409, 'error': None, 'target': 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.853 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4d6df9-f64a-4905-9181-39877d69c573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.863 221324 DEBUG nova.compute.manager [req-2b7c82e0-5fae-42f0-93f5-d5bdca151455 req-7bf00417-3913-4d04-aa5c-6a8ef3b3da18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.864 221324 DEBUG oslo_concurrency.lockutils [req-2b7c82e0-5fae-42f0-93f5-d5bdca151455 req-7bf00417-3913-4d04-aa5c-6a8ef3b3da18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.864 221324 DEBUG oslo_concurrency.lockutils [req-2b7c82e0-5fae-42f0-93f5-d5bdca151455 req-7bf00417-3913-4d04-aa5c-6a8ef3b3da18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.865 221324 DEBUG oslo_concurrency.lockutils [req-2b7c82e0-5fae-42f0-93f5-d5bdca151455 req-7bf00417-3913-4d04-aa5c-6a8ef3b3da18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.865 221324 DEBUG nova.compute.manager [req-2b7c82e0-5fae-42f0-93f5-d5bdca151455 req-7bf00417-3913-4d04-aa5c-6a8ef3b3da18 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Processing event network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.922 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7119eb35-665d-43e3-93c0-5a55af6cb693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.923 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfcd4b12f-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.924 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.925 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfcd4b12f-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:06 np0005603622 kernel: tapfcd4b12f-90: entered promiscuous mode
Jan 31 03:16:06 np0005603622 NetworkManager[49080]: <info>  [1769847366.9296] manager: (tapfcd4b12f-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.932 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.936 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfcd4b12f-90, col_values=(('external_ids', {'iface-id': '062e4b19-1435-45a4-9a04-775059a336eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.938 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:06 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:06Z|00399|binding|INFO|Releasing lport 062e4b19-1435-45a4-9a04-775059a336eb from this chassis (sb_readonly=0)
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.939 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.940 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fcd4b12f-9cb6-45af-9713-c30c45ef51d9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fcd4b12f-9cb6-45af-9713-c30c45ef51d9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.941 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[09c7a4d8-9825-4002-a854-b6fe294b5a2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.942 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-fcd4b12f-9cb6-45af-9713-c30c45ef51d9
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/fcd4b12f-9cb6-45af-9713-c30c45ef51d9.pid.haproxy
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID fcd4b12f-9cb6-45af-9713-c30c45ef51d9
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:16:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:06.943 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'env', 'PROCESS_TAG=haproxy-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fcd4b12f-9cb6-45af-9713-c30c45ef51d9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:16:06 np0005603622 nova_compute[221301]: 2026-01-31 08:16:06.948 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:07 np0005603622 podman[256441]: 2026-01-31 08:16:07.357336901 +0000 UTC m=+0.064174235 container create 59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:16:07 np0005603622 systemd[1]: Started libpod-conmon-59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94.scope.
Jan 31 03:16:07 np0005603622 podman[256441]: 2026-01-31 08:16:07.322392467 +0000 UTC m=+0.029229801 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:16:07 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:16:07 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1927e96672e84e510cb720652144f81aebe83d2c29b17749e8eb251b5f923517/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:16:07 np0005603622 podman[256441]: 2026-01-31 08:16:07.454221957 +0000 UTC m=+0.161059291 container init 59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:16:07 np0005603622 podman[256441]: 2026-01-31 08:16:07.46319174 +0000 UTC m=+0.170029094 container start 59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:16:07 np0005603622 neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9[256457]: [NOTICE]   (256461) : New worker (256463) forked
Jan 31 03:16:07 np0005603622 neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9[256457]: [NOTICE]   (256461) : Loading success.
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.541 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 0df677ac-7f81-4a86-bf52-3028b2dda46e in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.546 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89bb50d1-4b51-4284-b511-c21064f9c3e1#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.558 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[af31c769-9e91-4181-8e67-088aeaa45e02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.589 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6a44a102-70be-453b-8244-9658c43cc5d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.596 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ac342cc2-e162-4115-b95f-abc0e9c09d05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.621 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[de5b4e52-2df8-4f87-92ba-821d70f75d35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.639 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf04739-f84e-4cde-bf81-202aa377a0aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89bb50d1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:7b:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 5, 'tx_packets': 6, 'rx_bytes': 442, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 5, 'tx_packets': 6, 'rx_bytes': 442, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661714, 'reachable_time': 37138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 5, 'inoctets': 372, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 5, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 372, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 5, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256477, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.655 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[98c12d38-733c-48d4-bc82-e6bb882feb46]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap89bb50d1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661726, 'tstamp': 661726}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256478, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tap89bb50d1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661730, 'tstamp': 661730}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256478, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.657 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89bb50d1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 nova_compute[221301]: 2026-01-31 08:16:07.698 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603622 nova_compute[221301]: 2026-01-31 08:16:07.700 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.702 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89bb50d1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.703 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.704 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89bb50d1-40, col_values=(('external_ids', {'iface-id': '89f08470-513b-43e2-b94b-37e89d331b1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.705 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.707 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 9ff14940-7ad5-4fc4-852a-08f78695ba7a in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.712 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89bb50d1-4b51-4284-b511-c21064f9c3e1#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.731 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9e704fb9-021b-426d-b433-4b4468f74d47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.760 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[178b26df-4eca-4e35-8859-0abf00a4d78d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.763 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3a4257ba-5152-44e8-b32e-af63c6987bef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.789 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3299126e-9d56-43d1-8868-6a30e5309d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.804 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[626df88e-786a-479b-974b-cf60559feee9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89bb50d1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:7b:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 5, 'tx_packets': 8, 'rx_bytes': 442, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 5, 'tx_packets': 8, 'rx_bytes': 442, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661714, 'reachable_time': 37138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 5, 'inoctets': 372, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 5, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 372, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 5, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256484, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.825 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[82f0a0e9-ce84-4aa2-acfa-31d703d05aee]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap89bb50d1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661726, 'tstamp': 661726}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256485, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tap89bb50d1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661730, 'tstamp': 661730}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256485, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.827 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89bb50d1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 nova_compute[221301]: 2026-01-31 08:16:07.829 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603622 nova_compute[221301]: 2026-01-31 08:16:07.831 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.832 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89bb50d1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.832 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.832 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89bb50d1-40, col_values=(('external_ids', {'iface-id': '89f08470-513b-43e2-b94b-37e89d331b1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.833 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.834 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ea3924fc-85a0-42f4-98f9-62f431992e9a in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.836 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89bb50d1-4b51-4284-b511-c21064f9c3e1#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.856 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f0af3dec-b8ed-46fd-9ce3-d0a60228bbf1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.878 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a41e3688-53d9-4787-9984-bc654ac2bd77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.882 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f98d58ce-5529-43b9-b1f2-ed3ba16e0527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.900 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[dd213f7a-a7f8-4d3b-bd4f-3bb15c1a6bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.917 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1f7ff9d1-04df-4786-ba6d-07e55e423699]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89bb50d1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:7b:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 532, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 532, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661714, 'reachable_time': 37138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256491, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.932 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e0feee83-2f3e-4ba1-9627-f2e90c08aa9a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap89bb50d1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661726, 'tstamp': 661726}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256492, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tap89bb50d1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661730, 'tstamp': 661730}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256492, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.934 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89bb50d1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 nova_compute[221301]: 2026-01-31 08:16:07.936 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603622 nova_compute[221301]: 2026-01-31 08:16:07.938 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.939 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89bb50d1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.939 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.939 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89bb50d1-40, col_values=(('external_ids', {'iface-id': '89f08470-513b-43e2-b94b-37e89d331b1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.940 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.941 139793 INFO neutron.agent.ovn.metadata.agent [-] Port df26bd91-1f94-4407-962c-1dd9ca4032d0 in datapath fcd4b12f-9cb6-45af-9713-c30c45ef51d9 unbound from our chassis#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.943 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fcd4b12f-9cb6-45af-9713-c30c45ef51d9#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.957 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7570abef-24bb-475d-a405-afb38ecb6baa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.986 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[04088afe-fd2a-4079-85b5-8e06f6072a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:07.990 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7087b6bc-819f-42b5-aa53-5ae713fe1eba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.021 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[983b5b0e-a96d-4918-8fd5-6c70a6fbf72d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.040 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cb24a38c-3e39-42fd-9c81-1949e7adb41c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfcd4b12f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:b7:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661808, 'reachable_time': 22675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256498, 'error': None, 'target': 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.063 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[27de0690-fdd5-4d28-bbb0-c2e3a8c0bfb7]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.2.2.2'], ['IFA_LOCAL', '10.2.2.2'], ['IFA_BROADCAST', '10.2.2.255'], ['IFA_LABEL', 'tapfcd4b12f-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661822, 'tstamp': 661822}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256499, 'error': None, 'target': 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfcd4b12f-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661825, 'tstamp': 661825}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256499, 'error': None, 'target': 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.066 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfcd4b12f-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:08 np0005603622 nova_compute[221301]: 2026-01-31 08:16:08.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:08 np0005603622 nova_compute[221301]: 2026-01-31 08:16:08.070 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.070 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfcd4b12f-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.071 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.071 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfcd4b12f-90, col_values=(('external_ids', {'iface-id': '062e4b19-1435-45a4-9a04-775059a336eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:08.072 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:16:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:08.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:08.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.011 221324 DEBUG nova.compute.manager [req-8d2675dc-e622-4502-8cd5-d432bed574c6 req-f53923c1-289a-45e8-9b66-1b8c39edd25a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.012 221324 DEBUG oslo_concurrency.lockutils [req-8d2675dc-e622-4502-8cd5-d432bed574c6 req-f53923c1-289a-45e8-9b66-1b8c39edd25a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.012 221324 DEBUG oslo_concurrency.lockutils [req-8d2675dc-e622-4502-8cd5-d432bed574c6 req-f53923c1-289a-45e8-9b66-1b8c39edd25a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.012 221324 DEBUG oslo_concurrency.lockutils [req-8d2675dc-e622-4502-8cd5-d432bed574c6 req-f53923c1-289a-45e8-9b66-1b8c39edd25a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.013 221324 DEBUG nova.compute.manager [req-8d2675dc-e622-4502-8cd5-d432bed574c6 req-f53923c1-289a-45e8-9b66-1b8c39edd25a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No event matching network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 in dict_keys([('network-vif-plugged', 'ad571df1-872d-4024-8955-346236aa8e90'), ('network-vif-plugged', '9ff14940-7ad5-4fc4-852a-08f78695ba7a'), ('network-vif-plugged', 'ea3924fc-85a0-42f4-98f9-62f431992e9a'), ('network-vif-plugged', '0df677ac-7f81-4a86-bf52-3028b2dda46e'), ('network-vif-plugged', 'c1738855-9732-40f2-88e3-e42f04270b02')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.013 221324 WARNING nova.compute.manager [req-8d2675dc-e622-4502-8cd5-d432bed574c6 req-f53923c1-289a-45e8-9b66-1b8c39edd25a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.226 221324 DEBUG nova.compute.manager [req-ca492576-bd07-40eb-b633-d33f594a406e req-4748efec-a585-455f-84ff-5e5a185838c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.226 221324 DEBUG oslo_concurrency.lockutils [req-ca492576-bd07-40eb-b633-d33f594a406e req-4748efec-a585-455f-84ff-5e5a185838c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.227 221324 DEBUG oslo_concurrency.lockutils [req-ca492576-bd07-40eb-b633-d33f594a406e req-4748efec-a585-455f-84ff-5e5a185838c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.227 221324 DEBUG oslo_concurrency.lockutils [req-ca492576-bd07-40eb-b633-d33f594a406e req-4748efec-a585-455f-84ff-5e5a185838c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.228 221324 DEBUG nova.compute.manager [req-ca492576-bd07-40eb-b633-d33f594a406e req-4748efec-a585-455f-84ff-5e5a185838c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No event matching network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 in dict_keys([('network-vif-plugged', 'ad571df1-872d-4024-8955-346236aa8e90'), ('network-vif-plugged', '9ff14940-7ad5-4fc4-852a-08f78695ba7a'), ('network-vif-plugged', 'ea3924fc-85a0-42f4-98f9-62f431992e9a'), ('network-vif-plugged', '0df677ac-7f81-4a86-bf52-3028b2dda46e'), ('network-vif-plugged', 'c1738855-9732-40f2-88e3-e42f04270b02')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:16:09 np0005603622 nova_compute[221301]: 2026-01-31 08:16:09.228 221324 WARNING nova.compute.manager [req-ca492576-bd07-40eb-b633-d33f594a406e req-4748efec-a585-455f-84ff-5e5a185838c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:16:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:10.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:10.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:10 np0005603622 nova_compute[221301]: 2026-01-31 08:16:10.682 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:11 np0005603622 nova_compute[221301]: 2026-01-31 08:16:11.635 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:12.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:12.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:14.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:14.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:15 np0005603622 nova_compute[221301]: 2026-01-31 08:16:15.068 221324 DEBUG nova.compute.manager [req-7c8b8e6d-f3e4-40c9-be59-18f7213abd28 req-51fdb304-be52-44b3-8860-d2fe0a7080fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:15 np0005603622 nova_compute[221301]: 2026-01-31 08:16:15.068 221324 DEBUG oslo_concurrency.lockutils [req-7c8b8e6d-f3e4-40c9-be59-18f7213abd28 req-51fdb304-be52-44b3-8860-d2fe0a7080fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:15 np0005603622 nova_compute[221301]: 2026-01-31 08:16:15.069 221324 DEBUG oslo_concurrency.lockutils [req-7c8b8e6d-f3e4-40c9-be59-18f7213abd28 req-51fdb304-be52-44b3-8860-d2fe0a7080fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:15 np0005603622 nova_compute[221301]: 2026-01-31 08:16:15.069 221324 DEBUG oslo_concurrency.lockutils [req-7c8b8e6d-f3e4-40c9-be59-18f7213abd28 req-51fdb304-be52-44b3-8860-d2fe0a7080fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:15 np0005603622 nova_compute[221301]: 2026-01-31 08:16:15.070 221324 DEBUG nova.compute.manager [req-7c8b8e6d-f3e4-40c9-be59-18f7213abd28 req-51fdb304-be52-44b3-8860-d2fe0a7080fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Processing event network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:15 np0005603622 nova_compute[221301]: 2026-01-31 08:16:15.711 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:16:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:16.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:16:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:16.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:16 np0005603622 nova_compute[221301]: 2026-01-31 08:16:16.638 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.537 221324 DEBUG nova.compute.manager [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.538 221324 DEBUG oslo_concurrency.lockutils [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.538 221324 DEBUG oslo_concurrency.lockutils [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.539 221324 DEBUG oslo_concurrency.lockutils [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.539 221324 DEBUG nova.compute.manager [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No event matching network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 in dict_keys([('network-vif-plugged', '9ff14940-7ad5-4fc4-852a-08f78695ba7a'), ('network-vif-plugged', 'ea3924fc-85a0-42f4-98f9-62f431992e9a'), ('network-vif-plugged', '0df677ac-7f81-4a86-bf52-3028b2dda46e'), ('network-vif-plugged', 'c1738855-9732-40f2-88e3-e42f04270b02')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.540 221324 WARNING nova.compute.manager [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.541 221324 DEBUG nova.compute.manager [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.541 221324 DEBUG oslo_concurrency.lockutils [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.542 221324 DEBUG oslo_concurrency.lockutils [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.542 221324 DEBUG oslo_concurrency.lockutils [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:17 np0005603622 nova_compute[221301]: 2026-01-31 08:16:17.543 221324 DEBUG nova.compute.manager [req-3e760f66-4af4-4628-8cfe-81db79d756f7 req-1883d296-9a32-4562-863f-62b8416659b2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Processing event network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:18.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:18.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:20.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:20.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:20 np0005603622 nova_compute[221301]: 2026-01-31 08:16:20.782 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:21 np0005603622 nova_compute[221301]: 2026-01-31 08:16:21.641 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:21.661 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:16:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:21.661 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:16:21 np0005603622 nova_compute[221301]: 2026-01-31 08:16:21.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:21.662 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:16:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:22.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:22.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:24.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:24.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:25 np0005603622 nova_compute[221301]: 2026-01-31 08:16:25.786 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:26.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:26.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:26 np0005603622 nova_compute[221301]: 2026-01-31 08:16:26.643 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:28.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:28.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:16:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:16:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:16:29 np0005603622 nova_compute[221301]: 2026-01-31 08:16:29.534 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:30.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:30.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:30 np0005603622 nova_compute[221301]: 2026-01-31 08:16:30.822 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:30.869 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:30.870 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:16:30.871 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.415 221324 DEBUG nova.compute.manager [req-ed84e6f4-df15-44fa-9714-215d5ddf3a45 req-77206c8f-7dba-4111-943c-f95a0118d4b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.416 221324 DEBUG oslo_concurrency.lockutils [req-ed84e6f4-df15-44fa-9714-215d5ddf3a45 req-77206c8f-7dba-4111-943c-f95a0118d4b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.416 221324 DEBUG oslo_concurrency.lockutils [req-ed84e6f4-df15-44fa-9714-215d5ddf3a45 req-77206c8f-7dba-4111-943c-f95a0118d4b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.417 221324 DEBUG oslo_concurrency.lockutils [req-ed84e6f4-df15-44fa-9714-215d5ddf3a45 req-77206c8f-7dba-4111-943c-f95a0118d4b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.417 221324 DEBUG nova.compute.manager [req-ed84e6f4-df15-44fa-9714-215d5ddf3a45 req-77206c8f-7dba-4111-943c-f95a0118d4b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No event matching network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 in dict_keys([('network-vif-plugged', '9ff14940-7ad5-4fc4-852a-08f78695ba7a'), ('network-vif-plugged', 'ea3924fc-85a0-42f4-98f9-62f431992e9a'), ('network-vif-plugged', '0df677ac-7f81-4a86-bf52-3028b2dda46e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.417 221324 WARNING nova.compute.manager [req-ed84e6f4-df15-44fa-9714-215d5ddf3a45 req-77206c8f-7dba-4111-943c-f95a0118d4b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:16:31 np0005603622 nova_compute[221301]: 2026-01-31 08:16:31.646 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:32.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:32.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:34.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:34 np0005603622 nova_compute[221301]: 2026-01-31 08:16:34.362 221324 DEBUG nova.compute.manager [req-3e067096-3947-4a5a-86dc-cd0a7580bc76 req-063d347a-e039-49a0-841b-146f3e0b8a74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:34 np0005603622 nova_compute[221301]: 2026-01-31 08:16:34.363 221324 DEBUG oslo_concurrency.lockutils [req-3e067096-3947-4a5a-86dc-cd0a7580bc76 req-063d347a-e039-49a0-841b-146f3e0b8a74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:34 np0005603622 nova_compute[221301]: 2026-01-31 08:16:34.364 221324 DEBUG oslo_concurrency.lockutils [req-3e067096-3947-4a5a-86dc-cd0a7580bc76 req-063d347a-e039-49a0-841b-146f3e0b8a74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:34 np0005603622 nova_compute[221301]: 2026-01-31 08:16:34.364 221324 DEBUG oslo_concurrency.lockutils [req-3e067096-3947-4a5a-86dc-cd0a7580bc76 req-063d347a-e039-49a0-841b-146f3e0b8a74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:34 np0005603622 nova_compute[221301]: 2026-01-31 08:16:34.365 221324 DEBUG nova.compute.manager [req-3e067096-3947-4a5a-86dc-cd0a7580bc76 req-063d347a-e039-49a0-841b-146f3e0b8a74 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Processing event network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:34.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:16:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:16:35 np0005603622 podman[256682]: 2026-01-31 08:16:35.206503755 +0000 UTC m=+0.129611352 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 31 03:16:35 np0005603622 nova_compute[221301]: 2026-01-31 08:16:35.881 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:36.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:36 np0005603622 nova_compute[221301]: 2026-01-31 08:16:36.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:36 np0005603622 nova_compute[221301]: 2026-01-31 08:16:36.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:36.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:36 np0005603622 nova_compute[221301]: 2026-01-31 08:16:36.648 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:37 np0005603622 podman[256710]: 2026-01-31 08:16:37.146538674 +0000 UTC m=+0.070144265 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:16:37 np0005603622 nova_compute[221301]: 2026-01-31 08:16:37.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:37 np0005603622 nova_compute[221301]: 2026-01-31 08:16:37.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:16:37 np0005603622 nova_compute[221301]: 2026-01-31 08:16:37.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:16:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:38.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:38.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:40.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.280 221324 DEBUG nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.280 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.281 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.281 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.281 221324 DEBUG nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No event matching network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a in dict_keys([('network-vif-plugged', '9ff14940-7ad5-4fc4-852a-08f78695ba7a'), ('network-vif-plugged', '0df677ac-7f81-4a86-bf52-3028b2dda46e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.281 221324 WARNING nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.282 221324 DEBUG nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.282 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.282 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.282 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.283 221324 DEBUG nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Processing event network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.283 221324 DEBUG nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.283 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.283 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.283 221324 DEBUG oslo_concurrency.lockutils [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.284 221324 DEBUG nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No event matching network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e in dict_keys([('network-vif-plugged', '9ff14940-7ad5-4fc4-852a-08f78695ba7a')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.284 221324 WARNING nova.compute.manager [req-4ae669c4-d5c9-4019-83f3-886c2af1a773 req-b23430a3-4f8e-4c9b-aebf-3178ec9abcc7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.307 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.307 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.308 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.308 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.308 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:40.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.497 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.498 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.498 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.498 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.499 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:40 np0005603622 nova_compute[221301]: 2026-01-31 08:16:40.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:16:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3687362386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:16:41 np0005603622 nova_compute[221301]: 2026-01-31 08:16:41.087 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:41 np0005603622 nova_compute[221301]: 2026-01-31 08:16:41.693 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.058 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000060 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.058 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000060 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.058 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000060 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.058 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000060 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:16:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:42.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.257 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.258 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4554MB free_disk=20.96734619140625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.258 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.259 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:42.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.703 221324 DEBUG nova.compute.manager [req-114e83e0-efaf-4364-96e6-bd60bb797539 req-3d5d00cb-bc1f-4dcc-9f45-dc046b6cc07e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.704 221324 DEBUG oslo_concurrency.lockutils [req-114e83e0-efaf-4364-96e6-bd60bb797539 req-3d5d00cb-bc1f-4dcc-9f45-dc046b6cc07e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.705 221324 DEBUG oslo_concurrency.lockutils [req-114e83e0-efaf-4364-96e6-bd60bb797539 req-3d5d00cb-bc1f-4dcc-9f45-dc046b6cc07e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.706 221324 DEBUG oslo_concurrency.lockutils [req-114e83e0-efaf-4364-96e6-bd60bb797539 req-3d5d00cb-bc1f-4dcc-9f45-dc046b6cc07e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.706 221324 DEBUG nova.compute.manager [req-114e83e0-efaf-4364-96e6-bd60bb797539 req-3d5d00cb-bc1f-4dcc-9f45-dc046b6cc07e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Processing event network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.708 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance event wait completed in 36 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.714 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847402.7146502, b49dd2dc-476a-42c6-89f3-868e238698f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.715 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.718 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.721 221324 INFO nova.virt.libvirt.driver [-] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance spawned successfully.#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.721 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.775 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance b49dd2dc-476a-42c6-89f3-868e238698f2 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.776 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.776 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.888 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.897 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.902 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.902 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.903 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.903 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.904 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:42 np0005603622 nova_compute[221301]: 2026-01-31 08:16:42.904 221324 DEBUG nova.virt.libvirt.driver [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.073 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.389 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.531 221324 INFO nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Took 81.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.533 221324 DEBUG nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.690 221324 INFO nova.compute.manager [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Took 95.71 seconds to build instance.#033[00m
Jan 31 03:16:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:16:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/817834152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.888 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.895 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:16:43 np0005603622 nova_compute[221301]: 2026-01-31 08:16:43.984 221324 DEBUG oslo_concurrency.lockutils [None req-f1e9f874-68a2-456f-9e00-cfa97b82cb94 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 97.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:44.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:44 np0005603622 nova_compute[221301]: 2026-01-31 08:16:44.273 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:16:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:44.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:44 np0005603622 nova_compute[221301]: 2026-01-31 08:16:44.575 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:16:44 np0005603622 nova_compute[221301]: 2026-01-31 08:16:44.576 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:44 np0005603622 nova_compute[221301]: 2026-01-31 08:16:44.663 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:44 np0005603622 nova_compute[221301]: 2026-01-31 08:16:44.814 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:16:45 np0005603622 nova_compute[221301]: 2026-01-31 08:16:45.938 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:46.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:46.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:46 np0005603622 nova_compute[221301]: 2026-01-31 08:16:46.496 221324 DEBUG nova.compute.manager [req-10d6807a-b0fb-4c0e-9f3e-8428b36edf1d req-2c10c01f-2f2f-4d59-8e3d-11f268411e72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:46 np0005603622 nova_compute[221301]: 2026-01-31 08:16:46.497 221324 DEBUG oslo_concurrency.lockutils [req-10d6807a-b0fb-4c0e-9f3e-8428b36edf1d req-2c10c01f-2f2f-4d59-8e3d-11f268411e72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:16:46 np0005603622 nova_compute[221301]: 2026-01-31 08:16:46.497 221324 DEBUG oslo_concurrency.lockutils [req-10d6807a-b0fb-4c0e-9f3e-8428b36edf1d req-2c10c01f-2f2f-4d59-8e3d-11f268411e72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:16:46 np0005603622 nova_compute[221301]: 2026-01-31 08:16:46.498 221324 DEBUG oslo_concurrency.lockutils [req-10d6807a-b0fb-4c0e-9f3e-8428b36edf1d req-2c10c01f-2f2f-4d59-8e3d-11f268411e72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:16:46 np0005603622 nova_compute[221301]: 2026-01-31 08:16:46.499 221324 DEBUG nova.compute.manager [req-10d6807a-b0fb-4c0e-9f3e-8428b36edf1d req-2c10c01f-2f2f-4d59-8e3d-11f268411e72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:16:46 np0005603622 nova_compute[221301]: 2026-01-31 08:16:46.499 221324 WARNING nova.compute.manager [req-10d6807a-b0fb-4c0e-9f3e-8428b36edf1d req-2c10c01f-2f2f-4d59-8e3d-11f268411e72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a for instance with vm_state active and task_state None.#033[00m
Jan 31 03:16:46 np0005603622 nova_compute[221301]: 2026-01-31 08:16:46.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:48.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:48.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:49 np0005603622 nova_compute[221301]: 2026-01-31 08:16:49.428 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:49 np0005603622 NetworkManager[49080]: <info>  [1769847409.4307] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 31 03:16:49 np0005603622 NetworkManager[49080]: <info>  [1769847409.4318] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Jan 31 03:16:49 np0005603622 nova_compute[221301]: 2026-01-31 08:16:49.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:49Z|00400|binding|INFO|Releasing lport 062e4b19-1435-45a4-9a04-775059a336eb from this chassis (sb_readonly=0)
Jan 31 03:16:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:49Z|00401|binding|INFO|Releasing lport 89f08470-513b-43e2-b94b-37e89d331b1d from this chassis (sb_readonly=0)
Jan 31 03:16:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:49Z|00402|binding|INFO|Releasing lport 50772627-ac3b-4fc1-a904-fd6001405199 from this chassis (sb_readonly=0)
Jan 31 03:16:49 np0005603622 nova_compute[221301]: 2026-01-31 08:16:49.496 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:50.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:50.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:50 np0005603622 nova_compute[221301]: 2026-01-31 08:16:50.941 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:51 np0005603622 nova_compute[221301]: 2026-01-31 08:16:51.700 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:52.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:52 np0005603622 nova_compute[221301]: 2026-01-31 08:16:52.240 221324 DEBUG nova.compute.manager [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-changed-ad571df1-872d-4024-8955-346236aa8e90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:16:52 np0005603622 nova_compute[221301]: 2026-01-31 08:16:52.241 221324 DEBUG nova.compute.manager [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing instance network info cache due to event network-changed-ad571df1-872d-4024-8955-346236aa8e90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:16:52 np0005603622 nova_compute[221301]: 2026-01-31 08:16:52.241 221324 DEBUG oslo_concurrency.lockutils [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:16:52 np0005603622 nova_compute[221301]: 2026-01-31 08:16:52.241 221324 DEBUG oslo_concurrency.lockutils [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:16:52 np0005603622 nova_compute[221301]: 2026-01-31 08:16:52.241 221324 DEBUG nova.network.neutron [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Refreshing network info cache for port ad571df1-872d-4024-8955-346236aa8e90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:16:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:52.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:53Z|00403|memory|INFO|peak resident set size grew 51% in last 2703.9 seconds, from 16512 kB to 24892 kB
Jan 31 03:16:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:53Z|00404|memory|INFO|idl-cells-OVN_Southbound:10710 idl-cells-Open_vSwitch:1326 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:2 lflow-cache-entries-cache-expr:352 lflow-cache-entries-cache-matches:283 lflow-cache-size-KB:1518 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:711 ofctrl_installed_flow_usage-KB:517 ofctrl_sb_flow_ref_usage-KB:264
Jan 31 03:16:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:54.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:54.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:54 np0005603622 nova_compute[221301]: 2026-01-31 08:16:54.655 221324 DEBUG nova.network.neutron [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updated VIF entry in instance network info cache for port ad571df1-872d-4024-8955-346236aa8e90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:16:54 np0005603622 nova_compute[221301]: 2026-01-31 08:16:54.656 221324 DEBUG nova.network.neutron [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [{"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:16:55 np0005603622 nova_compute[221301]: 2026-01-31 08:16:55.944 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:56 np0005603622 nova_compute[221301]: 2026-01-31 08:16:56.040 221324 DEBUG oslo_concurrency.lockutils [req-5234f1c2-a858-4755-bb71-9d8fd4b95d88 req-bc60f16a-9738-4f1a-9509-729ec3faad73 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b49dd2dc-476a-42c6-89f3-868e238698f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:16:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:16:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:56.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:16:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:56.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:56 np0005603622 nova_compute[221301]: 2026-01-31 08:16:56.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:16:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:16:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:16:58.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:16:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:58Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7e:73:4e 10.1.1.70
Jan 31 03:16:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:58Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7e:73:4e 10.1.1.70
Jan 31 03:16:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:58Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:4d:5a 10.2.2.100
Jan 31 03:16:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:58Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:4d:5a 10.2.2.100
Jan 31 03:16:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:16:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:16:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:16:58.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:16:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:58Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:24:89 10.1.1.208
Jan 31 03:16:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:58Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:24:89 10.1.1.208
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1d:8f:c0 10.1.1.54
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1d:8f:c0 10.1.1.54
Jan 31 03:16:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:eb:67 10.1.1.180
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:eb:67 10.1.1.180
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:e5:27 10.2.2.200
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:e5:27 10.2.2.200
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:06:fb:b1 10.100.0.13
Jan 31 03:16:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:16:59Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:06:fb:b1 10.100.0.13
Jan 31 03:17:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:00.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:00.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:00 np0005603622 nova_compute[221301]: 2026-01-31 08:17:00.985 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:01 np0005603622 nova_compute[221301]: 2026-01-31 08:17:01.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:02.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:02.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:03 np0005603622 nova_compute[221301]: 2026-01-31 08:17:03.396 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:03 np0005603622 nova_compute[221301]: 2026-01-31 08:17:03.398 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:03 np0005603622 nova_compute[221301]: 2026-01-31 08:17:03.643 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:17:04 np0005603622 nova_compute[221301]: 2026-01-31 08:17:04.051 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:04 np0005603622 nova_compute[221301]: 2026-01-31 08:17:04.052 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:04 np0005603622 nova_compute[221301]: 2026-01-31 08:17:04.063 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:17:04 np0005603622 nova_compute[221301]: 2026-01-31 08:17:04.064 221324 INFO nova.compute.claims [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:17:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:04.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:04.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:04 np0005603622 nova_compute[221301]: 2026-01-31 08:17:04.960 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1413775207' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.429 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.438 221324 DEBUG nova.compute.provider_tree [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.503 221324 DEBUG nova.scheduler.client.report [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.637 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.638 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.856 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.857 221324 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.977 221324 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:17:05 np0005603622 nova_compute[221301]: 2026-01-31 08:17:05.989 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.119 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:17:06 np0005603622 podman[256799]: 2026-01-31 08:17:06.187579255 +0000 UTC m=+0.108973654 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:17:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:06.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:06.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.500 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.502 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.503 221324 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Creating image(s)#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.544 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.583 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.627 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.633 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.659 221324 DEBUG nova.policy [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8db5a8acb6d04c988f9dd4f74380c487', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eafe22d6cfcb41d4b31597087498a565', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.700 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.702 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.703 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.703 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.755 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.844 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:06 np0005603622 nova_compute[221301]: 2026-01-31 08:17:06.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.224 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.381s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.317 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] resizing rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.526 221324 DEBUG nova.objects.instance [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lazy-loading 'migration_context' on Instance uuid 82cdb8e3-12f4-4363-948c-8d9b88d4cced obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.653 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.654 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Ensure instance console log exists: /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.654 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.654 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:07 np0005603622 nova_compute[221301]: 2026-01-31 08:17:07.655 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:08 np0005603622 podman[256993]: 2026-01-31 08:17:08.136335559 +0000 UTC m=+0.064273887 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:17:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:08.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:08.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:08 np0005603622 nova_compute[221301]: 2026-01-31 08:17:08.687 221324 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Successfully created port: 61657494-08d3-460a-bc24-a6856464d636 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:17:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:09.265 140048 DEBUG eventlet.wsgi.server [-] (140048) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:09.267 140048 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: Accept: */*#015
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: Connection: close#015
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: Content-Type: text/plain#015
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: Host: 169.254.169.254#015
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: User-Agent: curl/7.84.0#015
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: X-Forwarded-For: 10.100.0.13#015
Jan 31 03:17:09 np0005603622 ovn_metadata_agent[139788]: X-Ovn-Network-Id: c9a825ca-0a1d-4321-b7ad-502bf00f7f45 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 31 03:17:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:10.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:10.429 140048 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 31 03:17:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:10.430 140048 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 2548 time: 1.1632318#033[00m
Jan 31 03:17:10 np0005603622 haproxy-metadata-proxy-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256284]: 10.100.0.13:35606 [31/Jan/2026:08:17:09.263] listener listener/metadata 0/0/0/1166/1166 200 2532 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 31 03:17:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:10.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:10 np0005603622 nova_compute[221301]: 2026-01-31 08:17:10.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:11 np0005603622 nova_compute[221301]: 2026-01-31 08:17:11.876 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.082 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.083 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.084 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.084 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.085 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.087 221324 INFO nova.compute.manager [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Terminating instance#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.090 221324 DEBUG nova.compute.manager [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:17:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:12.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:12.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:12 np0005603622 kernel: tapad571df1-87 (unregistering): left promiscuous mode
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.5057] device (tapad571df1-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00405|binding|INFO|Releasing lport ad571df1-872d-4024-8955-346236aa8e90 from this chassis (sb_readonly=0)
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.519 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00406|binding|INFO|Setting lport ad571df1-872d-4024-8955-346236aa8e90 down in Southbound
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00407|binding|INFO|Removing iface tapad571df1-87 ovn-installed in OVS
Jan 31 03:17:12 np0005603622 kernel: tap9ff14940-7a (unregistering): left promiscuous mode
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.5293] device (tap9ff14940-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00408|binding|INFO|Releasing lport 9ff14940-7ad5-4fc4-852a-08f78695ba7a from this chassis (sb_readonly=1)
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00409|binding|INFO|Removing iface tap9ff14940-7a ovn-installed in OVS
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00410|if_status|INFO|Dropped 2 log messages in last 254 seconds (most recently, 254 seconds ago) due to excessive rate
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00411|if_status|INFO|Not setting lport 9ff14940-7ad5-4fc4-852a-08f78695ba7a down as sb is readonly
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.547 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.550 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 kernel: tapea3924fc-85 (unregistering): left promiscuous mode
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.5616] device (tapea3924fc-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.564 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00412|binding|INFO|Releasing lport ea3924fc-85a0-42f4-98f9-62f431992e9a from this chassis (sb_readonly=1)
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.575 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00413|binding|INFO|Removing iface tapea3924fc-85 ovn-installed in OVS
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 kernel: tap0df677ac-7f (unregistering): left promiscuous mode
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.5917] device (tap0df677ac-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.593 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00414|binding|INFO|Releasing lport 0df677ac-7f81-4a86-bf52-3028b2dda46e from this chassis (sb_readonly=0)
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00415|binding|INFO|Setting lport 0df677ac-7f81-4a86-bf52-3028b2dda46e down in Southbound
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.609 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00416|binding|INFO|Removing iface tap0df677ac-7f ovn-installed in OVS
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00417|binding|INFO|Setting lport 9ff14940-7ad5-4fc4-852a-08f78695ba7a down in Southbound
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00418|binding|INFO|Setting lport ea3924fc-85a0-42f4-98f9-62f431992e9a down in Southbound
Jan 31 03:17:12 np0005603622 kernel: tap2333e158-ce (unregistering): left promiscuous mode
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.610 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:fb:b1 10.100.0.13'], port_security=['fa:16:3e:06:fb:b1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b4cc557-dcfc-4a4a-b710-9e93bc5854ac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ad571df1-872d-4024-8955-346236aa8e90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.613 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ad571df1-872d-4024-8955-346236aa8e90 in datapath c9a825ca-0a1d-4321-b7ad-502bf00f7f45 unbound from our chassis#033[00m
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.6154] device (tap2333e158-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.616 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c9a825ca-0a1d-4321-b7ad-502bf00f7f45, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.618 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[27f34165-72bd-49f1-bbe2-4a5cd06be712]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.619 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45 namespace which is not needed anymore#033[00m
Jan 31 03:17:12 np0005603622 kernel: tapc1738855-97 (unregistering): left promiscuous mode
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.6454] device (tapc1738855-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.645 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:8f:c0 10.1.1.54'], port_security=['fa:16:3e:1d:8f:c0 10.1.1.54'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-254413178', 'neutron:cidrs': '10.1.1.54/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-254413178', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '4', 'neutron:security_group_ids': '621f9777-60f7-4278-a85f-2b28bb26d990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=9ff14940-7ad5-4fc4-852a-08f78695ba7a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00419|binding|INFO|Releasing lport 2333e158-ceb0-4a1f-898b-191f9da4e5c7 from this chassis (sb_readonly=0)
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00420|binding|INFO|Setting lport 2333e158-ceb0-4a1f-898b-191f9da4e5c7 down in Southbound
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.647 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.649 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:73:4e 10.1.1.70'], port_security=['fa:16:3e:7e:73:4e 10.1.1.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.70/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=0df677ac-7f81-4a86-bf52-3028b2dda46e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00421|binding|INFO|Removing iface tap2333e158-ce ovn-installed in OVS
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.651 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.652 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:eb:67 10.1.1.180'], port_security=['fa:16:3e:09:eb:67 10.1.1.180'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1412251406', 'neutron:cidrs': '10.1.1.180/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1412251406', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '4', 'neutron:security_group_ids': '621f9777-60f7-4278-a85f-2b28bb26d990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ea3924fc-85a0-42f4-98f9-62f431992e9a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:12 np0005603622 kernel: tapdf26bd91-1f (unregistering): left promiscuous mode
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.6721] device (tapdf26bd91-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.676 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00422|binding|INFO|Releasing lport c1738855-9732-40f2-88e3-e42f04270b02 from this chassis (sb_readonly=1)
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00423|binding|INFO|Removing iface tapc1738855-97 ovn-installed in OVS
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.680 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.699 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00424|binding|INFO|Releasing lport df26bd91-1f94-4407-962c-1dd9ca4032d0 from this chassis (sb_readonly=1)
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00425|binding|INFO|Removing iface tapdf26bd91-1f ovn-installed in OVS
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.707 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00426|binding|INFO|Setting lport df26bd91-1f94-4407-962c-1dd9ca4032d0 down in Southbound
Jan 31 03:17:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:12Z|00427|binding|INFO|Setting lport c1738855-9732-40f2-88e3-e42f04270b02 down in Southbound
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.714 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:24:89 10.1.1.208'], port_security=['fa:16:3e:15:24:89 10.1.1.208'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.208/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90c25bd5-b880-4690-a735-203ba0cecd55, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2333e158-ceb0-4a1f-898b-191f9da4e5c7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:12 np0005603622 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000060.scope: Deactivated successfully.
Jan 31 03:17:12 np0005603622 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000060.scope: Consumed 17.635s CPU time.
Jan 31 03:17:12 np0005603622 systemd-machined[190406]: Machine qemu-41-instance-00000060 terminated.
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.749 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:4d:5a 10.2.2.100'], port_security=['fa:16:3e:8b:4d:5a 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd48c619-e7d0-4c6c-96f1-6aa28a7e2333, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=c1738855-9732-40f2-88e3-e42f04270b02) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.751 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:e5:27 10.2.2.200'], port_security=['fa:16:3e:57:e5:27 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': 'b49dd2dc-476a-42c6-89f3-868e238698f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '587e3df6fc904afc99c9c6411095e3ef', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ffd1593d-1c79-4018-8f02-72acf6fb8b85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd48c619-e7d0-4c6c-96f1-6aa28a7e2333, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=df26bd91-1f94-4407-962c-1dd9ca4032d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:12 np0005603622 neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256274]: [NOTICE]   (256282) : haproxy version is 2.8.14-c23fe91
Jan 31 03:17:12 np0005603622 neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256274]: [NOTICE]   (256282) : path to executable is /usr/sbin/haproxy
Jan 31 03:17:12 np0005603622 neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256274]: [WARNING]  (256282) : Exiting Master process...
Jan 31 03:17:12 np0005603622 neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256274]: [ALERT]    (256282) : Current worker (256284) exited with code 143 (Terminated)
Jan 31 03:17:12 np0005603622 neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45[256274]: [WARNING]  (256282) : All workers exited. Exiting... (0)
Jan 31 03:17:12 np0005603622 systemd[1]: libpod-ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d.scope: Deactivated successfully.
Jan 31 03:17:12 np0005603622 podman[257072]: 2026-01-31 08:17:12.801672024 +0000 UTC m=+0.063684941 container died ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:17:12 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d-userdata-shm.mount: Deactivated successfully.
Jan 31 03:17:12 np0005603622 systemd[1]: var-lib-containers-storage-overlay-abcf94ee27acd80af0fc310d629aef8bc894b372721bb5f055f79cdf7995969d-merged.mount: Deactivated successfully.
Jan 31 03:17:12 np0005603622 podman[257072]: 2026-01-31 08:17:12.867343177 +0000 UTC m=+0.129356094 container cleanup ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:17:12 np0005603622 systemd[1]: libpod-conmon-ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d.scope: Deactivated successfully.
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.9134] manager: (tapad571df1-87): new Tun device (/org/freedesktop/NetworkManager/Devices/193)
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.9275] manager: (tap9ff14940-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/194)
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.9418] manager: (tapea3924fc-85): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Jan 31 03:17:12 np0005603622 NetworkManager[49080]: <info>  [1769847432.9525] manager: (tap0df677ac-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Jan 31 03:17:12 np0005603622 podman[257104]: 2026-01-31 08:17:12.979410084 +0000 UTC m=+0.090954607 container remove ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.985 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7d1afa-62f4-494d-8cd3-62853f69a49b]: (4, ('Sat Jan 31 08:17:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45 (ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d)\nef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d\nSat Jan 31 08:17:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45 (ef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d)\nef029b7668187dd4b485162eb94f6285895c6a696cf471649c505bd7b57dfa5d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.989 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[28f3fe36-2405-46f1-8f1d-48186b5c5fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:12.991 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9a825ca-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:12 np0005603622 nova_compute[221301]: 2026-01-31 08:17:12.994 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.003 221324 INFO nova.virt.libvirt.driver [-] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Instance destroyed successfully.#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.003 221324 DEBUG nova.objects.instance [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lazy-loading 'resources' on Instance uuid b49dd2dc-476a-42c6-89f3-868e238698f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.017 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 kernel: tapc9a825ca-00: left promiscuous mode
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.041 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.046 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b44ace73-413f-4bc4-b970-8ecdd0ba656c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.060 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[268eb208-fde8-4eb2-8850-f07d4e6f1845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.062 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f4afbc-5e59-4132-9dcf-b4df929fde4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.080 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[02f13d93-7df9-4e9e-8b67-fcb14e983274]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661608, 'reachable_time': 32781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257212, 'error': None, 'target': 'ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.081 221324 DEBUG nova.virt.libvirt.vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.082 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.082 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=ad571df1-872d-4024-8955-346236aa8e90,network=Network(c9a825ca-0a1d-4321-b7ad-502bf00f7f45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad571df1-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.083 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c9a825ca-0a1d-4321-b7ad-502bf00f7f45 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.084 221324 DEBUG os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=ad571df1-872d-4024-8955-346236aa8e90,network=Network(c9a825ca-0a1d-4321-b7ad-502bf00f7f45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad571df1-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:13 np0005603622 systemd[1]: run-netns-ovnmeta\x2dc9a825ca\x2d0a1d\x2d4321\x2db7ad\x2d502bf00f7f45.mount: Deactivated successfully.
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.085 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[e177fb12-c654-489a-98e2-a83dff043221]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.086 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 9ff14940-7ad5-4fc4-852a-08f78695ba7a in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.087 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.088 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad571df1-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.088 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89bb50d1-4b51-4284-b511-c21064f9c3e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.089 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[80549415-b931-4a8c-8569-42aeb599fba6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.090 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1 namespace which is not needed anymore#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.093 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.105 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.111 221324 INFO os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=ad571df1-872d-4024-8955-346236aa8e90,network=Network(c9a825ca-0a1d-4321-b7ad-502bf00f7f45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad571df1-87')#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.112 221324 DEBUG nova.virt.libvirt.vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.113 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.114 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:8f:c0,bridge_name='br-int',has_traffic_filtering=True,id=9ff14940-7ad5-4fc4-852a-08f78695ba7a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9ff14940-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.115 221324 DEBUG os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:8f:c0,bridge_name='br-int',has_traffic_filtering=True,id=9ff14940-7ad5-4fc4-852a-08f78695ba7a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9ff14940-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.118 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.119 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ff14940-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.121 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.123 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.133 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.136 221324 INFO os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:8f:c0,bridge_name='br-int',has_traffic_filtering=True,id=9ff14940-7ad5-4fc4-852a-08f78695ba7a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9ff14940-7a')#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.138 221324 DEBUG nova.virt.libvirt.vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.139 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.140 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:eb:67,bridge_name='br-int',has_traffic_filtering=True,id=ea3924fc-85a0-42f4-98f9-62f431992e9a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapea3924fc-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.141 221324 DEBUG os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:eb:67,bridge_name='br-int',has_traffic_filtering=True,id=ea3924fc-85a0-42f4-98f9-62f431992e9a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapea3924fc-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.143 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.143 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea3924fc-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.145 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.147 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.157 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.159 221324 INFO os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:eb:67,bridge_name='br-int',has_traffic_filtering=True,id=ea3924fc-85a0-42f4-98f9-62f431992e9a,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapea3924fc-85')#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.160 221324 DEBUG nova.virt.libvirt.vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.160 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.161 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:73:4e,bridge_name='br-int',has_traffic_filtering=True,id=0df677ac-7f81-4a86-bf52-3028b2dda46e,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0df677ac-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.161 221324 DEBUG os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:73:4e,bridge_name='br-int',has_traffic_filtering=True,id=0df677ac-7f81-4a86-bf52-3028b2dda46e,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0df677ac-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.162 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.163 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0df677ac-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.164 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.175 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.178 221324 INFO os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:73:4e,bridge_name='br-int',has_traffic_filtering=True,id=0df677ac-7f81-4a86-bf52-3028b2dda46e,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0df677ac-7f')#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.179 221324 DEBUG nova.virt.libvirt.vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.180 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.180 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:24:89,bridge_name='br-int',has_traffic_filtering=True,id=2333e158-ceb0-4a1f-898b-191f9da4e5c7,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2333e158-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.181 221324 DEBUG os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:24:89,bridge_name='br-int',has_traffic_filtering=True,id=2333e158-ceb0-4a1f-898b-191f9da4e5c7,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2333e158-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.182 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.183 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2333e158-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.185 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.206 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.214 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.217 221324 INFO os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:24:89,bridge_name='br-int',has_traffic_filtering=True,id=2333e158-ceb0-4a1f-898b-191f9da4e5c7,network=Network(89bb50d1-4b51-4284-b511-c21064f9c3e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2333e158-ce')#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.218 221324 DEBUG nova.virt.libvirt.vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.218 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "c1738855-9732-40f2-88e3-e42f04270b02", "address": "fa:16:3e:8b:4d:5a", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1738855-97", "ovs_interfaceid": "c1738855-9732-40f2-88e3-e42f04270b02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.219 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4d:5a,bridge_name='br-int',has_traffic_filtering=True,id=c1738855-9732-40f2-88e3-e42f04270b02,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1738855-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.219 221324 DEBUG os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4d:5a,bridge_name='br-int',has_traffic_filtering=True,id=c1738855-9732-40f2-88e3-e42f04270b02,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1738855-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.221 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.221 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1738855-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.223 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.226 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.230 221324 INFO os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4d:5a,bridge_name='br-int',has_traffic_filtering=True,id=c1738855-9732-40f2-88e3-e42f04270b02,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1738855-97')#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.231 221324 DEBUG nova.virt.libvirt.vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:14:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-897253457',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-897253457',id=96,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMbo5jKcv9BMJKgaeNZeXEwiD9OhBHCjoyu+NckIZil/pqbMHt6pHDhnbAJpqhK6nmL1qqeGbyg41g1J438ebOl8S1LXx5FiS/TUkzxKzvpFXJXH3hrj3hYxa95hekqQA==',key_name='tempest-keypair-629445673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:16:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='587e3df6fc904afc99c9c6411095e3ef',ramdisk_id='',reservation_id='r-0tn40yfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-3446815',owner_user_name='tempest-TaggedBootDevicesTest-3446815-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:16:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='469a76c175b64f84b29a4a44c8eaf3c4',uuid=b49dd2dc-476a-42c6-89f3-868e238698f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.231 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converting VIF {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.232 221324 DEBUG nova.network.os_vif_util [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:e5:27,bridge_name='br-int',has_traffic_filtering=True,id=df26bd91-1f94-4407-962c-1dd9ca4032d0,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf26bd91-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.232 221324 DEBUG os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:e5:27,bridge_name='br-int',has_traffic_filtering=True,id=df26bd91-1f94-4407-962c-1dd9ca4032d0,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf26bd91-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.233 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf26bd91-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.237 221324 INFO os_vif [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:e5:27,bridge_name='br-int',has_traffic_filtering=True,id=df26bd91-1f94-4407-962c-1dd9ca4032d0,network=Network(fcd4b12f-9cb6-45af-9713-c30c45ef51d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf26bd91-1f')#033[00m
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1[256366]: [NOTICE]   (256380) : haproxy version is 2.8.14-c23fe91
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1[256366]: [NOTICE]   (256380) : path to executable is /usr/sbin/haproxy
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1[256366]: [WARNING]  (256380) : Exiting Master process...
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1[256366]: [ALERT]    (256380) : Current worker (256384) exited with code 143 (Terminated)
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1[256366]: [WARNING]  (256380) : All workers exited. Exiting... (0)
Jan 31 03:17:13 np0005603622 systemd[1]: libpod-fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd.scope: Deactivated successfully.
Jan 31 03:17:13 np0005603622 podman[257255]: 2026-01-31 08:17:13.295210304 +0000 UTC m=+0.072039267 container died fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:17:13 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd-userdata-shm.mount: Deactivated successfully.
Jan 31 03:17:13 np0005603622 systemd[1]: var-lib-containers-storage-overlay-8e5ee4d372ca182c6debbe6ca4f8d6a85ca00ddd82d6b2457bc85e28cdedd956-merged.mount: Deactivated successfully.
Jan 31 03:17:13 np0005603622 podman[257255]: 2026-01-31 08:17:13.397886087 +0000 UTC m=+0.174715050 container cleanup fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 03:17:13 np0005603622 systemd[1]: libpod-conmon-fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd.scope: Deactivated successfully.
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.487 221324 INFO nova.virt.libvirt.driver [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Deleting instance files /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2_del#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.488 221324 INFO nova.virt.libvirt.driver [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Deletion of /var/lib/nova/instances/b49dd2dc-476a-42c6-89f3-868e238698f2_del complete#033[00m
Jan 31 03:17:13 np0005603622 podman[257305]: 2026-01-31 08:17:13.489566683 +0000 UTC m=+0.065607873 container remove fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.495 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[906f46f5-a4f9-4421-b96e-7151a0b63330]: (4, ('Sat Jan 31 08:17:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1 (fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd)\nfb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd\nSat Jan 31 08:17:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1 (fb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd)\nfb624b9fc6471719d033edd9c9917b24e04f8859849e63f00d88c91debca0ecd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.498 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f73a7c40-b3e8-49d4-b434-85e3f3502c5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.499 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89bb50d1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.502 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 kernel: tap89bb50d1-40: left promiscuous mode
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.508 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.512 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef8d3e6-4f81-48b4-a5cd-bd5a9334cdb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.531 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe4187d-f3cb-4104-b368-70c7f1eb0d06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.532 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[51e564ac-27a3-4f24-b2a0-c52ba2ac1b86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.552 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7283225e-8290-40c2-b4ee-dcc2c515e9c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661707, 'reachable_time': 25000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257321, 'error': None, 'target': 'ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.556 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89bb50d1-4b51-4284-b511-c21064f9c3e1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.556 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[707ff3c8-4ca3-470d-8008-e7f8f41211f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.557 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 0df677ac-7f81-4a86-bf52-3028b2dda46e in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.561 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89bb50d1-4b51-4284-b511-c21064f9c3e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.562 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5e93a771-9e7e-4e6d-b7c7-07c8585c85ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.563 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ea3924fc-85a0-42f4-98f9-62f431992e9a in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.566 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89bb50d1-4b51-4284-b511-c21064f9c3e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.567 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e74e8abf-c304-4527-bbcf-a409811176b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.568 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2333e158-ceb0-4a1f-898b-191f9da4e5c7 in datapath 89bb50d1-4b51-4284-b511-c21064f9c3e1 unbound from our chassis#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.571 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89bb50d1-4b51-4284-b511-c21064f9c3e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.572 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca27d3a6-bc8d-4b42-818b-5dd7f789e86e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.572 139793 INFO neutron.agent.ovn.metadata.agent [-] Port c1738855-9732-40f2-88e3-e42f04270b02 in datapath fcd4b12f-9cb6-45af-9713-c30c45ef51d9 unbound from our chassis#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.575 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fcd4b12f-9cb6-45af-9713-c30c45ef51d9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.576 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd1f7ad-030b-404e-ac7b-4ccf41695160]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.577 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9 namespace which is not needed anymore#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.711 221324 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Successfully updated port: 61657494-08d3-460a-bc24-a6856464d636 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9[256457]: [NOTICE]   (256461) : haproxy version is 2.8.14-c23fe91
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9[256457]: [NOTICE]   (256461) : path to executable is /usr/sbin/haproxy
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9[256457]: [WARNING]  (256461) : Exiting Master process...
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9[256457]: [ALERT]    (256461) : Current worker (256463) exited with code 143 (Terminated)
Jan 31 03:17:13 np0005603622 neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9[256457]: [WARNING]  (256461) : All workers exited. Exiting... (0)
Jan 31 03:17:13 np0005603622 systemd[1]: libpod-59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94.scope: Deactivated successfully.
Jan 31 03:17:13 np0005603622 podman[257337]: 2026-01-31 08:17:13.74457146 +0000 UTC m=+0.061896893 container died 59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.768 221324 INFO nova.compute.manager [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Took 1.68 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.769 221324 DEBUG oslo.service.loopingcall [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.769 221324 DEBUG nova.compute.manager [-] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.770 221324 DEBUG nova.network.neutron [-] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:17:13 np0005603622 podman[257337]: 2026-01-31 08:17:13.801685532 +0000 UTC m=+0.119010915 container cleanup 59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:17:13 np0005603622 systemd[1]: libpod-conmon-59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94.scope: Deactivated successfully.
Jan 31 03:17:13 np0005603622 systemd[1]: var-lib-containers-storage-overlay-1927e96672e84e510cb720652144f81aebe83d2c29b17749e8eb251b5f923517-merged.mount: Deactivated successfully.
Jan 31 03:17:13 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94-userdata-shm.mount: Deactivated successfully.
Jan 31 03:17:13 np0005603622 systemd[1]: run-netns-ovnmeta\x2d89bb50d1\x2d4b51\x2d4284\x2db511\x2dc21064f9c3e1.mount: Deactivated successfully.
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.855 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.855 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquired lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.855 221324 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.859 221324 DEBUG nova.compute.manager [req-24c684bc-c9e6-4af5-82cc-0c0847ac9ae7 req-3e734b59-7e43-466b-9d73-b950899c9f98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.860 221324 DEBUG oslo_concurrency.lockutils [req-24c684bc-c9e6-4af5-82cc-0c0847ac9ae7 req-3e734b59-7e43-466b-9d73-b950899c9f98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.860 221324 DEBUG oslo_concurrency.lockutils [req-24c684bc-c9e6-4af5-82cc-0c0847ac9ae7 req-3e734b59-7e43-466b-9d73-b950899c9f98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.860 221324 DEBUG oslo_concurrency.lockutils [req-24c684bc-c9e6-4af5-82cc-0c0847ac9ae7 req-3e734b59-7e43-466b-9d73-b950899c9f98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.860 221324 DEBUG nova.compute.manager [req-24c684bc-c9e6-4af5-82cc-0c0847ac9ae7 req-3e734b59-7e43-466b-9d73-b950899c9f98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-unplugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.861 221324 DEBUG nova.compute.manager [req-24c684bc-c9e6-4af5-82cc-0c0847ac9ae7 req-3e734b59-7e43-466b-9d73-b950899c9f98 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:17:13 np0005603622 podman[257368]: 2026-01-31 08:17:13.876145463 +0000 UTC m=+0.054574325 container remove 59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.882 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c73e424-5052-4500-a974-10259df99b55]: (4, ('Sat Jan 31 08:17:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9 (59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94)\n59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94\nSat Jan 31 08:17:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9 (59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94)\n59134e9e613f06fca678256efbd20d318c6d5258ed0d833ed44f39ebd1316c94\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.885 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[75abbefb-761d-48e8-9d56-f198b478f861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.887 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfcd4b12f-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.890 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 kernel: tapfcd4b12f-90: left promiscuous mode
Jan 31 03:17:13 np0005603622 nova_compute[221301]: 2026-01-31 08:17:13.895 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.898 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ba54c6c2-b728-48cb-8ffd-7ce92d2c04c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.914 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5d92133d-097f-48c9-993b-46ed8e5ce07f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.916 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[91899e66-6aa7-4aee-8ee2-c7b697253792]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.933 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9465b4d5-b3f7-47e9-b97d-b77244ad9a91]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661799, 'reachable_time': 31405, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257383, 'error': None, 'target': 'ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 systemd[1]: run-netns-ovnmeta\x2dfcd4b12f\x2d9cb6\x2d45af\x2d9713\x2dc30c45ef51d9.mount: Deactivated successfully.
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.936 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fcd4b12f-9cb6-45af-9713-c30c45ef51d9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.936 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[f9aa05a7-0d5d-4486-8fa0-bd71d99f7f2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.938 139793 INFO neutron.agent.ovn.metadata.agent [-] Port df26bd91-1f94-4407-962c-1dd9ca4032d0 in datapath fcd4b12f-9cb6-45af-9713-c30c45ef51d9 unbound from our chassis#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.941 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fcd4b12f-9cb6-45af-9713-c30c45ef51d9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:17:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:13.942 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[80a6b282-c757-47d8-a253-156a6698bc60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:17:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:14.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:17:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:14.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:14 np0005603622 nova_compute[221301]: 2026-01-31 08:17:14.762 221324 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:16.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.222 221324 DEBUG nova.compute.manager [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.223 221324 DEBUG oslo_concurrency.lockutils [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.223 221324 DEBUG oslo_concurrency.lockutils [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.223 221324 DEBUG oslo_concurrency.lockutils [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.224 221324 DEBUG nova.compute.manager [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-unplugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.224 221324 DEBUG nova.compute.manager [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.224 221324 DEBUG nova.compute.manager [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.225 221324 DEBUG oslo_concurrency.lockutils [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.225 221324 DEBUG oslo_concurrency.lockutils [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.225 221324 DEBUG oslo_concurrency.lockutils [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.225 221324 DEBUG nova.compute.manager [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.226 221324 WARNING nova.compute.manager [req-ac1dee3b-6c9a-44f1-8f89-1fb24116ff16 req-442df5b7-6554-4777-aaa0-faeaaffba46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-2333e158-ceb0-4a1f-898b-191f9da4e5c7 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.412 221324 DEBUG nova.compute.manager [req-e3201789-eede-4bd4-a662-f0f77aa518d0 req-e699c6c2-7743-41cd-8cbc-6f84395892fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.413 221324 DEBUG oslo_concurrency.lockutils [req-e3201789-eede-4bd4-a662-f0f77aa518d0 req-e699c6c2-7743-41cd-8cbc-6f84395892fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.413 221324 DEBUG oslo_concurrency.lockutils [req-e3201789-eede-4bd4-a662-f0f77aa518d0 req-e699c6c2-7743-41cd-8cbc-6f84395892fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.414 221324 DEBUG oslo_concurrency.lockutils [req-e3201789-eede-4bd4-a662-f0f77aa518d0 req-e699c6c2-7743-41cd-8cbc-6f84395892fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.414 221324 DEBUG nova.compute.manager [req-e3201789-eede-4bd4-a662-f0f77aa518d0 req-e699c6c2-7743-41cd-8cbc-6f84395892fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.414 221324 WARNING nova.compute.manager [req-e3201789-eede-4bd4-a662-f0f77aa518d0 req-e699c6c2-7743-41cd-8cbc-6f84395892fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-df26bd91-1f94-4407-962c-1dd9ca4032d0 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:17:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:16.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.804 221324 DEBUG nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-ad571df1-872d-4024-8955-346236aa8e90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.805 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.806 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.806 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.807 221324 DEBUG nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-unplugged-ad571df1-872d-4024-8955-346236aa8e90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.807 221324 DEBUG nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-ad571df1-872d-4024-8955-346236aa8e90 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.807 221324 DEBUG nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received event network-changed-61657494-08d3-460a-bc24-a6856464d636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.808 221324 DEBUG nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Refreshing instance network info cache due to event network-changed-61657494-08d3-460a-bc24-a6856464d636. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:17:16 np0005603622 nova_compute[221301]: 2026-01-31 08:17:16.808 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:17:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:18.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:18 np0005603622 nova_compute[221301]: 2026-01-31 08:17:18.238 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:18.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:19 np0005603622 nova_compute[221301]: 2026-01-31 08:17:19.090 221324 DEBUG nova.compute.manager [req-d42bbc2d-c630-47a9-b8d1-228824057c3f req-d3b70d0c-d854-4fd5-a5fa-db1b078456d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-ea3924fc-85a0-42f4-98f9-62f431992e9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:19 np0005603622 nova_compute[221301]: 2026-01-31 08:17:19.091 221324 DEBUG oslo_concurrency.lockutils [req-d42bbc2d-c630-47a9-b8d1-228824057c3f req-d3b70d0c-d854-4fd5-a5fa-db1b078456d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:19 np0005603622 nova_compute[221301]: 2026-01-31 08:17:19.091 221324 DEBUG oslo_concurrency.lockutils [req-d42bbc2d-c630-47a9-b8d1-228824057c3f req-d3b70d0c-d854-4fd5-a5fa-db1b078456d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:19 np0005603622 nova_compute[221301]: 2026-01-31 08:17:19.091 221324 DEBUG oslo_concurrency.lockutils [req-d42bbc2d-c630-47a9-b8d1-228824057c3f req-d3b70d0c-d854-4fd5-a5fa-db1b078456d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:19 np0005603622 nova_compute[221301]: 2026-01-31 08:17:19.091 221324 DEBUG nova.compute.manager [req-d42bbc2d-c630-47a9-b8d1-228824057c3f req-d3b70d0c-d854-4fd5-a5fa-db1b078456d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-unplugged-ea3924fc-85a0-42f4-98f9-62f431992e9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:19 np0005603622 nova_compute[221301]: 2026-01-31 08:17:19.091 221324 DEBUG nova.compute.manager [req-d42bbc2d-c630-47a9-b8d1-228824057c3f req-d3b70d0c-d854-4fd5-a5fa-db1b078456d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-ea3924fc-85a0-42f4-98f9-62f431992e9a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:17:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:19 np0005603622 nova_compute[221301]: 2026-01-31 08:17:19.879 221324 DEBUG nova.network.neutron [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Updating instance_info_cache with network_info: [{"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.136 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Releasing lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.137 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Instance network_info: |[{"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.137 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.138 221324 DEBUG nova.network.neutron [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Refreshing network info cache for port 61657494-08d3-460a-bc24-a6856464d636 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.141 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Start _get_guest_xml network_info=[{"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.146 221324 WARNING nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.153 221324 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.154 221324 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.160 221324 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.161 221324 DEBUG nova.virt.libvirt.host [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.162 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.163 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.163 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.163 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.164 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.164 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.164 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.164 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.165 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.165 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.165 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.165 221324 DEBUG nova.virt.hardware [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.169 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:20.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:20.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:17:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2731945667' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.618 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.665 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:20 np0005603622 nova_compute[221301]: 2026-01-31 08:17:20.672 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.078 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:17:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1776789330' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.108 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.110 221324 DEBUG nova.virt.libvirt.vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:16:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-603829389',display_name='tempest-ListServersNegativeTestJSON-server-603829389-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-603829389-3',id=100,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eafe22d6cfcb41d4b31597087498a565',ramdisk_id='',reservation_id='r-5jowh2i5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-79577656',owner_user_name='tem
pest-ListServersNegativeTestJSON-79577656-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:17:06Z,user_data=None,user_id='8db5a8acb6d04c988f9dd4f74380c487',uuid=82cdb8e3-12f4-4363-948c-8d9b88d4cced,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.110 221324 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converting VIF {"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.111 221324 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:73:e1,bridge_name='br-int',has_traffic_filtering=True,id=61657494-08d3-460a-bc24-a6856464d636,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61657494-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.113 221324 DEBUG nova.objects.instance [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lazy-loading 'pci_devices' on Instance uuid 82cdb8e3-12f4-4363-948c-8d9b88d4cced obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.299 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <uuid>82cdb8e3-12f4-4363-948c-8d9b88d4cced</uuid>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <name>instance-00000064</name>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <nova:name>tempest-ListServersNegativeTestJSON-server-603829389-3</nova:name>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:17:20</nova:creationTime>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:user uuid="8db5a8acb6d04c988f9dd4f74380c487">tempest-ListServersNegativeTestJSON-79577656-project-member</nova:user>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:project uuid="eafe22d6cfcb41d4b31597087498a565">tempest-ListServersNegativeTestJSON-79577656</nova:project>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <nova:port uuid="61657494-08d3-460a-bc24-a6856464d636">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <entry name="serial">82cdb8e3-12f4-4363-948c-8d9b88d4cced</entry>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <entry name="uuid">82cdb8e3-12f4-4363-948c-8d9b88d4cced</entry>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk.config">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:e2:73:e1"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <target dev="tap61657494-08"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/console.log" append="off"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:17:21 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:17:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:17:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:17:21 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.300 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Preparing to wait for external event network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.301 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.301 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.301 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.302 221324 DEBUG nova.virt.libvirt.vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:16:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-603829389',display_name='tempest-ListServersNegativeTestJSON-server-603829389-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-603829389-3',id=100,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eafe22d6cfcb41d4b31597087498a565',ramdisk_id='',reservation_id='r-5jowh2i5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-79577656',owner_user
_name='tempest-ListServersNegativeTestJSON-79577656-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:17:06Z,user_data=None,user_id='8db5a8acb6d04c988f9dd4f74380c487',uuid=82cdb8e3-12f4-4363-948c-8d9b88d4cced,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.303 221324 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converting VIF {"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.304 221324 DEBUG nova.network.os_vif_util [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:73:e1,bridge_name='br-int',has_traffic_filtering=True,id=61657494-08d3-460a-bc24-a6856464d636,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61657494-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.304 221324 DEBUG os_vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:73:e1,bridge_name='br-int',has_traffic_filtering=True,id=61657494-08d3-460a-bc24-a6856464d636,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61657494-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.305 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.305 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.306 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.310 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.310 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61657494-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.311 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap61657494-08, col_values=(('external_ids', {'iface-id': '61657494-08d3-460a-bc24-a6856464d636', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:73:e1', 'vm-uuid': '82cdb8e3-12f4-4363-948c-8d9b88d4cced'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.313 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:21 np0005603622 NetworkManager[49080]: <info>  [1769847441.3152] manager: (tap61657494-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.317 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.319 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.320 221324 INFO os_vif [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:73:e1,bridge_name='br-int',has_traffic_filtering=True,id=61657494-08d3-460a-bc24-a6856464d636,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61657494-08')#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.747 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.748 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.748 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] No VIF found with MAC fa:16:3e:e2:73:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.749 221324 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Using config drive#033[00m
Jan 31 03:17:21 np0005603622 nova_compute[221301]: 2026-01-31 08:17:21.787 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.176 221324 DEBUG nova.compute.manager [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.177 221324 DEBUG oslo_concurrency.lockutils [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.178 221324 DEBUG oslo_concurrency.lockutils [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.178 221324 DEBUG oslo_concurrency.lockutils [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.179 221324 DEBUG nova.compute.manager [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.179 221324 WARNING nova.compute.manager [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-ea3924fc-85a0-42f4-98f9-62f431992e9a for instance with vm_state active and task_state deleting.
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.179 221324 DEBUG nova.compute.manager [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-0df677ac-7f81-4a86-bf52-3028b2dda46e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.180 221324 DEBUG oslo_concurrency.lockutils [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.180 221324 DEBUG oslo_concurrency.lockutils [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.181 221324 DEBUG oslo_concurrency.lockutils [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.181 221324 DEBUG nova.compute.manager [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-unplugged-0df677ac-7f81-4a86-bf52-3028b2dda46e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:17:22 np0005603622 nova_compute[221301]: 2026-01-31 08:17:22.182 221324 DEBUG nova.compute.manager [req-87c71eb1-5aec-434a-8c34-b43cca4adddf req-7ce6b81d-b22e-42f3-a22f-41e5685f6025 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-0df677ac-7f81-4a86-bf52-3028b2dda46e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 03:17:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:22.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:22.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:24.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.315 221324 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Creating config drive at /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/disk.config
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.320 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyoh4_8zb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.443 221324 DEBUG nova.compute.manager [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.444 221324 DEBUG oslo_concurrency.lockutils [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.446 221324 DEBUG oslo_concurrency.lockutils [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.446 221324 DEBUG oslo_concurrency.lockutils [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.447 221324 DEBUG nova.compute.manager [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.448 221324 WARNING nova.compute.manager [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-0df677ac-7f81-4a86-bf52-3028b2dda46e for instance with vm_state active and task_state deleting.
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.448 221324 DEBUG nova.compute.manager [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.449 221324 DEBUG oslo_concurrency.lockutils [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.449 221324 DEBUG oslo_concurrency.lockutils [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.450 221324 DEBUG oslo_concurrency.lockutils [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.451 221324 DEBUG nova.compute.manager [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-unplugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.451 221324 DEBUG nova.compute.manager [req-c5f04ff7-9a9e-48b4-b6d9-99caf91c6fe7 req-5cd05770-6736-4534-ab66-b9296c3a030b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.453 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyoh4_8zb" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.493 221324 DEBUG nova.storage.rbd_utils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] rbd image 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.499 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/disk.config 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:17:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:24.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.686 221324 DEBUG oslo_concurrency.processutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/disk.config 82cdb8e3-12f4-4363-948c-8d9b88d4cced_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.688 221324 INFO nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Deleting local config drive /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced/disk.config because it was imported into RBD.
Jan 31 03:17:24 np0005603622 kernel: tap61657494-08: entered promiscuous mode
Jan 31 03:17:24 np0005603622 NetworkManager[49080]: <info>  [1769847444.7382] manager: (tap61657494-08): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Jan 31 03:17:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:24Z|00428|binding|INFO|Claiming lport 61657494-08d3-460a-bc24-a6856464d636 for this chassis.
Jan 31 03:17:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:24Z|00429|binding|INFO|61657494-08d3-460a-bc24-a6856464d636: Claiming fa:16:3e:e2:73:e1 10.100.0.5
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.740 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:17:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:24Z|00430|binding|INFO|Setting lport 61657494-08d3-460a-bc24-a6856464d636 ovn-installed in OVS
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.753 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:17:24 np0005603622 nova_compute[221301]: 2026-01-31 08:17:24.756 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:17:24 np0005603622 systemd-udevd[257519]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:17:24 np0005603622 systemd-machined[190406]: New machine qemu-42-instance-00000064.
Jan 31 03:17:24 np0005603622 NetworkManager[49080]: <info>  [1769847444.7811] device (tap61657494-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:17:24 np0005603622 NetworkManager[49080]: <info>  [1769847444.7821] device (tap61657494-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:17:24 np0005603622 systemd[1]: Started Virtual Machine qemu-42-instance-00000064.
Jan 31 03:17:25 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:25Z|00431|binding|INFO|Setting lport 61657494-08d3-460a-bc24-a6856464d636 up in Southbound
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.014 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:73:e1 10.100.0.5'], port_security=['fa:16:3e:e2:73:e1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '82cdb8e3-12f4-4363-948c-8d9b88d4cced', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eafe22d6cfcb41d4b31597087498a565', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2748ecb7-7ea4-47e3-84b3-eb7c4d3ddc31', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27d5be35-4a5e-4d77-b9b0-f9a41d41dd18, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=61657494-08d3-460a-bc24-a6856464d636) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.016 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 61657494-08d3-460a-bc24-a6856464d636 in datapath 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 bound to our chassis
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.019 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.031 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc9495f-6435-436d-b5f0-20d2f0448a1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.032 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4ba4d6d9-c1 in ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.035 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4ba4d6d9-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.035 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[386a3409-03b7-4438-8c29-5c540f0f99a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.036 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f31e3045-308c-4d03-8a56-c8413b0b1e25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.064 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec67309-b6bb-4954-b701-d4285cec2736]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.086 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a054a50d-7dbf-4621-981e-4284771015f4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.112 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[804758db-d1f7-4605-ba23-d6dd6adf1b51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.118 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[48a0d3ab-dac8-4906-bdef-630b60dfbca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 NetworkManager[49080]: <info>  [1769847445.1194] manager: (tap4ba4d6d9-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Jan 31 03:17:25 np0005603622 systemd-udevd[257522]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.149 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e373db3e-766a-436e-8195-a0ee55f8fde2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.154 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9d59181a-c870-472a-bc5a-78aa2d5b4f78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 NetworkManager[49080]: <info>  [1769847445.1733] device (tap4ba4d6d9-c0): carrier: link connected
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.180 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7c8029d1-b2eb-49eb-996c-624e344f62eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.194 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b55a3872-a465-42fa-b152-b1796fd53af8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ba4d6d9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:09:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669651, 'reachable_time': 40318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257553, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.210 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[500d4221-e4be-4f87-858e-0a8659268de8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669651, 'tstamp': 669651}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257554, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.222 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1af38ad5-6bbc-4970-95d7-5d14ab789f50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ba4d6d9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:09:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669651, 'reachable_time': 40318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257555, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.257 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef461b5-057b-4831-8912-e0f4ead407e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.311 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[573e83e0-971d-4ac7-a30f-00df2202cd87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.313 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ba4d6d9-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.313 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.314 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ba4d6d9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:25 np0005603622 nova_compute[221301]: 2026-01-31 08:17:25.316 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:25 np0005603622 kernel: tap4ba4d6d9-c0: entered promiscuous mode
Jan 31 03:17:25 np0005603622 NetworkManager[49080]: <info>  [1769847445.3177] manager: (tap4ba4d6d9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 31 03:17:25 np0005603622 nova_compute[221301]: 2026-01-31 08:17:25.319 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.321 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4ba4d6d9-c0, col_values=(('external_ids', {'iface-id': 'f9d388c2-0e9b-4991-9c20-42c713a8ba0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:25 np0005603622 nova_compute[221301]: 2026-01-31 08:17:25.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:25 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:25Z|00432|binding|INFO|Releasing lport f9d388c2-0e9b-4991-9c20-42c713a8ba0d from this chassis (sb_readonly=0)
Jan 31 03:17:25 np0005603622 nova_compute[221301]: 2026-01-31 08:17:25.332 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:25 np0005603622 nova_compute[221301]: 2026-01-31 08:17:25.335 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.337 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.338 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[961ce328-f8db-48c3-950b-26d17f62d8d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.339 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.pid.haproxy
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:17:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:25.340 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'env', 'PROCESS_TAG=haproxy-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:17:25 np0005603622 podman[257587]: 2026-01-31 08:17:25.762579334 +0000 UTC m=+0.076176768 container create 8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:17:25 np0005603622 podman[257587]: 2026-01-31 08:17:25.726544541 +0000 UTC m=+0.040142025 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:17:25 np0005603622 systemd[1]: Started libpod-conmon-8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252.scope.
Jan 31 03:17:25 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:17:25 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39ae317c7daf4ab4be6c72d1a31668f62c0042e7d4bb8351778ba4b91dbddb37/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:17:25 np0005603622 podman[257587]: 2026-01-31 08:17:25.894763624 +0000 UTC m=+0.208361118 container init 8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:17:25 np0005603622 podman[257587]: 2026-01-31 08:17:25.903657685 +0000 UTC m=+0.217255129 container start 8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:17:25 np0005603622 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[257602]: [NOTICE]   (257606) : New worker (257608) forked
Jan 31 03:17:25 np0005603622 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[257602]: [NOTICE]   (257606) : Loading success.
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.081 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.129 221324 DEBUG nova.network.neutron [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Updated VIF entry in instance network info cache for port 61657494-08d3-460a-bc24-a6856464d636. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.130 221324 DEBUG nova.network.neutron [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Updating instance_info_cache with network_info: [{"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.194 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847446.1931243, 82cdb8e3-12f4-4363-948c-8d9b88d4cced => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.194 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] VM Started (Lifecycle Event)#033[00m
Jan 31 03:17:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:26.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.245 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.245 221324 DEBUG nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.246 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.246 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.246 221324 DEBUG oslo_concurrency.lockutils [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.247 221324 DEBUG nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.247 221324 WARNING nova.compute.manager [req-8671b8d9-8009-45e5-9b52-b1346308e735 req-d4ff5500-6475-4b54-aa82-3392ba664c83 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-ad571df1-872d-4024-8955-346236aa8e90 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.265 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.271 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847446.1935015, 82cdb8e3-12f4-4363-948c-8d9b88d4cced => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.271 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.314 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.318 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.323 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:17:26 np0005603622 nova_compute[221301]: 2026-01-31 08:17:26.372 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:17:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:26.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.857 221324 DEBUG nova.compute.manager [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.858 221324 DEBUG oslo_concurrency.lockutils [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.858 221324 DEBUG oslo_concurrency.lockutils [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.858 221324 DEBUG oslo_concurrency.lockutils [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.859 221324 DEBUG nova.compute.manager [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.859 221324 WARNING nova.compute.manager [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-9ff14940-7ad5-4fc4-852a-08f78695ba7a for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.860 221324 DEBUG nova.compute.manager [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-c1738855-9732-40f2-88e3-e42f04270b02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.860 221324 DEBUG oslo_concurrency.lockutils [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.861 221324 DEBUG oslo_concurrency.lockutils [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.861 221324 DEBUG oslo_concurrency.lockutils [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.862 221324 DEBUG nova.compute.manager [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-unplugged-c1738855-9732-40f2-88e3-e42f04270b02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:27 np0005603622 nova_compute[221301]: 2026-01-31 08:17:27.862 221324 DEBUG nova.compute.manager [req-dde9a63c-c4fe-4dfc-9143-dc57587af848 req-667c5897-1fe3-45c8-a2b4-c94b97e57d4c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-unplugged-c1738855-9732-40f2-88e3-e42f04270b02 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:17:28 np0005603622 nova_compute[221301]: 2026-01-31 08:17:28.000 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847432.9992998, b49dd2dc-476a-42c6-89f3-868e238698f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:28 np0005603622 nova_compute[221301]: 2026-01-31 08:17:28.001 221324 INFO nova.compute.manager [-] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:17:28 np0005603622 nova_compute[221301]: 2026-01-31 08:17:28.050 221324 DEBUG nova.compute.manager [None req-663b587e-bcc9-4592-8d7c-deafb566c051 - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:28.069 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:17:28 np0005603622 nova_compute[221301]: 2026-01-31 08:17:28.070 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:28.071 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:17:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:28.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:28.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:28Z|00433|binding|INFO|Releasing lport f9d388c2-0e9b-4991-9c20-42c713a8ba0d from this chassis (sb_readonly=0)
Jan 31 03:17:28 np0005603622 nova_compute[221301]: 2026-01-31 08:17:28.835 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:28Z|00434|binding|INFO|Releasing lport f9d388c2-0e9b-4991-9c20-42c713a8ba0d from this chassis (sb_readonly=0)
Jan 31 03:17:28 np0005603622 nova_compute[221301]: 2026-01-31 08:17:28.955 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:30.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.288 221324 DEBUG nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.289 221324 DEBUG oslo_concurrency.lockutils [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.289 221324 DEBUG oslo_concurrency.lockutils [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.290 221324 DEBUG oslo_concurrency.lockutils [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.290 221324 DEBUG nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] No waiting events found dispatching network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.291 221324 WARNING nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received unexpected event network-vif-plugged-c1738855-9732-40f2-88e3-e42f04270b02 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.291 221324 DEBUG nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-deleted-c1738855-9732-40f2-88e3-e42f04270b02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.292 221324 INFO nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Neutron deleted interface c1738855-9732-40f2-88e3-e42f04270b02; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.292 221324 DEBUG nova.network.neutron [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [{"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "address": "fa:16:3e:15:24:89", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2333e158-ce", "ovs_interfaceid": "2333e158-ceb0-4a1f-898b-191f9da4e5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.511 221324 DEBUG nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Detach interface failed, port_id=c1738855-9732-40f2-88e3-e42f04270b02, reason: Instance b49dd2dc-476a-42c6-89f3-868e238698f2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.511 221324 DEBUG nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received event network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.512 221324 DEBUG oslo_concurrency.lockutils [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.512 221324 DEBUG oslo_concurrency.lockutils [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.512 221324 DEBUG oslo_concurrency.lockutils [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.512 221324 DEBUG nova.compute.manager [req-700e900a-185f-40f7-8bbd-c40c1b40b712 req-9f116f2b-f359-4c9f-ae91-289d7f920eea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Processing event network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.513 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.518 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847450.5178225, 82cdb8e3-12f4-4363-948c-8d9b88d4cced => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.518 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.520 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.523 221324 INFO nova.virt.libvirt.driver [-] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Instance spawned successfully.#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.523 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:17:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:30.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.600 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.605 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.622 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.623 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.624 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.625 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.626 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.627 221324 DEBUG nova.virt.libvirt.driver [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.709 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.764 221324 INFO nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Took 24.26 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.764 221324 DEBUG nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:17:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:30.871 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:30.872 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:30.872 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.884 221324 INFO nova.compute.manager [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Took 26.90 seconds to build instance.#033[00m
Jan 31 03:17:30 np0005603622 nova_compute[221301]: 2026-01-31 08:17:30.994 221324 DEBUG oslo_concurrency.lockutils [None req-9dd4deff-06fe-42a3-a8fd-88cce8b1934d 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:17:31.074 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:17:31 np0005603622 nova_compute[221301]: 2026-01-31 08:17:31.146 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:31 np0005603622 nova_compute[221301]: 2026-01-31 08:17:31.315 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:31 np0005603622 nova_compute[221301]: 2026-01-31 08:17:31.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:32.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:32.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.580 221324 DEBUG nova.compute.manager [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-deleted-2333e158-ceb0-4a1f-898b-191f9da4e5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.580 221324 INFO nova.compute.manager [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Neutron deleted interface 2333e158-ceb0-4a1f-898b-191f9da4e5c7; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.582 221324 DEBUG nova.network.neutron [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [{"id": "ad571df1-872d-4024-8955-346236aa8e90", "address": "fa:16:3e:06:fb:b1", "network": {"id": "c9a825ca-0a1d-4321-b7ad-502bf00f7f45", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1041505113-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad571df1-87", "ovs_interfaceid": "ad571df1-872d-4024-8955-346236aa8e90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.759 221324 DEBUG nova.compute.manager [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Detach interface failed, port_id=2333e158-ceb0-4a1f-898b-191f9da4e5c7, reason: Instance b49dd2dc-476a-42c6-89f3-868e238698f2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.760 221324 DEBUG nova.compute.manager [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received event network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.761 221324 DEBUG oslo_concurrency.lockutils [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.762 221324 DEBUG oslo_concurrency.lockutils [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.763 221324 DEBUG oslo_concurrency.lockutils [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.763 221324 DEBUG nova.compute.manager [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] No waiting events found dispatching network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:17:32 np0005603622 nova_compute[221301]: 2026-01-31 08:17:32.764 221324 WARNING nova.compute.manager [req-f47e79ba-69c2-431f-8ddb-a7d20b3ca725 req-e4c03b52-d794-4d87-a737-6c25a0a4453e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received unexpected event network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:17:33 np0005603622 nova_compute[221301]: 2026-01-31 08:17:33.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:33 np0005603622 nova_compute[221301]: 2026-01-31 08:17:33.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:17:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:34.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:34 np0005603622 podman[257832]: 2026-01-31 08:17:34.467804125 +0000 UTC m=+0.071277506 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0)
Jan 31 03:17:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:34.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:34 np0005603622 podman[257832]: 2026-01-31 08:17:34.582721299 +0000 UTC m=+0.186194700 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 03:17:35 np0005603622 nova_compute[221301]: 2026-01-31 08:17:35.430 221324 DEBUG nova.compute.manager [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-deleted-ad571df1-872d-4024-8955-346236aa8e90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:35 np0005603622 nova_compute[221301]: 2026-01-31 08:17:35.433 221324 INFO nova.compute.manager [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Neutron deleted interface ad571df1-872d-4024-8955-346236aa8e90; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:17:35 np0005603622 nova_compute[221301]: 2026-01-31 08:17:35.433 221324 DEBUG nova.network.neutron [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [{"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "address": "fa:16:3e:7e:73:4e", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0df677ac-7f", "ovs_interfaceid": "0df677ac-7f81-4a86-bf52-3028b2dda46e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:35 np0005603622 nova_compute[221301]: 2026-01-31 08:17:35.879 221324 DEBUG nova.compute.manager [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Detach interface failed, port_id=ad571df1-872d-4024-8955-346236aa8e90, reason: Instance b49dd2dc-476a-42c6-89f3-868e238698f2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:17:35 np0005603622 nova_compute[221301]: 2026-01-31 08:17:35.880 221324 DEBUG nova.compute.manager [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-deleted-0df677ac-7f81-4a86-bf52-3028b2dda46e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:35 np0005603622 nova_compute[221301]: 2026-01-31 08:17:35.880 221324 INFO nova.compute.manager [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Neutron deleted interface 0df677ac-7f81-4a86-bf52-3028b2dda46e; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:17:35 np0005603622 nova_compute[221301]: 2026-01-31 08:17:35.880 221324 DEBUG nova.network.neutron [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [{"id": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "address": "fa:16:3e:1d:8f:c0", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff14940-7a", "ovs_interfaceid": "9ff14940-7ad5-4fc4-852a-08f78695ba7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "address": "fa:16:3e:09:eb:67", "network": {"id": "89bb50d1-4b51-4284-b511-c21064f9c3e1", "bridge": "br-int", "label": "tempest-device-tagging-net1-920910576", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea3924fc-85", "ovs_interfaceid": "ea3924fc-85a0-42f4-98f9-62f431992e9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "address": "fa:16:3e:57:e5:27", "network": {"id": "fcd4b12f-9cb6-45af-9713-c30c45ef51d9", "bridge": "br-int", "label": "tempest-device-tagging-net2-1477803552", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "587e3df6fc904afc99c9c6411095e3ef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf26bd91-1f", "ovs_interfaceid": "df26bd91-1f94-4407-962c-1dd9ca4032d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:36 np0005603622 nova_compute[221301]: 2026-01-31 08:17:36.135 221324 DEBUG nova.compute.manager [req-65857ffc-628b-4feb-9a2c-1f289fc20815 req-e33b7811-be45-40bf-9140-7af68abbe3c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Detach interface failed, port_id=0df677ac-7f81-4a86-bf52-3028b2dda46e, reason: Instance b49dd2dc-476a-42c6-89f3-868e238698f2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:17:36 np0005603622 nova_compute[221301]: 2026-01-31 08:17:36.165 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:36.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:36 np0005603622 nova_compute[221301]: 2026-01-31 08:17:36.318 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:36 np0005603622 nova_compute[221301]: 2026-01-31 08:17:36.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:36 np0005603622 nova_compute[221301]: 2026-01-31 08:17:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:36.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:36 np0005603622 nova_compute[221301]: 2026-01-31 08:17:36.753 221324 DEBUG nova.network.neutron [-] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:36 np0005603622 nova_compute[221301]: 2026-01-31 08:17:36.931 221324 INFO nova.compute.manager [-] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Took 23.16 seconds to deallocate network for instance.#033[00m
Jan 31 03:17:37 np0005603622 podman[258086]: 2026-01-31 08:17:37.217778319 +0000 UTC m=+0.138685386 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:17:37 np0005603622 nova_compute[221301]: 2026-01-31 08:17:37.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:17:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:17:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:38.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:38 np0005603622 nova_compute[221301]: 2026-01-31 08:17:38.323 221324 DEBUG nova.compute.manager [req-43ef4e5a-7aca-4951-beef-b17be6cb1825 req-d2c3d00f-9fa3-417e-9a13-49a3eb5caddf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Received event network-vif-deleted-df26bd91-1f94-4407-962c-1dd9ca4032d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:17:38 np0005603622 nova_compute[221301]: 2026-01-31 08:17:38.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:38 np0005603622 nova_compute[221301]: 2026-01-31 08:17:38.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:17:38 np0005603622 nova_compute[221301]: 2026-01-31 08:17:38.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:17:38 np0005603622 nova_compute[221301]: 2026-01-31 08:17:38.460 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 03:17:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:38.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:39 np0005603622 podman[258112]: 2026-01-31 08:17:39.153508871 +0000 UTC m=+0.066357813 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 03:17:39 np0005603622 nova_compute[221301]: 2026-01-31 08:17:39.350 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:17:39 np0005603622 nova_compute[221301]: 2026-01-31 08:17:39.351 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:17:39 np0005603622 nova_compute[221301]: 2026-01-31 08:17:39.351 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:17:39 np0005603622 nova_compute[221301]: 2026-01-31 08:17:39.352 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 82cdb8e3-12f4-4363-948c-8d9b88d4cced obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:17:40 np0005603622 nova_compute[221301]: 2026-01-31 08:17:40.040 221324 INFO nova.compute.manager [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] [instance: b49dd2dc-476a-42c6-89f3-868e238698f2] Took 3.11 seconds to detach 3 volumes for instance.#033[00m
Jan 31 03:17:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:40.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:40 np0005603622 nova_compute[221301]: 2026-01-31 08:17:40.269 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:40 np0005603622 nova_compute[221301]: 2026-01-31 08:17:40.271 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:40 np0005603622 nova_compute[221301]: 2026-01-31 08:17:40.442 221324 DEBUG oslo_concurrency.processutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:40.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3528122175' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:40 np0005603622 nova_compute[221301]: 2026-01-31 08:17:40.901 221324 DEBUG oslo_concurrency.processutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:40 np0005603622 nova_compute[221301]: 2026-01-31 08:17:40.909 221324 DEBUG nova.compute.provider_tree [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:17:41 np0005603622 nova_compute[221301]: 2026-01-31 08:17:41.014 221324 DEBUG nova.scheduler.client.report [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:17:41 np0005603622 nova_compute[221301]: 2026-01-31 08:17:41.167 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:41 np0005603622 nova_compute[221301]: 2026-01-31 08:17:41.309 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:41 np0005603622 nova_compute[221301]: 2026-01-31 08:17:41.320 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:41 np0005603622 nova_compute[221301]: 2026-01-31 08:17:41.549 221324 INFO nova.scheduler.client.report [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Deleted allocations for instance b49dd2dc-476a-42c6-89f3-868e238698f2#033[00m
Jan 31 03:17:41 np0005603622 nova_compute[221301]: 2026-01-31 08:17:41.795 221324 DEBUG oslo_concurrency.lockutils [None req-94d0a271-628c-4fff-86a4-c28a5873dd3d 469a76c175b64f84b29a4a44c8eaf3c4 587e3df6fc904afc99c9c6411095e3ef - - default default] Lock "b49dd2dc-476a-42c6-89f3-868e238698f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 29.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:42.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:42.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:43Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:73:e1 10.100.0.5
Jan 31 03:17:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:17:43Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:73:e1 10.100.0.5
Jan 31 03:17:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:44.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:44.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.017 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Updating instance_info_cache with network_info: [{"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.239 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-82cdb8e3-12f4-4363-948c-8d9b88d4cced" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.240 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.241 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.241 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.242 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.360 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.360 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.361 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.361 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.362 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1815709461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:45 np0005603622 nova_compute[221301]: 2026-01-31 08:17:45.850 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.032 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.033 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.223 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.244 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.245 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4406MB free_disk=20.903400421142578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.245 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.246 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:17:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:46.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.322 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.435 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 82cdb8e3-12f4-4363-948c-8d9b88d4cced actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.436 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.436 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:17:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:46.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:46 np0005603622 nova_compute[221301]: 2026-01-31 08:17:46.555 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:17:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:17:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/233864320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:17:47 np0005603622 nova_compute[221301]: 2026-01-31 08:17:47.015 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:17:47 np0005603622 nova_compute[221301]: 2026-01-31 08:17:47.020 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:17:47 np0005603622 nova_compute[221301]: 2026-01-31 08:17:47.078 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:17:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:48.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:48.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:50.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:50 np0005603622 nova_compute[221301]: 2026-01-31 08:17:50.411 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:17:50 np0005603622 nova_compute[221301]: 2026-01-31 08:17:50.412 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:17:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:50.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:51 np0005603622 nova_compute[221301]: 2026-01-31 08:17:51.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:51 np0005603622 nova_compute[221301]: 2026-01-31 08:17:51.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:52.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:52.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:17:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:54.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:17:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:54.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:17:56 np0005603622 nova_compute[221301]: 2026-01-31 08:17:56.266 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:56.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:56 np0005603622 nova_compute[221301]: 2026-01-31 08:17:56.325 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:17:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:17:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:56.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:17:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:17:58.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:17:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:17:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:17:58.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:17:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:00.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.487 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.488 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.488 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.488 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.489 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.490 221324 INFO nova.compute.manager [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Terminating instance#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.491 221324 DEBUG nova.compute.manager [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:18:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:00.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:00 np0005603622 kernel: tap61657494-08 (unregistering): left promiscuous mode
Jan 31 03:18:00 np0005603622 NetworkManager[49080]: <info>  [1769847480.6828] device (tap61657494-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:18:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:00Z|00435|binding|INFO|Releasing lport 61657494-08d3-460a-bc24-a6856464d636 from this chassis (sb_readonly=0)
Jan 31 03:18:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:00Z|00436|binding|INFO|Setting lport 61657494-08d3-460a-bc24-a6856464d636 down in Southbound
Jan 31 03:18:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:00Z|00437|binding|INFO|Removing iface tap61657494-08 ovn-installed in OVS
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.692 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603622 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000064.scope: Deactivated successfully.
Jan 31 03:18:00 np0005603622 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000064.scope: Consumed 15.011s CPU time.
Jan 31 03:18:00 np0005603622 systemd-machined[190406]: Machine qemu-42-instance-00000064 terminated.
Jan 31 03:18:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:00.796 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:73:e1 10.100.0.5'], port_security=['fa:16:3e:e2:73:e1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '82cdb8e3-12f4-4363-948c-8d9b88d4cced', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eafe22d6cfcb41d4b31597087498a565', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2748ecb7-7ea4-47e3-84b3-eb7c4d3ddc31', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27d5be35-4a5e-4d77-b9b0-f9a41d41dd18, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=61657494-08d3-460a-bc24-a6856464d636) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:18:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:00.798 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 61657494-08d3-460a-bc24-a6856464d636 in datapath 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 unbound from our chassis#033[00m
Jan 31 03:18:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:00.800 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:18:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:00.803 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d10182a7-82ba-4fa4-ba53-357735b0af23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:00.804 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 namespace which is not needed anymore#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.920 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.926 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.939 221324 INFO nova.virt.libvirt.driver [-] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Instance destroyed successfully.#033[00m
Jan 31 03:18:00 np0005603622 nova_compute[221301]: 2026-01-31 08:18:00.939 221324 DEBUG nova.objects.instance [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lazy-loading 'resources' on Instance uuid 82cdb8e3-12f4-4363-948c-8d9b88d4cced obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:18:00 np0005603622 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[257602]: [NOTICE]   (257606) : haproxy version is 2.8.14-c23fe91
Jan 31 03:18:00 np0005603622 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[257602]: [NOTICE]   (257606) : path to executable is /usr/sbin/haproxy
Jan 31 03:18:00 np0005603622 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[257602]: [WARNING]  (257606) : Exiting Master process...
Jan 31 03:18:00 np0005603622 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[257602]: [ALERT]    (257606) : Current worker (257608) exited with code 143 (Terminated)
Jan 31 03:18:00 np0005603622 neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9[257602]: [WARNING]  (257606) : All workers exited. Exiting... (0)
Jan 31 03:18:01 np0005603622 systemd[1]: libpod-8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252.scope: Deactivated successfully.
Jan 31 03:18:01 np0005603622 podman[258279]: 2026-01-31 08:18:01.0084279 +0000 UTC m=+0.065433358 container died 8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Jan 31 03:18:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252-userdata-shm.mount: Deactivated successfully.
Jan 31 03:18:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay-39ae317c7daf4ab4be6c72d1a31668f62c0042e7d4bb8351778ba4b91dbddb37-merged.mount: Deactivated successfully.
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.045 221324 DEBUG nova.virt.libvirt.vif [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:16:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-603829389',display_name='tempest-ListServersNegativeTestJSON-server-603829389-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-603829389-3',id=100,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2026-01-31T08:17:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eafe22d6cfcb41d4b31597087498a565',ramdisk_id='',reservation_id='r-5jowh2i5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-79577656',owner_user_name='tempest-ListServersNegativeTestJSON-79577656-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:17:30Z,user_data=None,user_id='8db5a8acb6d04c988f9dd4f74380c487',uuid=82cdb8e3-12f4-4363-948c-8d9b88d4cced,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.046 221324 DEBUG nova.network.os_vif_util [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converting VIF {"id": "61657494-08d3-460a-bc24-a6856464d636", "address": "fa:16:3e:e2:73:e1", "network": {"id": "4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-399826021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eafe22d6cfcb41d4b31597087498a565", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61657494-08", "ovs_interfaceid": "61657494-08d3-460a-bc24-a6856464d636", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.047 221324 DEBUG nova.network.os_vif_util [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:73:e1,bridge_name='br-int',has_traffic_filtering=True,id=61657494-08d3-460a-bc24-a6856464d636,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61657494-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.048 221324 DEBUG os_vif [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:73:e1,bridge_name='br-int',has_traffic_filtering=True,id=61657494-08d3-460a-bc24-a6856464d636,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61657494-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:18:01 np0005603622 podman[258279]: 2026-01-31 08:18:01.051065332 +0000 UTC m=+0.108070790 container cleanup 8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.051 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.052 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61657494-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.099 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:01 np0005603622 systemd[1]: libpod-conmon-8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252.scope: Deactivated successfully.
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.102 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.108 221324 INFO os_vif [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:73:e1,bridge_name='br-int',has_traffic_filtering=True,id=61657494-08d3-460a-bc24-a6856464d636,network=Network(4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61657494-08')#033[00m
Jan 31 03:18:01 np0005603622 podman[258312]: 2026-01-31 08:18:01.134803963 +0000 UTC m=+0.062071227 container remove 8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.140 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1a20aa64-b9ed-4e86-9f81-1f021874b486]: (4, ('Sat Jan 31 08:18:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 (8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252)\n8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252\nSat Jan 31 08:18:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 (8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252)\n8355c3cf497be3428ddd731228e9edb714e9bdef63ff7a59d907775525cd1252\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.144 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[734c869a-553e-4444-b4a7-e0bbe57dd092]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.146 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ba4d6d9-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.148 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:01 np0005603622 kernel: tap4ba4d6d9-c0: left promiscuous mode
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.158 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.163 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8c69bf-f0a5-4089-8a90-4c343831fe55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.182 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e29134fe-caf8-4384-9186-5e0d0cd1e788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.183 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[05009390-6746-49a7-8e49-935ffe93d54e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.204 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[16181d92-45b2-4a6a-98ee-45be2cc8c947]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669645, 'reachable_time': 19519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258347, 'error': None, 'target': 'ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:01 np0005603622 systemd[1]: run-netns-ovnmeta\x2d4ba4d6d9\x2dcb51\x2d4c5e\x2d9b78\x2ddca15ca271c9.mount: Deactivated successfully.
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.209 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4ba4d6d9-cb51-4c5e-9b78-dca15ca271c9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:18:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:01.210 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4ef414-3e31-4c72-9a82-1c3881ccff89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.268 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:18:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1898625822' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:18:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:18:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1898625822' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.944 221324 INFO nova.virt.libvirt.driver [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Deleting instance files /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced_del#033[00m
Jan 31 03:18:01 np0005603622 nova_compute[221301]: 2026-01-31 08:18:01.946 221324 INFO nova.virt.libvirt.driver [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Deletion of /var/lib/nova/instances/82cdb8e3-12f4-4363-948c-8d9b88d4cced_del complete#033[00m
Jan 31 03:18:02 np0005603622 nova_compute[221301]: 2026-01-31 08:18:02.219 221324 INFO nova.compute.manager [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Took 1.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:18:02 np0005603622 nova_compute[221301]: 2026-01-31 08:18:02.220 221324 DEBUG oslo.service.loopingcall [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:18:02 np0005603622 nova_compute[221301]: 2026-01-31 08:18:02.220 221324 DEBUG nova.compute.manager [-] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:18:02 np0005603622 nova_compute[221301]: 2026-01-31 08:18:02.221 221324 DEBUG nova.network.neutron [-] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:18:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:02.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:02.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:04 np0005603622 nova_compute[221301]: 2026-01-31 08:18:04.020 221324 DEBUG nova.compute.manager [req-463b4d4e-52d8-4147-a183-a705aca4208a req-de2a6da1-6c84-478a-ab6d-35d14ef2fec4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received event network-vif-unplugged-61657494-08d3-460a-bc24-a6856464d636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:04 np0005603622 nova_compute[221301]: 2026-01-31 08:18:04.022 221324 DEBUG oslo_concurrency.lockutils [req-463b4d4e-52d8-4147-a183-a705aca4208a req-de2a6da1-6c84-478a-ab6d-35d14ef2fec4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:04 np0005603622 nova_compute[221301]: 2026-01-31 08:18:04.024 221324 DEBUG oslo_concurrency.lockutils [req-463b4d4e-52d8-4147-a183-a705aca4208a req-de2a6da1-6c84-478a-ab6d-35d14ef2fec4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:04 np0005603622 nova_compute[221301]: 2026-01-31 08:18:04.024 221324 DEBUG oslo_concurrency.lockutils [req-463b4d4e-52d8-4147-a183-a705aca4208a req-de2a6da1-6c84-478a-ab6d-35d14ef2fec4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:04 np0005603622 nova_compute[221301]: 2026-01-31 08:18:04.024 221324 DEBUG nova.compute.manager [req-463b4d4e-52d8-4147-a183-a705aca4208a req-de2a6da1-6c84-478a-ab6d-35d14ef2fec4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] No waiting events found dispatching network-vif-unplugged-61657494-08d3-460a-bc24-a6856464d636 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:18:04 np0005603622 nova_compute[221301]: 2026-01-31 08:18:04.025 221324 DEBUG nova.compute.manager [req-463b4d4e-52d8-4147-a183-a705aca4208a req-de2a6da1-6c84-478a-ab6d-35d14ef2fec4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received event network-vif-unplugged-61657494-08d3-460a-bc24-a6856464d636 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:18:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:18:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:04.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:18:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:18:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1931040223' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:18:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:18:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1931040223' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:18:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:04.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:06 np0005603622 nova_compute[221301]: 2026-01-31 08:18:06.131 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:06 np0005603622 nova_compute[221301]: 2026-01-31 08:18:06.269 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:06.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:06.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:07 np0005603622 nova_compute[221301]: 2026-01-31 08:18:07.597 221324 DEBUG nova.compute.manager [req-a22e33c7-387e-4040-ae69-329ae59c6ce4 req-421d8bb7-7bda-43f1-896b-28fe1bfae48a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received event network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:07 np0005603622 nova_compute[221301]: 2026-01-31 08:18:07.598 221324 DEBUG oslo_concurrency.lockutils [req-a22e33c7-387e-4040-ae69-329ae59c6ce4 req-421d8bb7-7bda-43f1-896b-28fe1bfae48a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:07 np0005603622 nova_compute[221301]: 2026-01-31 08:18:07.598 221324 DEBUG oslo_concurrency.lockutils [req-a22e33c7-387e-4040-ae69-329ae59c6ce4 req-421d8bb7-7bda-43f1-896b-28fe1bfae48a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:07 np0005603622 nova_compute[221301]: 2026-01-31 08:18:07.599 221324 DEBUG oslo_concurrency.lockutils [req-a22e33c7-387e-4040-ae69-329ae59c6ce4 req-421d8bb7-7bda-43f1-896b-28fe1bfae48a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:07 np0005603622 nova_compute[221301]: 2026-01-31 08:18:07.599 221324 DEBUG nova.compute.manager [req-a22e33c7-387e-4040-ae69-329ae59c6ce4 req-421d8bb7-7bda-43f1-896b-28fe1bfae48a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] No waiting events found dispatching network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:18:07 np0005603622 nova_compute[221301]: 2026-01-31 08:18:07.600 221324 WARNING nova.compute.manager [req-a22e33c7-387e-4040-ae69-329ae59c6ce4 req-421d8bb7-7bda-43f1-896b-28fe1bfae48a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received unexpected event network-vif-plugged-61657494-08d3-460a-bc24-a6856464d636 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:18:08 np0005603622 podman[258349]: 2026-01-31 08:18:08.224544098 +0000 UTC m=+0.145194883 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller)
Jan 31 03:18:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:08.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:08 np0005603622 nova_compute[221301]: 2026-01-31 08:18:08.495 221324 DEBUG nova.network.neutron [-] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:18:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:08.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:08 np0005603622 nova_compute[221301]: 2026-01-31 08:18:08.681 221324 INFO nova.compute.manager [-] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Took 6.46 seconds to deallocate network for instance.#033[00m
Jan 31 03:18:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:09 np0005603622 nova_compute[221301]: 2026-01-31 08:18:09.152 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:09 np0005603622 nova_compute[221301]: 2026-01-31 08:18:09.153 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:09 np0005603622 nova_compute[221301]: 2026-01-31 08:18:09.252 221324 DEBUG oslo_concurrency.processutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:18:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2021059437' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:18:09 np0005603622 nova_compute[221301]: 2026-01-31 08:18:09.731 221324 DEBUG oslo_concurrency.processutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:09 np0005603622 nova_compute[221301]: 2026-01-31 08:18:09.741 221324 DEBUG nova.compute.provider_tree [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:18:10 np0005603622 nova_compute[221301]: 2026-01-31 08:18:10.034 221324 DEBUG nova.compute.manager [req-6e44ab8f-d9a7-476f-9a57-45364a8fdf07 req-48007027-c67d-4928-8ec5-df92ad605df0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Received event network-vif-deleted-61657494-08d3-460a-bc24-a6856464d636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:10 np0005603622 nova_compute[221301]: 2026-01-31 08:18:10.109 221324 DEBUG nova.scheduler.client.report [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:18:10 np0005603622 podman[258398]: 2026-01-31 08:18:10.148094072 +0000 UTC m=+0.075337546 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:18:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:10.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:10 np0005603622 nova_compute[221301]: 2026-01-31 08:18:10.502 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:10.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:10 np0005603622 nova_compute[221301]: 2026-01-31 08:18:10.644 221324 INFO nova.scheduler.client.report [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Deleted allocations for instance 82cdb8e3-12f4-4363-948c-8d9b88d4cced#033[00m
Jan 31 03:18:10 np0005603622 nova_compute[221301]: 2026-01-31 08:18:10.940 221324 DEBUG oslo_concurrency.lockutils [None req-5b9c8ced-95f2-44b8-9eef-796f5702df4a 8db5a8acb6d04c988f9dd4f74380c487 eafe22d6cfcb41d4b31597087498a565 - - default default] Lock "82cdb8e3-12f4-4363-948c-8d9b88d4cced" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:11 np0005603622 nova_compute[221301]: 2026-01-31 08:18:11.176 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:11 np0005603622 nova_compute[221301]: 2026-01-31 08:18:11.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:12.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:12.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:14.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:18:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:14.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:18:15 np0005603622 nova_compute[221301]: 2026-01-31 08:18:15.938 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847480.9370003, 82cdb8e3-12f4-4363-948c-8d9b88d4cced => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:18:15 np0005603622 nova_compute[221301]: 2026-01-31 08:18:15.939 221324 INFO nova.compute.manager [-] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:18:16 np0005603622 nova_compute[221301]: 2026-01-31 08:18:16.109 221324 DEBUG nova.compute.manager [None req-45207826-5b81-4be2-8b1d-2eeea0c14c06 - - - - - -] [instance: 82cdb8e3-12f4-4363-948c-8d9b88d4cced] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:18:16 np0005603622 nova_compute[221301]: 2026-01-31 08:18:16.238 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:16 np0005603622 nova_compute[221301]: 2026-01-31 08:18:16.273 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:16.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:16.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:18.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:18.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:20.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:20.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.276 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.407 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.408 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.591 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.851 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.852 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.862 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:18:21 np0005603622 nova_compute[221301]: 2026-01-31 08:18:21.862 221324 INFO nova.compute.claims [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:18:22 np0005603622 nova_compute[221301]: 2026-01-31 08:18:22.199 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:22.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:18:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2229668358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:18:22 np0005603622 nova_compute[221301]: 2026-01-31 08:18:22.604 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:22 np0005603622 nova_compute[221301]: 2026-01-31 08:18:22.611 221324 DEBUG nova.compute.provider_tree [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:18:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:22.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:22 np0005603622 nova_compute[221301]: 2026-01-31 08:18:22.736 221324 DEBUG nova.scheduler.client.report [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:18:22 np0005603622 nova_compute[221301]: 2026-01-31 08:18:22.881 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:22 np0005603622 nova_compute[221301]: 2026-01-31 08:18:22.882 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:18:23 np0005603622 nova_compute[221301]: 2026-01-31 08:18:23.445 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:18:23 np0005603622 nova_compute[221301]: 2026-01-31 08:18:23.445 221324 DEBUG nova.network.neutron [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:18:23 np0005603622 nova_compute[221301]: 2026-01-31 08:18:23.791 221324 INFO nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.143 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.196 221324 DEBUG nova.policy [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6ba0fce7089428195e6cdb716cb9394', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26804c65ac6d433ba336d37f70fdfa3f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:18:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:24.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.401 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:24.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.638992) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504639168, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2391, "num_deletes": 252, "total_data_size": 5710197, "memory_usage": 5791712, "flush_reason": "Manual Compaction"}
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504666994, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3744132, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44759, "largest_seqno": 47145, "table_properties": {"data_size": 3734529, "index_size": 6033, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20336, "raw_average_key_size": 20, "raw_value_size": 3715173, "raw_average_value_size": 3760, "num_data_blocks": 263, "num_entries": 988, "num_filter_entries": 988, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847288, "oldest_key_time": 1769847288, "file_creation_time": 1769847504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 27992 microseconds, and 12629 cpu microseconds.
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.667074) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3744132 bytes OK
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.667109) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.668695) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.668724) EVENT_LOG_v1 {"time_micros": 1769847504668713, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.668756) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5699760, prev total WAL file size 5699760, number of live WAL files 2.
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.670194) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3656KB)], [87(9481KB)]
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504670266, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 13453438, "oldest_snapshot_seqno": -1}
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.712 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.714 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.715 221324 INFO nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Creating image(s)
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 7213 keys, 11566922 bytes, temperature: kUnknown
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504756379, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 11566922, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11517946, "index_size": 29856, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18053, "raw_key_size": 185542, "raw_average_key_size": 25, "raw_value_size": 11388522, "raw_average_value_size": 1578, "num_data_blocks": 1183, "num_entries": 7213, "num_filter_entries": 7213, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847504, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.756925) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 11566922 bytes
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.760370) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.7 rd, 134.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.3 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.7) write-amplify(3.1) OK, records in: 7737, records dropped: 524 output_compression: NoCompression
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.760435) EVENT_LOG_v1 {"time_micros": 1769847504760389, "job": 54, "event": "compaction_finished", "compaction_time_micros": 85869, "compaction_time_cpu_micros": 45696, "output_level": 6, "num_output_files": 1, "total_output_size": 11566922, "num_input_records": 7737, "num_output_records": 7213, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504761201, "job": 54, "event": "table_file_deletion", "file_number": 89}
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.760 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847504763227, "job": 54, "event": "table_file_deletion", "file_number": 87}
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.670061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.763275) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.763313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.763316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.763319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:18:24.763322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.810 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.844 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.850 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.928 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.929 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.930 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.930 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.961 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:18:24 np0005603622 nova_compute[221301]: 2026-01-31 08:18:24.965 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:18:25 np0005603622 nova_compute[221301]: 2026-01-31 08:18:25.857 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.892s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:18:25 np0005603622 nova_compute[221301]: 2026-01-31 08:18:25.964 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] resizing rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.277 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.279 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:26.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:26.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.655 221324 DEBUG nova.objects.instance [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lazy-loading 'migration_context' on Instance uuid a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.989 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.990 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Ensure instance console log exists: /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.991 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.991 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:18:26 np0005603622 nova_compute[221301]: 2026-01-31 08:18:26.992 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:18:27 np0005603622 nova_compute[221301]: 2026-01-31 08:18:27.403 221324 DEBUG nova.network.neutron [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Successfully created port: 2d51188f-3dca-49ed-943d-60d42a394a35 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:18:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:28.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:28.529 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:28.531 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 03:18:28 np0005603622 nova_compute[221301]: 2026-01-31 08:18:28.567 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:18:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:28.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:18:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:30.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:30.872 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:18:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:30.873 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:18:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:30.873 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.178 221324 DEBUG nova.network.neutron [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Successfully updated port: 2d51188f-3dca-49ed-943d-60d42a394a35 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.272 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "refresh_cache-a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.272 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquired lock "refresh_cache-a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.272 221324 DEBUG nova.network.neutron [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.278 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.280 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.487 221324 DEBUG nova.compute.manager [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received event network-changed-2d51188f-3dca-49ed-943d-60d42a394a35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.487 221324 DEBUG nova.compute.manager [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Refreshing instance network info cache due to event network-changed-2d51188f-3dca-49ed-943d-60d42a394a35. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:18:31 np0005603622 nova_compute[221301]: 2026-01-31 08:18:31.487 221324 DEBUG oslo_concurrency.lockutils [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:18:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:31.533 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:18:32 np0005603622 nova_compute[221301]: 2026-01-31 08:18:32.016 221324 DEBUG nova.network.neutron [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:18:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:32.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:32.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:33 np0005603622 nova_compute[221301]: 2026-01-31 08:18:33.955 221324 DEBUG nova.network.neutron [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Updating instance_info_cache with network_info: [{"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.045 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Releasing lock "refresh_cache-a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.046 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Instance network_info: |[{"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.047 221324 DEBUG oslo_concurrency.lockutils [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.047 221324 DEBUG nova.network.neutron [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Refreshing network info cache for port 2d51188f-3dca-49ed-943d-60d42a394a35 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.055 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Start _get_guest_xml network_info=[{"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.063 221324 WARNING nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.074 221324 DEBUG nova.virt.libvirt.host [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.075 221324 DEBUG nova.virt.libvirt.host [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.102 221324 DEBUG nova.virt.libvirt.host [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.102 221324 DEBUG nova.virt.libvirt.host [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.105 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.106 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.107 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.107 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.108 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.108 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.108 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.109 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.109 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.110 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.111 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.111 221324 DEBUG nova.virt.hardware [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.116 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:34.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:18:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3015968488' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:18:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:34.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.646 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.686 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:18:34 np0005603622 nova_compute[221301]: 2026-01-31 08:18:34.692 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:18:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/234867490' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.173 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.176 221324 DEBUG nova.virt.libvirt.vif [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:18:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-792220334',display_name='tempest-ServerPasswordTestJSON-server-792220334',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-792220334',id=101,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26804c65ac6d433ba336d37f70fdfa3f',ramdisk_id='',reservation_id='r-y0vrp5wn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-783946140',owner_user_name='tempest-ServerPasswordTest
JSON-783946140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:18:24Z,user_data=None,user_id='e6ba0fce7089428195e6cdb716cb9394',uuid=a50751dc-85d5-4cd7-84a2-11c25f7ba2c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.177 221324 DEBUG nova.network.os_vif_util [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Converting VIF {"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.178 221324 DEBUG nova.network.os_vif_util [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:55:b6,bridge_name='br-int',has_traffic_filtering=True,id=2d51188f-3dca-49ed-943d-60d42a394a35,network=Network(ee2df140-a91d-4735-ae79-a422c3709462),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d51188f-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.180 221324 DEBUG nova.objects.instance [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lazy-loading 'pci_devices' on Instance uuid a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.389 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <uuid>a50751dc-85d5-4cd7-84a2-11c25f7ba2c6</uuid>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <name>instance-00000065</name>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerPasswordTestJSON-server-792220334</nova:name>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:18:34</nova:creationTime>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:user uuid="e6ba0fce7089428195e6cdb716cb9394">tempest-ServerPasswordTestJSON-783946140-project-member</nova:user>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:project uuid="26804c65ac6d433ba336d37f70fdfa3f">tempest-ServerPasswordTestJSON-783946140</nova:project>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <nova:port uuid="2d51188f-3dca-49ed-943d-60d42a394a35">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <entry name="serial">a50751dc-85d5-4cd7-84a2-11c25f7ba2c6</entry>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <entry name="uuid">a50751dc-85d5-4cd7-84a2-11c25f7ba2c6</entry>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk.config">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:8b:55:b6"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <target dev="tap2d51188f-3d"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/console.log" append="off"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:18:35 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:18:35 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:18:35 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:18:35 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.391 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Preparing to wait for external event network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.392 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.393 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.393 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.395 221324 DEBUG nova.virt.libvirt.vif [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:18:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-792220334',display_name='tempest-ServerPasswordTestJSON-server-792220334',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-792220334',id=101,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26804c65ac6d433ba336d37f70fdfa3f',ramdisk_id='',reservation_id='r-y0vrp5wn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-783946140',owner_user_name='tempest-ServerPasswordTestJSON-783946140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:18:24Z,user_data=None,user_id='e6ba0fce7089428195e6cdb716cb9394',uuid=a50751dc-85d5-4cd7-84a2-11c25f7ba2c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.395 221324 DEBUG nova.network.os_vif_util [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Converting VIF {"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.396 221324 DEBUG nova.network.os_vif_util [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:55:b6,bridge_name='br-int',has_traffic_filtering=True,id=2d51188f-3dca-49ed-943d-60d42a394a35,network=Network(ee2df140-a91d-4735-ae79-a422c3709462),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d51188f-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.397 221324 DEBUG os_vif [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:55:b6,bridge_name='br-int',has_traffic_filtering=True,id=2d51188f-3dca-49ed-943d-60d42a394a35,network=Network(ee2df140-a91d-4735-ae79-a422c3709462),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d51188f-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.398 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.399 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.399 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.403 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.404 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d51188f-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.404 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2d51188f-3d, col_values=(('external_ids', {'iface-id': '2d51188f-3dca-49ed-943d-60d42a394a35', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:55:b6', 'vm-uuid': 'a50751dc-85d5-4cd7-84a2-11c25f7ba2c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:35 np0005603622 NetworkManager[49080]: <info>  [1769847515.4094] manager: (tap2d51188f-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.410 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.413 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.415 221324 INFO os_vif [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:55:b6,bridge_name='br-int',has_traffic_filtering=True,id=2d51188f-3dca-49ed-943d-60d42a394a35,network=Network(ee2df140-a91d-4735-ae79-a422c3709462),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d51188f-3d')#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.613 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.614 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.615 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] No VIF found with MAC fa:16:3e:8b:55:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.615 221324 INFO nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Using config drive#033[00m
Jan 31 03:18:35 np0005603622 nova_compute[221301]: 2026-01-31 08:18:35.646 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:18:36 np0005603622 nova_compute[221301]: 2026-01-31 08:18:36.282 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:36.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:36.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:37 np0005603622 nova_compute[221301]: 2026-01-31 08:18:37.176 221324 INFO nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Creating config drive at /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/disk.config#033[00m
Jan 31 03:18:37 np0005603622 nova_compute[221301]: 2026-01-31 08:18:37.184 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpr71rq_mw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:37 np0005603622 nova_compute[221301]: 2026-01-31 08:18:37.326 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpr71rq_mw" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:37 np0005603622 nova_compute[221301]: 2026-01-31 08:18:37.369 221324 DEBUG nova.storage.rbd_utils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] rbd image a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:18:37 np0005603622 nova_compute[221301]: 2026-01-31 08:18:37.374 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/disk.config a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:38.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:38 np0005603622 nova_compute[221301]: 2026-01-31 08:18:38.567 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:38 np0005603622 nova_compute[221301]: 2026-01-31 08:18:38.568 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:38 np0005603622 nova_compute[221301]: 2026-01-31 08:18:38.568 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:38 np0005603622 nova_compute[221301]: 2026-01-31 08:18:38.569 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:38 np0005603622 nova_compute[221301]: 2026-01-31 08:18:38.569 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:18:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:18:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:38.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:18:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:39 np0005603622 podman[258731]: 2026-01-31 08:18:39.167650729 +0000 UTC m=+0.091490382 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.449 221324 DEBUG nova.network.neutron [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Updated VIF entry in instance network info cache for port 2d51188f-3dca-49ed-943d-60d42a394a35. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.450 221324 DEBUG nova.network.neutron [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Updating instance_info_cache with network_info: [{"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.489 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.489 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.490 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.509 221324 DEBUG oslo_concurrency.lockutils [req-5d592936-e8dc-457f-84d1-a3f0ba3de0d4 req-2e1829a2-e1cc-490a-b706-3056c92532d7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.709 221324 DEBUG oslo_concurrency.processutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/disk.config a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.335s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.710 221324 INFO nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Deleting local config drive /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6/disk.config because it was imported into RBD.#033[00m
Jan 31 03:18:39 np0005603622 kernel: tap2d51188f-3d: entered promiscuous mode
Jan 31 03:18:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:39Z|00438|binding|INFO|Claiming lport 2d51188f-3dca-49ed-943d-60d42a394a35 for this chassis.
Jan 31 03:18:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:39Z|00439|binding|INFO|2d51188f-3dca-49ed-943d-60d42a394a35: Claiming fa:16:3e:8b:55:b6 10.100.0.3
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.789 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:39 np0005603622 NetworkManager[49080]: <info>  [1769847519.7915] manager: (tap2d51188f-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/202)
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.797 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.800 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:39 np0005603622 systemd-machined[190406]: New machine qemu-43-instance-00000065.
Jan 31 03:18:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:39Z|00440|binding|INFO|Setting lport 2d51188f-3dca-49ed-943d-60d42a394a35 ovn-installed in OVS
Jan 31 03:18:39 np0005603622 nova_compute[221301]: 2026-01-31 08:18:39.824 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:39 np0005603622 systemd-udevd[258771]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:18:39 np0005603622 systemd[1]: Started Virtual Machine qemu-43-instance-00000065.
Jan 31 03:18:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:39Z|00441|binding|INFO|Setting lport 2d51188f-3dca-49ed-943d-60d42a394a35 up in Southbound
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.839 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:55:b6 10.100.0.3'], port_security=['fa:16:3e:8b:55:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'a50751dc-85d5-4cd7-84a2-11c25f7ba2c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee2df140-a91d-4735-ae79-a422c3709462', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26804c65ac6d433ba336d37f70fdfa3f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7c80c7fa-fc05-4698-8ef8-8057de32059f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8aedf088-c215-42b4-9d75-2fdd48c453fc, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2d51188f-3dca-49ed-943d-60d42a394a35) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.841 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2d51188f-3dca-49ed-943d-60d42a394a35 in datapath ee2df140-a91d-4735-ae79-a422c3709462 bound to our chassis#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.843 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee2df140-a91d-4735-ae79-a422c3709462#033[00m
Jan 31 03:18:39 np0005603622 NetworkManager[49080]: <info>  [1769847519.8504] device (tap2d51188f-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:18:39 np0005603622 NetworkManager[49080]: <info>  [1769847519.8516] device (tap2d51188f-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.857 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe51bfb3-86ff-4768-b9b1-7503acc460c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.858 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee2df140-a1 in ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.860 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee2df140-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.860 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[df490c51-3d0d-4371-91e0-cdfa85ea2d5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.862 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[22c3e2c6-d303-4a0d-b80a-b77ca41ff458]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.875 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[5219a0ab-97c6-4a18-af5f-ad40b3c0910c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.903 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[737169c3-e69e-47db-80f2-d5a852d5d727]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.939 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6d236e67-7331-4c36-9467-441860185b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.947 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d92f7ddb-93b0-4340-ae0a-aefb06d4ed39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 NetworkManager[49080]: <info>  [1769847519.9490] manager: (tapee2df140-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/203)
Jan 31 03:18:39 np0005603622 systemd-udevd[258774]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.968 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3525f586-cf48-4620-bb2b-7a57039c2b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.971 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3253e533-4f81-4cae-84da-b950f8363af5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:39 np0005603622 NetworkManager[49080]: <info>  [1769847519.9912] device (tapee2df140-a0): carrier: link connected
Jan 31 03:18:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:39.997 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8819ec-1dc7-47c3-9f08-4069c22c981a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.012 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd411d5-ce12-44e3-9f4f-b96a8f0b1149]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee2df140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:a7:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677133, 'reachable_time': 24961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258804, 'error': None, 'target': 'ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.029 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[03f19515-fb94-4f27-b8cf-5abe316b56e7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:a7ff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677133, 'tstamp': 677133}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258805, 'error': None, 'target': 'ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.051 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[36547687-2d22-4fac-bc6e-e5de6423cc46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee2df140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:a7:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677133, 'reachable_time': 24961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258806, 'error': None, 'target': 'ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.082 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3543dd39-04b9-4aae-a2aa-f942b8a9f2ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.132 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[81ef1f50-c0f3-4f74-af2e-c2716dc5b8cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.134 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee2df140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.135 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.136 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee2df140-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.177 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:40 np0005603622 NetworkManager[49080]: <info>  [1769847520.1780] manager: (tapee2df140-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Jan 31 03:18:40 np0005603622 kernel: tapee2df140-a0: entered promiscuous mode
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.180 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.182 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee2df140-a0, col_values=(('external_ids', {'iface-id': 'ac366c5f-ebae-4bae-98c7-1c4bf91edf43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:40 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:40Z|00442|binding|INFO|Releasing lport ac366c5f-ebae-4bae-98c7-1c4bf91edf43 from this chassis (sb_readonly=0)
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.190 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.190 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.191 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee2df140-a91d-4735-ae79-a422c3709462.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee2df140-a91d-4735-ae79-a422c3709462.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.192 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b8bcd7-6424-4076-913a-fe4ce9959ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.193 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-ee2df140-a91d-4735-ae79-a422c3709462
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/ee2df140-a91d-4735-ae79-a422c3709462.pid.haproxy
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID ee2df140-a91d-4735-ae79-a422c3709462
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:18:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:40.194 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462', 'env', 'PROCESS_TAG=haproxy-ee2df140-a91d-4735-ae79-a422c3709462', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee2df140-a91d-4735-ae79-a422c3709462.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:18:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:40.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.407 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.471 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.508 221324 DEBUG nova.compute.manager [req-d403d82a-fc26-455a-a241-0d93f016cf8a req-4222fd7a-4307-40c0-b3b9-87e1f45f8f23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received event network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.508 221324 DEBUG oslo_concurrency.lockutils [req-d403d82a-fc26-455a-a241-0d93f016cf8a req-4222fd7a-4307-40c0-b3b9-87e1f45f8f23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.509 221324 DEBUG oslo_concurrency.lockutils [req-d403d82a-fc26-455a-a241-0d93f016cf8a req-4222fd7a-4307-40c0-b3b9-87e1f45f8f23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.509 221324 DEBUG oslo_concurrency.lockutils [req-d403d82a-fc26-455a-a241-0d93f016cf8a req-4222fd7a-4307-40c0-b3b9-87e1f45f8f23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.510 221324 DEBUG nova.compute.manager [req-d403d82a-fc26-455a-a241-0d93f016cf8a req-4222fd7a-4307-40c0-b3b9-87e1f45f8f23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Processing event network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.517 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.517 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.518 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.518 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.519 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:40 np0005603622 podman[258858]: 2026-01-31 08:18:40.607275171 +0000 UTC m=+0.064313458 container create f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:18:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:40.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:40 np0005603622 systemd[1]: Started libpod-conmon-f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103.scope.
Jan 31 03:18:40 np0005603622 podman[258858]: 2026-01-31 08:18:40.570629451 +0000 UTC m=+0.027667788 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:18:40 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:18:40 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c1e2275a16d3fe436368917bc444e42670e994024eda656a5e2df4b1f2767b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:18:40 np0005603622 podman[258858]: 2026-01-31 08:18:40.702312058 +0000 UTC m=+0.159350365 container init f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:18:40 np0005603622 podman[258858]: 2026-01-31 08:18:40.706754078 +0000 UTC m=+0.163792345 container start f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Jan 31 03:18:40 np0005603622 neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462[258883]: [NOTICE]   (258916) : New worker (258918) forked
Jan 31 03:18:40 np0005603622 neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462[258883]: [NOTICE]   (258916) : Loading success.
Jan 31 03:18:40 np0005603622 podman[258873]: 2026-01-31 08:18:40.741460685 +0000 UTC m=+0.098282285 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 31 03:18:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:18:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/264163523' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:18:40 np0005603622 nova_compute[221301]: 2026-01-31 08:18:40.983 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.071 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847521.0707703, a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.073 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] VM Started (Lifecycle Event)#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.076 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.081 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.085 221324 INFO nova.virt.libvirt.driver [-] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Instance spawned successfully.#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.085 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.135 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.140 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.172 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.173 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.196 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.196 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.197 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.198 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.198 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.199 221324 DEBUG nova.virt.libvirt.driver [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.230 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.234 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847521.070918, a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.234 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.292 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.334 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.341 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847521.0802274, a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.341 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.366 221324 INFO nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Took 16.65 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.368 221324 DEBUG nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.408 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.413 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.485 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.522 221324 INFO nova.compute.manager [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Took 19.71 seconds to build instance.#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.526 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.527 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4518MB free_disk=20.967525482177734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.528 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.528 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.610 221324 DEBUG oslo_concurrency.lockutils [None req-d555293c-c87e-4362-9ca0-1572f74911af e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.684 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.684 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.684 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:18:41 np0005603622 nova_compute[221301]: 2026-01-31 08:18:41.749 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:18:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:18:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1239145919' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.207 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.214 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.251 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.329 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.329 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:42.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:42.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.757 221324 DEBUG nova.compute.manager [req-70764f45-5932-48f3-b3f4-af93b252e3ee req-eac22318-1f65-4df9-9b32-6e41d50998d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received event network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.757 221324 DEBUG oslo_concurrency.lockutils [req-70764f45-5932-48f3-b3f4-af93b252e3ee req-eac22318-1f65-4df9-9b32-6e41d50998d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.758 221324 DEBUG oslo_concurrency.lockutils [req-70764f45-5932-48f3-b3f4-af93b252e3ee req-eac22318-1f65-4df9-9b32-6e41d50998d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.758 221324 DEBUG oslo_concurrency.lockutils [req-70764f45-5932-48f3-b3f4-af93b252e3ee req-eac22318-1f65-4df9-9b32-6e41d50998d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.758 221324 DEBUG nova.compute.manager [req-70764f45-5932-48f3-b3f4-af93b252e3ee req-eac22318-1f65-4df9-9b32-6e41d50998d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] No waiting events found dispatching network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:18:42 np0005603622 nova_compute[221301]: 2026-01-31 08:18:42.758 221324 WARNING nova.compute.manager [req-70764f45-5932-48f3-b3f4-af93b252e3ee req-eac22318-1f65-4df9-9b32-6e41d50998d8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received unexpected event network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:18:43 np0005603622 nova_compute[221301]: 2026-01-31 08:18:43.254 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:43 np0005603622 nova_compute[221301]: 2026-01-31 08:18:43.255 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:18:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:44.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:44.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:45 np0005603622 nova_compute[221301]: 2026-01-31 08:18:45.409 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.035 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.036 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.036 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.037 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.069 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.071 221324 INFO nova.compute.manager [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Terminating instance#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.073 221324 DEBUG nova.compute.manager [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:18:46 np0005603622 kernel: tap2d51188f-3d (unregistering): left promiscuous mode
Jan 31 03:18:46 np0005603622 NetworkManager[49080]: <info>  [1769847526.2956] device (tap2d51188f-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:18:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:46.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.347 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:46Z|00443|binding|INFO|Releasing lport 2d51188f-3dca-49ed-943d-60d42a394a35 from this chassis (sb_readonly=0)
Jan 31 03:18:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:46Z|00444|binding|INFO|Setting lport 2d51188f-3dca-49ed-943d-60d42a394a35 down in Southbound
Jan 31 03:18:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:18:46Z|00445|binding|INFO|Removing iface tap2d51188f-3d ovn-installed in OVS
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.354 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000065.scope: Deactivated successfully.
Jan 31 03:18:46 np0005603622 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000065.scope: Consumed 5.892s CPU time.
Jan 31 03:18:46 np0005603622 systemd-machined[190406]: Machine qemu-43-instance-00000065 terminated.
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.406 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:55:b6 10.100.0.3'], port_security=['fa:16:3e:8b:55:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'a50751dc-85d5-4cd7-84a2-11c25f7ba2c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee2df140-a91d-4735-ae79-a422c3709462', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26804c65ac6d433ba336d37f70fdfa3f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7c80c7fa-fc05-4698-8ef8-8057de32059f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8aedf088-c215-42b4-9d75-2fdd48c453fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2d51188f-3dca-49ed-943d-60d42a394a35) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.408 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2d51188f-3dca-49ed-943d-60d42a394a35 in datapath ee2df140-a91d-4735-ae79-a422c3709462 unbound from our chassis#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.410 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee2df140-a91d-4735-ae79-a422c3709462, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.412 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[16b30535-60bc-4cac-966d-b4e04acc2ef7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.414 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462 namespace which is not needed anymore#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.515 221324 INFO nova.virt.libvirt.driver [-] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Instance destroyed successfully.#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.516 221324 DEBUG nova.objects.instance [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lazy-loading 'resources' on Instance uuid a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.570 221324 DEBUG nova.virt.libvirt.vif [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:18:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-792220334',display_name='tempest-ServerPasswordTestJSON-server-792220334',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-792220334',id=101,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:18:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='26804c65ac6d433ba336d37f70fdfa3f',ramdisk_id='',reservation_id='r-y0vrp5wn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-783946140',owner_user_name='tempest-ServerPasswordTestJSON-783946140-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:18:45Z,user_data=None,user_id='e6ba0fce7089428195e6cdb716cb9394',uuid=a50751dc-85d5-4cd7-84a2-11c25f7ba2c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.571 221324 DEBUG nova.network.os_vif_util [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Converting VIF {"id": "2d51188f-3dca-49ed-943d-60d42a394a35", "address": "fa:16:3e:8b:55:b6", "network": {"id": "ee2df140-a91d-4735-ae79-a422c3709462", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1285236338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26804c65ac6d433ba336d37f70fdfa3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d51188f-3d", "ovs_interfaceid": "2d51188f-3dca-49ed-943d-60d42a394a35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.574 221324 DEBUG nova.network.os_vif_util [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:55:b6,bridge_name='br-int',has_traffic_filtering=True,id=2d51188f-3dca-49ed-943d-60d42a394a35,network=Network(ee2df140-a91d-4735-ae79-a422c3709462),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d51188f-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.574 221324 DEBUG os_vif [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:55:b6,bridge_name='br-int',has_traffic_filtering=True,id=2d51188f-3dca-49ed-943d-60d42a394a35,network=Network(ee2df140-a91d-4735-ae79-a422c3709462),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d51188f-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.577 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.577 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d51188f-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.580 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.582 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.587 221324 INFO os_vif [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:55:b6,bridge_name='br-int',has_traffic_filtering=True,id=2d51188f-3dca-49ed-943d-60d42a394a35,network=Network(ee2df140-a91d-4735-ae79-a422c3709462),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d51188f-3d')#033[00m
Jan 31 03:18:46 np0005603622 neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462[258883]: [NOTICE]   (258916) : haproxy version is 2.8.14-c23fe91
Jan 31 03:18:46 np0005603622 neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462[258883]: [NOTICE]   (258916) : path to executable is /usr/sbin/haproxy
Jan 31 03:18:46 np0005603622 neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462[258883]: [WARNING]  (258916) : Exiting Master process...
Jan 31 03:18:46 np0005603622 neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462[258883]: [ALERT]    (258916) : Current worker (258918) exited with code 143 (Terminated)
Jan 31 03:18:46 np0005603622 neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462[258883]: [WARNING]  (258916) : All workers exited. Exiting... (0)
Jan 31 03:18:46 np0005603622 systemd[1]: libpod-f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103.scope: Deactivated successfully.
Jan 31 03:18:46 np0005603622 conmon[258883]: conmon f35f18ccd8c4b90247ce <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103.scope/container/memory.events
Jan 31 03:18:46 np0005603622 podman[259137]: 2026-01-31 08:18:46.633301899 +0000 UTC m=+0.071027289 container died f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:18:46 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103-userdata-shm.mount: Deactivated successfully.
Jan 31 03:18:46 np0005603622 systemd[1]: var-lib-containers-storage-overlay-4c1e2275a16d3fe436368917bc444e42670e994024eda656a5e2df4b1f2767b9-merged.mount: Deactivated successfully.
Jan 31 03:18:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:46.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:46 np0005603622 podman[259137]: 2026-01-31 08:18:46.662592099 +0000 UTC m=+0.100317489 container cleanup f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:18:46 np0005603622 systemd[1]: libpod-conmon-f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103.scope: Deactivated successfully.
Jan 31 03:18:46 np0005603622 podman[259184]: 2026-01-31 08:18:46.715086157 +0000 UTC m=+0.036732503 container remove f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.720 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe2ea96-e9c0-4e37-b488-28e2daaf6342]: (4, ('Sat Jan 31 08:18:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462 (f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103)\nf35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103\nSat Jan 31 08:18:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462 (f35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103)\nf35f18ccd8c4b90247cee4d55d50fb09b4cf6d3102a44af5fb4c702b58447103\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.722 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[69c6ef5c-16ca-44a2-9e06-5afaedd77494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.723 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee2df140-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.725 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 kernel: tapee2df140-a0: left promiscuous mode
Jan 31 03:18:46 np0005603622 nova_compute[221301]: 2026-01-31 08:18:46.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.735 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f3032d86-59b0-44a8-895a-99da8779c665]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.755 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6502865e-da0e-41c7-9431-745ac589ea05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.757 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[11284af5-f039-4fe5-a6a5-0ff9b026f931]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.769 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d14a93e7-5528-4119-be5b-ce11e39f383c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677127, 'reachable_time': 31950, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259199, 'error': None, 'target': 'ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:46 np0005603622 systemd[1]: run-netns-ovnmeta\x2dee2df140\x2da91d\x2d4735\x2dae79\x2da422c3709462.mount: Deactivated successfully.
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.772 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee2df140-a91d-4735-ae79-a422c3709462 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:18:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:18:46.772 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[aee3bf1e-9583-4f17-9c8b-ece64c18178b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:18:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:48.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:48.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:18:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:18:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:18:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:18:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:18:49 np0005603622 nova_compute[221301]: 2026-01-31 08:18:49.127 221324 DEBUG nova.compute.manager [req-747163c5-df8a-4e65-afb7-2281cf06131b req-e7efa7ab-5bd8-4a8b-8e3d-6b5afb100888 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received event network-vif-unplugged-2d51188f-3dca-49ed-943d-60d42a394a35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:49 np0005603622 nova_compute[221301]: 2026-01-31 08:18:49.128 221324 DEBUG oslo_concurrency.lockutils [req-747163c5-df8a-4e65-afb7-2281cf06131b req-e7efa7ab-5bd8-4a8b-8e3d-6b5afb100888 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:49 np0005603622 nova_compute[221301]: 2026-01-31 08:18:49.128 221324 DEBUG oslo_concurrency.lockutils [req-747163c5-df8a-4e65-afb7-2281cf06131b req-e7efa7ab-5bd8-4a8b-8e3d-6b5afb100888 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:49 np0005603622 nova_compute[221301]: 2026-01-31 08:18:49.129 221324 DEBUG oslo_concurrency.lockutils [req-747163c5-df8a-4e65-afb7-2281cf06131b req-e7efa7ab-5bd8-4a8b-8e3d-6b5afb100888 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:49 np0005603622 nova_compute[221301]: 2026-01-31 08:18:49.130 221324 DEBUG nova.compute.manager [req-747163c5-df8a-4e65-afb7-2281cf06131b req-e7efa7ab-5bd8-4a8b-8e3d-6b5afb100888 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] No waiting events found dispatching network-vif-unplugged-2d51188f-3dca-49ed-943d-60d42a394a35 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:18:49 np0005603622 nova_compute[221301]: 2026-01-31 08:18:49.130 221324 DEBUG nova.compute.manager [req-747163c5-df8a-4e65-afb7-2281cf06131b req-e7efa7ab-5bd8-4a8b-8e3d-6b5afb100888 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received event network-vif-unplugged-2d51188f-3dca-49ed-943d-60d42a394a35 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:18:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:50.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:50.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.352 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.411 221324 DEBUG nova.compute.manager [req-d726b204-695f-4a18-a9b8-86fb35e827b9 req-9875dcfb-e8fa-4357-a342-8f0053559fa8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received event network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.411 221324 DEBUG oslo_concurrency.lockutils [req-d726b204-695f-4a18-a9b8-86fb35e827b9 req-9875dcfb-e8fa-4357-a342-8f0053559fa8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.411 221324 DEBUG oslo_concurrency.lockutils [req-d726b204-695f-4a18-a9b8-86fb35e827b9 req-9875dcfb-e8fa-4357-a342-8f0053559fa8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.412 221324 DEBUG oslo_concurrency.lockutils [req-d726b204-695f-4a18-a9b8-86fb35e827b9 req-9875dcfb-e8fa-4357-a342-8f0053559fa8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.412 221324 DEBUG nova.compute.manager [req-d726b204-695f-4a18-a9b8-86fb35e827b9 req-9875dcfb-e8fa-4357-a342-8f0053559fa8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] No waiting events found dispatching network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.412 221324 WARNING nova.compute.manager [req-d726b204-695f-4a18-a9b8-86fb35e827b9 req-9875dcfb-e8fa-4357-a342-8f0053559fa8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received unexpected event network-vif-plugged-2d51188f-3dca-49ed-943d-60d42a394a35 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:18:51 np0005603622 nova_compute[221301]: 2026-01-31 08:18:51.587 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:52.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:52.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:54.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:54.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:54 np0005603622 nova_compute[221301]: 2026-01-31 08:18:54.873 221324 INFO nova.virt.libvirt.driver [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Deleting instance files /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_del#033[00m
Jan 31 03:18:54 np0005603622 nova_compute[221301]: 2026-01-31 08:18:54.875 221324 INFO nova.virt.libvirt.driver [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Deletion of /var/lib/nova/instances/a50751dc-85d5-4cd7-84a2-11c25f7ba2c6_del complete#033[00m
Jan 31 03:18:55 np0005603622 nova_compute[221301]: 2026-01-31 08:18:55.104 221324 INFO nova.compute.manager [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Took 9.03 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:18:55 np0005603622 nova_compute[221301]: 2026-01-31 08:18:55.105 221324 DEBUG oslo.service.loopingcall [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:18:55 np0005603622 nova_compute[221301]: 2026-01-31 08:18:55.106 221324 DEBUG nova.compute.manager [-] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:18:55 np0005603622 nova_compute[221301]: 2026-01-31 08:18:55.107 221324 DEBUG nova.network.neutron [-] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:18:56 np0005603622 nova_compute[221301]: 2026-01-31 08:18:56.356 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:56.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:56 np0005603622 nova_compute[221301]: 2026-01-31 08:18:56.633 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:18:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:18:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:56.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:18:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:18:58.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:58 np0005603622 nova_compute[221301]: 2026-01-31 08:18:58.503 221324 DEBUG nova.network.neutron [-] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:18:58 np0005603622 nova_compute[221301]: 2026-01-31 08:18:58.531 221324 DEBUG nova.compute.manager [req-25d48488-cf4f-4a41-ad5f-965cb6de566f req-edb135dd-c5cb-48d2-94d0-809f8ad10372 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Received event network-vif-deleted-2d51188f-3dca-49ed-943d-60d42a394a35 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:18:58 np0005603622 nova_compute[221301]: 2026-01-31 08:18:58.532 221324 INFO nova.compute.manager [req-25d48488-cf4f-4a41-ad5f-965cb6de566f req-edb135dd-c5cb-48d2-94d0-809f8ad10372 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Neutron deleted interface 2d51188f-3dca-49ed-943d-60d42a394a35; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:18:58 np0005603622 nova_compute[221301]: 2026-01-31 08:18:58.533 221324 DEBUG nova.network.neutron [req-25d48488-cf4f-4a41-ad5f-965cb6de566f req-edb135dd-c5cb-48d2-94d0-809f8ad10372 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:18:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:18:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:18:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:18:58.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:18:58 np0005603622 nova_compute[221301]: 2026-01-31 08:18:58.825 221324 INFO nova.compute.manager [-] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Took 3.72 seconds to deallocate network for instance.#033[00m
Jan 31 03:18:58 np0005603622 nova_compute[221301]: 2026-01-31 08:18:58.853 221324 DEBUG nova.compute.manager [req-25d48488-cf4f-4a41-ad5f-965cb6de566f req-edb135dd-c5cb-48d2-94d0-809f8ad10372 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Detach interface failed, port_id=2d51188f-3dca-49ed-943d-60d42a394a35, reason: Instance a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:18:59 np0005603622 nova_compute[221301]: 2026-01-31 08:18:59.050 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:18:59 np0005603622 nova_compute[221301]: 2026-01-31 08:18:59.050 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:18:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:18:59 np0005603622 nova_compute[221301]: 2026-01-31 08:18:59.805 221324 DEBUG oslo_concurrency.processutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3071150472' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:00.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:00.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:00 np0005603622 nova_compute[221301]: 2026-01-31 08:19:00.688 221324 DEBUG oslo_concurrency.processutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.883s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:00 np0005603622 nova_compute[221301]: 2026-01-31 08:19:00.696 221324 DEBUG nova.compute.provider_tree [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:19:00 np0005603622 nova_compute[221301]: 2026-01-31 08:19:00.750 221324 DEBUG nova.scheduler.client.report [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:19:00 np0005603622 nova_compute[221301]: 2026-01-31 08:19:00.845 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:00 np0005603622 nova_compute[221301]: 2026-01-31 08:19:00.944 221324 INFO nova.scheduler.client.report [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Deleted allocations for instance a50751dc-85d5-4cd7-84a2-11c25f7ba2c6#033[00m
Jan 31 03:19:01 np0005603622 nova_compute[221301]: 2026-01-31 08:19:01.261 221324 DEBUG oslo_concurrency.lockutils [None req-0db76ef0-ccd2-466c-86ce-1b522ce625e7 e6ba0fce7089428195e6cdb716cb9394 26804c65ac6d433ba336d37f70fdfa3f - - default default] Lock "a50751dc-85d5-4cd7-84a2-11c25f7ba2c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 15.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:01 np0005603622 nova_compute[221301]: 2026-01-31 08:19:01.357 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:01 np0005603622 nova_compute[221301]: 2026-01-31 08:19:01.514 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847526.512649, a50751dc-85d5-4cd7-84a2-11c25f7ba2c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:19:01 np0005603622 nova_compute[221301]: 2026-01-31 08:19:01.514 221324 INFO nova.compute.manager [-] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:19:01 np0005603622 nova_compute[221301]: 2026-01-31 08:19:01.623 221324 DEBUG nova.compute.manager [None req-3c031bb9-3c5a-41ec-b6c1-fe2e660cf627 - - - - - -] [instance: a50751dc-85d5-4cd7-84a2-11c25f7ba2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:19:01 np0005603622 nova_compute[221301]: 2026-01-31 08:19:01.638 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:02.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:02.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:04.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:04.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:19:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:19:06 np0005603622 nova_compute[221301]: 2026-01-31 08:19:06.359 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:06.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:06 np0005603622 nova_compute[221301]: 2026-01-31 08:19:06.639 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:06.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:08.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:08.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:10 np0005603622 podman[259274]: 2026-01-31 08:19:10.230055915 +0000 UTC m=+0.143299482 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Jan 31 03:19:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:10.268 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:19:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:10.270 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:19:10 np0005603622 nova_compute[221301]: 2026-01-31 08:19:10.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:10.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:10.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:11 np0005603622 podman[259301]: 2026-01-31 08:19:11.13366446 +0000 UTC m=+0.062008816 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.422 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.532 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.532 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.582 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.641 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.748 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.749 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.759 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:19:11 np0005603622 nova_compute[221301]: 2026-01-31 08:19:11.760 221324 INFO nova.compute.claims [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.041 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:12.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1475480796' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.465 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.473 221324 DEBUG nova.compute.provider_tree [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.530 221324 DEBUG nova.scheduler.client.report [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.689 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.690 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:19:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:12.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.938 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:19:12 np0005603622 nova_compute[221301]: 2026-01-31 08:19:12.939 221324 DEBUG nova.network.neutron [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:19:13 np0005603622 nova_compute[221301]: 2026-01-31 08:19:13.039 221324 INFO nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:19:13 np0005603622 nova_compute[221301]: 2026-01-31 08:19:13.153 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:19:13 np0005603622 nova_compute[221301]: 2026-01-31 08:19:13.282 221324 DEBUG nova.policy [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1d03198d8ab846bda092e089b2d5a6c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:19:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.286 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.288 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.289 221324 INFO nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Creating image(s)#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.330 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.374 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:19:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:14.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.418 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.423 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.494 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.495 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.496 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.497 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.542 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.549 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d0c13002-57d9-4fad-8579-7343af29719d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:14.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:14 np0005603622 nova_compute[221301]: 2026-01-31 08:19:14.908 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 d0c13002-57d9-4fad-8579-7343af29719d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.358s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.008 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] resizing rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.162 221324 DEBUG nova.objects.instance [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'migration_context' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.202 221324 DEBUG nova.network.neutron [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Successfully created port: cc59ad05-3242-4d5f-8eec-a2480d285193 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.305 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.306 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Ensure instance console log exists: /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.307 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.308 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:15 np0005603622 nova_compute[221301]: 2026-01-31 08:19:15.308 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:16.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:16 np0005603622 nova_compute[221301]: 2026-01-31 08:19:16.424 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:16 np0005603622 nova_compute[221301]: 2026-01-31 08:19:16.694 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:16.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:17 np0005603622 nova_compute[221301]: 2026-01-31 08:19:17.169 221324 DEBUG nova.network.neutron [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Successfully updated port: cc59ad05-3242-4d5f-8eec-a2480d285193 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:19:17 np0005603622 nova_compute[221301]: 2026-01-31 08:19:17.235 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:19:17 np0005603622 nova_compute[221301]: 2026-01-31 08:19:17.236 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:19:17 np0005603622 nova_compute[221301]: 2026-01-31 08:19:17.236 221324 DEBUG nova.network.neutron [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:19:17 np0005603622 nova_compute[221301]: 2026-01-31 08:19:17.379 221324 DEBUG nova.compute.manager [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:19:17 np0005603622 nova_compute[221301]: 2026-01-31 08:19:17.379 221324 DEBUG nova.compute.manager [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing instance network info cache due to event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:19:17 np0005603622 nova_compute[221301]: 2026-01-31 08:19:17.380 221324 DEBUG oslo_concurrency.lockutils [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:19:18 np0005603622 nova_compute[221301]: 2026-01-31 08:19:18.296 221324 DEBUG nova.network.neutron [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:19:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:18.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:18.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:19.272 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:19:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:20.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:20.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.252 221324 DEBUG nova.network.neutron [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.295 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.295 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance network_info: |[{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.296 221324 DEBUG oslo_concurrency.lockutils [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.296 221324 DEBUG nova.network.neutron [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.299 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start _get_guest_xml network_info=[{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.304 221324 WARNING nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.309 221324 DEBUG nova.virt.libvirt.host [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.309 221324 DEBUG nova.virt.libvirt.host [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.312 221324 DEBUG nova.virt.libvirt.host [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.313 221324 DEBUG nova.virt.libvirt.host [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.314 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.314 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.315 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.315 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.315 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.316 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.316 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.316 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.317 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.317 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.317 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.318 221324 DEBUG nova.virt.hardware [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.322 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.740 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:19:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/801041898' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.793 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.825 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:19:21 np0005603622 nova_compute[221301]: 2026-01-31 08:19:21.830 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:19:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1063825812' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.256 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.259 221324 DEBUG nova.virt.libvirt.vif [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:19:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.260 221324 DEBUG nova.network.os_vif_util [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.261 221324 DEBUG nova.network.os_vif_util [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.263 221324 DEBUG nova.objects.instance [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.361 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <uuid>d0c13002-57d9-4fad-8579-7343af29719d</uuid>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <name>instance-00000068</name>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestJSON-server-922415262</nova:name>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:19:21</nova:creationTime>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <nova:port uuid="cc59ad05-3242-4d5f-8eec-a2480d285193">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <entry name="serial">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <entry name="uuid">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk.config">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:b9:24:4f"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <target dev="tapcc59ad05-32"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/console.log" append="off"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:19:22 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:19:22 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:19:22 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:19:22 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.363 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Preparing to wait for external event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.364 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.365 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.365 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.367 221324 DEBUG nova.virt.libvirt.vif [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:19:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.368 221324 DEBUG nova.network.os_vif_util [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.369 221324 DEBUG nova.network.os_vif_util [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.370 221324 DEBUG os_vif [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.371 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.372 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.373 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.378 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.379 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc59ad05-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.380 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc59ad05-32, col_values=(('external_ids', {'iface-id': 'cc59ad05-3242-4d5f-8eec-a2480d285193', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:24:4f', 'vm-uuid': 'd0c13002-57d9-4fad-8579-7343af29719d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:19:22 np0005603622 NetworkManager[49080]: <info>  [1769847562.3847] manager: (tapcc59ad05-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.391 221324 INFO os_vif [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:19:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:22.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.496 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.497 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.497 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No VIF found with MAC fa:16:3e:b9:24:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.497 221324 INFO nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Using config drive#033[00m
Jan 31 03:19:22 np0005603622 nova_compute[221301]: 2026-01-31 08:19:22.528 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:19:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:22.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:24.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:24.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.127 221324 INFO nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Creating config drive at /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/disk.config#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.132 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpiq2y2ay_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.150 221324 DEBUG nova.network.neutron [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated VIF entry in instance network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.151 221324 DEBUG nova.network.neutron [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.255 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpiq2y2ay_" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.296 221324 DEBUG nova.storage.rbd_utils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image d0c13002-57d9-4fad-8579-7343af29719d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.303 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/disk.config d0c13002-57d9-4fad-8579-7343af29719d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.338 221324 DEBUG oslo_concurrency.lockutils [req-7451217a-b3e2-452f-aa7b-7e6653636e5f req-e64539fc-746d-4ba7-a57a-8615645f7452 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.977 221324 DEBUG oslo_concurrency.processutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/disk.config d0c13002-57d9-4fad-8579-7343af29719d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.674s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:25 np0005603622 nova_compute[221301]: 2026-01-31 08:19:25.978 221324 INFO nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Deleting local config drive /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/disk.config because it was imported into RBD.#033[00m
Jan 31 03:19:26 np0005603622 kernel: tapcc59ad05-32: entered promiscuous mode
Jan 31 03:19:26 np0005603622 NetworkManager[49080]: <info>  [1769847566.0514] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.051 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:26Z|00446|binding|INFO|Claiming lport cc59ad05-3242-4d5f-8eec-a2480d285193 for this chassis.
Jan 31 03:19:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:26Z|00447|binding|INFO|cc59ad05-3242-4d5f-8eec-a2480d285193: Claiming fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.057 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 systemd-machined[190406]: New machine qemu-44-instance-00000068.
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.086 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.087 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.089 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.096 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.098 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4706bb9c-3fde-4dc1-b0f9-bce8d8175035]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.099 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:19:26 np0005603622 systemd[1]: Started Virtual Machine qemu-44-instance-00000068.
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.102 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.102 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd61441-22ec-43a3-8b1f-7dbc1c22ba14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.103 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d06ef61d-9cd5-4304-bf7a-b4a41abefd4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.104 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:26Z|00448|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 ovn-installed in OVS
Jan 31 03:19:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:26Z|00449|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 up in Southbound
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.109 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 systemd-udevd[259646]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.115 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4cb22e-1455-485b-b5ab-599441b7d744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 NetworkManager[49080]: <info>  [1769847566.1246] device (tapcc59ad05-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:19:26 np0005603622 NetworkManager[49080]: <info>  [1769847566.1259] device (tapcc59ad05-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.135 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a20d13a0-aa72-4e2a-84f7-e2a7861046a1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.164 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[41db9d83-7255-4426-8557-3af8a2760fbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 systemd-udevd[259653]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.170 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[62cffb36-7d95-43e0-b80b-b17e86946072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 NetworkManager[49080]: <info>  [1769847566.1721] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/207)
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.201 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6401a35b-7321-4cf0-bba7-8aaab34c0be5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.204 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d034922f-0fa4-4556-98a3-d349edb80950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 NetworkManager[49080]: <info>  [1769847566.2253] device (tap1186b71b-00): carrier: link connected
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.229 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bece4a5c-5df7-46d0-9ced-28a1200e3edb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.243 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8a0593-1303-4edb-8db0-51efef972c0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681756, 'reachable_time': 29154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259678, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.263 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[92f39fc0-0e18-47dc-a9ad-d3d4d5389995]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 681756, 'tstamp': 681756}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259679, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.276 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3270ccd5-f58d-4418-8adb-d907ba703165]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681756, 'reachable_time': 29154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259680, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.300 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[482e091f-e692-4f2d-8c0b-e7987eaf170d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.340 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88465269-59fe-496a-9173-2d587077cf1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.342 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.342 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.342 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:19:26 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:19:26 np0005603622 NetworkManager[49080]: <info>  [1769847566.3449] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.344 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.350 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:19:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:26Z|00450|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.351 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.354 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.355 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.356 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a1afa448-002e-4fbd-a4bc-74c97d6f6675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.357 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:19:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:26.357 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:19:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:26.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:26 np0005603622 nova_compute[221301]: 2026-01-31 08:19:26.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:26.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:26 np0005603622 podman[259730]: 2026-01-31 08:19:26.731103901 +0000 UTC m=+0.053562818 container create 90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:19:26 np0005603622 systemd[1]: Started libpod-conmon-90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff.scope.
Jan 31 03:19:26 np0005603622 podman[259730]: 2026-01-31 08:19:26.695684995 +0000 UTC m=+0.018143942 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:19:26 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:19:26 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aeb948f695618a203697648edc29489960eca72615268c541c624e01a90aaf3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:19:26 np0005603622 podman[259730]: 2026-01-31 08:19:26.825048549 +0000 UTC m=+0.147507476 container init 90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:19:26 np0005603622 podman[259730]: 2026-01-31 08:19:26.833368603 +0000 UTC m=+0.155827520 container start 90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:19:26 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[259746]: [NOTICE]   (259757) : New worker (259759) forked
Jan 31 03:19:26 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[259746]: [NOTICE]   (259757) : Loading success.
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.043 221324 DEBUG nova.compute.manager [req-7d1abbd7-003f-4e7b-9ce8-ffd2b49cddc7 req-d63dd7f9-b562-4c80-83e9-d213a6508a17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.044 221324 DEBUG oslo_concurrency.lockutils [req-7d1abbd7-003f-4e7b-9ce8-ffd2b49cddc7 req-d63dd7f9-b562-4c80-83e9-d213a6508a17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.047 221324 DEBUG oslo_concurrency.lockutils [req-7d1abbd7-003f-4e7b-9ce8-ffd2b49cddc7 req-d63dd7f9-b562-4c80-83e9-d213a6508a17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.048 221324 DEBUG oslo_concurrency.lockutils [req-7d1abbd7-003f-4e7b-9ce8-ffd2b49cddc7 req-d63dd7f9-b562-4c80-83e9-d213a6508a17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.048 221324 DEBUG nova.compute.manager [req-7d1abbd7-003f-4e7b-9ce8-ffd2b49cddc7 req-d63dd7f9-b562-4c80-83e9-d213a6508a17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Processing event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.057 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847567.057187, d0c13002-57d9-4fad-8579-7343af29719d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.058 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Started (Lifecycle Event)#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.060 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.063 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.068 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance spawned successfully.#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.068 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.106 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.115 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.123 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.124 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.125 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.125 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.126 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.127 221324 DEBUG nova.virt.libvirt.driver [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.177 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.178 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847567.0574453, d0c13002-57d9-4fad-8579-7343af29719d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.178 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.245 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.250 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847567.0624266, d0c13002-57d9-4fad-8579-7343af29719d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.250 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.289 221324 INFO nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Took 13.00 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.289 221324 DEBUG nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.327 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.332 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.382 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.407 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.444 221324 INFO nova.compute.manager [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Took 15.73 seconds to build instance.#033[00m
Jan 31 03:19:27 np0005603622 nova_compute[221301]: 2026-01-31 08:19:27.500 221324 DEBUG oslo_concurrency.lockutils [None req-85f1f7e4-2afd-448c-a20c-0b8bde219cc6 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:28.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:19:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:28.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:19:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:29 np0005603622 nova_compute[221301]: 2026-01-31 08:19:29.499 221324 DEBUG nova.compute.manager [req-ec31ba51-9a4c-4a89-9e0e-79d76db23fc5 req-fce66c75-ae3e-48c0-947b-7910c82da24b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:19:29 np0005603622 nova_compute[221301]: 2026-01-31 08:19:29.499 221324 DEBUG oslo_concurrency.lockutils [req-ec31ba51-9a4c-4a89-9e0e-79d76db23fc5 req-fce66c75-ae3e-48c0-947b-7910c82da24b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:29 np0005603622 nova_compute[221301]: 2026-01-31 08:19:29.499 221324 DEBUG oslo_concurrency.lockutils [req-ec31ba51-9a4c-4a89-9e0e-79d76db23fc5 req-fce66c75-ae3e-48c0-947b-7910c82da24b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:29 np0005603622 nova_compute[221301]: 2026-01-31 08:19:29.499 221324 DEBUG oslo_concurrency.lockutils [req-ec31ba51-9a4c-4a89-9e0e-79d76db23fc5 req-fce66c75-ae3e-48c0-947b-7910c82da24b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:29 np0005603622 nova_compute[221301]: 2026-01-31 08:19:29.500 221324 DEBUG nova.compute.manager [req-ec31ba51-9a4c-4a89-9e0e-79d76db23fc5 req-fce66c75-ae3e-48c0-947b-7910c82da24b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:19:29 np0005603622 nova_compute[221301]: 2026-01-31 08:19:29.500 221324 WARNING nova.compute.manager [req-ec31ba51-9a4c-4a89-9e0e-79d76db23fc5 req-fce66c75-ae3e-48c0-947b-7910c82da24b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:19:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:30.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:30.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:30.873 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:30.874 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:30.875 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:31 np0005603622 nova_compute[221301]: 2026-01-31 08:19:31.431 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:32 np0005603622 nova_compute[221301]: 2026-01-31 08:19:32.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:32.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:32.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:33 np0005603622 nova_compute[221301]: 2026-01-31 08:19:33.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:33 np0005603622 nova_compute[221301]: 2026-01-31 08:19:33.684 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:33 np0005603622 NetworkManager[49080]: <info>  [1769847573.6858] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Jan 31 03:19:33 np0005603622 NetworkManager[49080]: <info>  [1769847573.6900] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 31 03:19:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:33Z|00451|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:19:33 np0005603622 nova_compute[221301]: 2026-01-31 08:19:33.714 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:33 np0005603622 nova_compute[221301]: 2026-01-31 08:19:33.724 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:34.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:34 np0005603622 nova_compute[221301]: 2026-01-31 08:19:34.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:34.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:34 np0005603622 nova_compute[221301]: 2026-01-31 08:19:34.743 221324 DEBUG nova.compute.manager [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:19:34 np0005603622 nova_compute[221301]: 2026-01-31 08:19:34.743 221324 DEBUG nova.compute.manager [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing instance network info cache due to event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:19:34 np0005603622 nova_compute[221301]: 2026-01-31 08:19:34.744 221324 DEBUG oslo_concurrency.lockutils [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:19:34 np0005603622 nova_compute[221301]: 2026-01-31 08:19:34.744 221324 DEBUG oslo_concurrency.lockutils [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:19:34 np0005603622 nova_compute[221301]: 2026-01-31 08:19:34.744 221324 DEBUG nova.network.neutron [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:19:35 np0005603622 nova_compute[221301]: 2026-01-31 08:19:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:35 np0005603622 nova_compute[221301]: 2026-01-31 08:19:35.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:19:36 np0005603622 nova_compute[221301]: 2026-01-31 08:19:36.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:36.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:36 np0005603622 nova_compute[221301]: 2026-01-31 08:19:36.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:36.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:37 np0005603622 nova_compute[221301]: 2026-01-31 08:19:37.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:38.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:38.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:39 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #48. Immutable memtables: 5.
Jan 31 03:19:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:39 np0005603622 nova_compute[221301]: 2026-01-31 08:19:39.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.353 221324 DEBUG nova.network.neutron [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated VIF entry in instance network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.354 221324 DEBUG nova.network.neutron [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:19:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:40.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.633 221324 DEBUG oslo_concurrency.lockutils [req-e178e4b0-2778-4370-bbdf-7e0b7717846d req-b0d8af4b-8ec9-472b-b023-62fcf6dfd570 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:19:40 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:40Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:19:40 np0005603622 ovn_controller[130109]: 2026-01-31T08:19:40Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:19:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:40.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.767 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.768 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.768 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:19:40 np0005603622 nova_compute[221301]: 2026-01-31 08:19:40.768 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:19:41 np0005603622 podman[259786]: 2026-01-31 08:19:41.149677942 +0000 UTC m=+0.071602735 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:19:41 np0005603622 podman[259813]: 2026-01-31 08:19:41.231353088 +0000 UTC m=+0.059332804 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:19:41 np0005603622 nova_compute[221301]: 2026-01-31 08:19:41.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:42 np0005603622 nova_compute[221301]: 2026-01-31 08:19:42.391 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:42.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:42.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.317 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:19:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:44.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.558 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.559 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.559 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.560 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.560 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.560 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.707 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.708 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.709 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.709 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:19:44 np0005603622 nova_compute[221301]: 2026-01-31 08:19:44.709 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:44.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2651465628' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.184 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.489 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.491 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.706 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.708 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4435MB free_disk=20.942874908447266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.709 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.710 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.809 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.985 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance d0c13002-57d9-4fad-8579-7343af29719d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.986 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:19:45 np0005603622 nova_compute[221301]: 2026-01-31 08:19:45.986 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:19:46 np0005603622 nova_compute[221301]: 2026-01-31 08:19:46.277 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:19:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:46.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:46 np0005603622 nova_compute[221301]: 2026-01-31 08:19:46.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:19:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/464541162' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:19:46 np0005603622 nova_compute[221301]: 2026-01-31 08:19:46.752 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:19:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:46.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:46 np0005603622 nova_compute[221301]: 2026-01-31 08:19:46.760 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:19:46 np0005603622 nova_compute[221301]: 2026-01-31 08:19:46.801 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:19:46 np0005603622 nova_compute[221301]: 2026-01-31 08:19:46.894 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:19:46 np0005603622 nova_compute[221301]: 2026-01-31 08:19:46.895 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:19:47 np0005603622 nova_compute[221301]: 2026-01-31 08:19:47.393 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:48.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:48.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:50.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:50.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:51 np0005603622 nova_compute[221301]: 2026-01-31 08:19:51.442 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:52 np0005603622 nova_compute[221301]: 2026-01-31 08:19:52.396 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:52.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:52.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:54.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:19:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:54.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:19:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:54.962 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:19:54 np0005603622 nova_compute[221301]: 2026-01-31 08:19:54.962 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:19:54.964 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:19:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:56.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:56 np0005603622 nova_compute[221301]: 2026-01-31 08:19:56.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:56.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:57 np0005603622 nova_compute[221301]: 2026-01-31 08:19:57.405 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:19:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:19:58.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:19:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:19:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:19:58.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:19:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:19:59 np0005603622 nova_compute[221301]: 2026-01-31 08:19:59.788 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:59 np0005603622 nova_compute[221301]: 2026-01-31 08:19:59.789 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:59 np0005603622 nova_compute[221301]: 2026-01-31 08:19:59.955 221324 DEBUG oslo_concurrency.lockutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:19:59 np0005603622 nova_compute[221301]: 2026-01-31 08:19:59.956 221324 DEBUG oslo_concurrency.lockutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:19:59 np0005603622 nova_compute[221301]: 2026-01-31 08:19:59.957 221324 INFO nova.compute.manager [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Rebooting instance#033[00m
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.069 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:20:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.387 221324 DEBUG oslo_concurrency.lockutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.388 221324 DEBUG oslo_concurrency.lockutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.388 221324 DEBUG nova.network.neutron [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:20:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:00.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:00.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.908 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.908 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.916 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:20:00 np0005603622 nova_compute[221301]: 2026-01-31 08:20:00.916 221324 INFO nova.compute.claims [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:20:01 np0005603622 nova_compute[221301]: 2026-01-31 08:20:01.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:01 np0005603622 nova_compute[221301]: 2026-01-31 08:20:01.594 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:20:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2345005676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.032 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.041 221324 DEBUG nova.compute.provider_tree [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.093 221324 DEBUG nova.scheduler.client.report [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.396 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.398 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.399 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.399 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.400 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.400 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.411 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.421 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.422 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:20:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:02.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:02.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.872 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.873 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Image id 0864ca59-9877-4e6d-adfc-f0a3204ed8f8 yields fingerprint 365f9823d2619ef09948bdeed685488da63755b5 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.873 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] image 0864ca59-9877-4e6d-adfc-f0a3204ed8f8 at (/var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5): checking#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.873 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] image 0864ca59-9877-4e6d-adfc-f0a3204ed8f8 at (/var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.875 221324 INFO oslo.privsep.daemon [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpf5xaf2k4/privsep.sock']#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.963 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:20:02 np0005603622 nova_compute[221301]: 2026-01-31 08:20:02.965 221324 DEBUG nova.network.neutron [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.289 221324 DEBUG nova.policy [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a80ca71875e8413caa2b52e679e1dd40', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782758ebebe64580accb21a22280e02f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.450 221324 INFO nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.521 221324 INFO oslo.privsep.daemon [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.415 259905 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.420 259905 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.423 259905 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.424 259905 INFO oslo.privsep.daemon [-] privsep daemon running as pid 259905#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.622 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.623 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Image id 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 yields fingerprint b1c202daae0a5d5b639e0239462ea0d46fe633d6 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.623 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] image 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 at (/var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6): checking#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.623 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] image 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 at (/var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.625 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] d0c13002-57d9-4fad-8579-7343af29719d is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.625 221324 WARNING nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.625 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Active base files: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.626 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Removable base files: /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.626 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.626 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.627 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.627 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
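The image-cache "fingerprint" values above are consistent with a plain SHA-1 over the image id: the empty id yields `da39a3ee5e6b4b0d3255bfef95601890afd80709`, which is exactly SHA-1 of the empty string, and the fingerprint names the `_base` file checked afterwards. A sketch under that assumption:

```python
import hashlib

def base_fingerprint(image_id: str) -> str:
    """SHA-1 hex digest of the image id, which appears to be how the
    _base file names in the log are derived (an assumption based on the
    empty-id fingerprint matching SHA-1(''))."""
    return hashlib.sha1(image_id.encode("utf-8")).hexdigest()

# Empty image id, as in the log: the well-known SHA-1 of "".
print(base_fingerprint(""))  # da39a3ee5e6b4b0d3255bfef95601890afd80709

# Image 37c0ea6b-... should map to the _base file name seen in the log.
print(base_fingerprint("37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"))
```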
Jan 31 03:20:03 np0005603622 nova_compute[221301]: 2026-01-31 08:20:03.889 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:20:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:04.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:04.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
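The radosgw `beast:` access lines above follow a fixed shape: connection pointer, client IP, user, bracketed timestamp, quoted request line, HTTP status, byte count, and a trailing latency. A minimal hedged parser for that shape — the regex and group names are ours, not a format radosgw guarantees:

```python
import re

# Illustrative regex for the beast access-log lines seen in this log;
# field names are our own labels, not radosgw terminology.
BEAST_RE = re.compile(
    r'^beast: \S+: (?P<ip>\S+) - (?P<user>\S+) '
    r'\[(?P<ts>[^\]]+)\] "(?P<request>[^"]+)" '
    r'(?P<status>\d+) (?P<bytes>\d+) .* latency=(?P<latency>[\d.]+)s$'
)

# One of the lines above, minus the syslog prefix.
line = ('beast: 0x7f517db166f0: 192.168.122.102 - anonymous '
        '[31/Jan/2026:08:20:04.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.000000000s')
m = BEAST_RE.match(line)
```

These anonymous `HEAD /` probes every two seconds from 192.168.122.100/.102 look like load-balancer health checks, which is why they recur throughout the log.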
Jan 31 03:20:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:04.965 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.432 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.434 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.434 221324 INFO nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Creating image(s)#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.473 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.515 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.556 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.562 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.639 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.640 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "365f9823d2619ef09948bdeed685488da63755b5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.641 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.641 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "365f9823d2619ef09948bdeed685488da63755b5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.681 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:05 np0005603622 nova_compute[221301]: 2026-01-31 08:20:05.688 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 095f5561-959f-440e-8e6e-2521170b775d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:06.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:06 np0005603622 nova_compute[221301]: 2026-01-31 08:20:06.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:06.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:06 np0005603622 nova_compute[221301]: 2026-01-31 08:20:06.804 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5 095f5561-959f-440e-8e6e-2521170b775d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
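oslo.concurrency logs each external command twice: once when spawned ("Running cmd") and once on completion with its exit code and wall time ("returned: 0 in 1.115s"), as with the `rbd import` above. A sketch extracting command, return code, and duration from the completion message — the regex is our own, not an oslo-provided format string:

```python
import re

# Illustrative parser for oslo_concurrency.processutils completion lines.
CMD_RE = re.compile(r'CMD "(?P<cmd>.+)" returned: (?P<rc>-?\d+) in (?P<secs>[\d.]+)s')

# The completion message from the log line above.
msg = ('CMD "rbd import --pool vms /var/lib/nova/instances/_base/'
       '365f9823d2619ef09948bdeed685488da63755b5 '
       '095f5561-959f-440e-8e6e-2521170b775d_disk --image-format=2 '
       '--id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.115s')
m = CMD_RE.search(msg)
```

Pairing these durations with the preceding "Running cmd" lines is a quick way to spot slow subprocess calls (here, the RBD import of the 1 GiB base image took about 1.1 s).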
Jan 31 03:20:06 np0005603622 nova_compute[221301]: 2026-01-31 08:20:06.908 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] resizing rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:20:07 np0005603622 nova_compute[221301]: 2026-01-31 08:20:07.127 221324 DEBUG nova.network.neutron [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:07 np0005603622 nova_compute[221301]: 2026-01-31 08:20:07.408 221324 DEBUG oslo_concurrency.lockutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:07 np0005603622 nova_compute[221301]: 2026-01-31 08:20:07.414 221324 DEBUG nova.compute.manager [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:07 np0005603622 nova_compute[221301]: 2026-01-31 08:20:07.415 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:20:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:20:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:20:07 np0005603622 nova_compute[221301]: 2026-01-31 08:20:07.915 221324 DEBUG nova.objects.instance [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'migration_context' on Instance uuid 095f5561-959f-440e-8e6e-2521170b775d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:08 np0005603622 nova_compute[221301]: 2026-01-31 08:20:08.161 221324 DEBUG nova.network.neutron [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Successfully created port: 84df7d74-b3d1-4c52-9231-e0563e927d0f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:20:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:08.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:08.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:08 np0005603622 nova_compute[221301]: 2026-01-31 08:20:08.879 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:20:08 np0005603622 nova_compute[221301]: 2026-01-31 08:20:08.879 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Ensure instance console log exists: /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:20:08 np0005603622 nova_compute[221301]: 2026-01-31 08:20:08.880 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:08 np0005603622 nova_compute[221301]: 2026-01-31 08:20:08.881 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:08 np0005603622 nova_compute[221301]: 2026-01-31 08:20:08.881 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:09 np0005603622 kernel: tapcc59ad05-32 (unregistering): left promiscuous mode
Jan 31 03:20:09 np0005603622 NetworkManager[49080]: <info>  [1769847609.6037] device (tapcc59ad05-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.604 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.619 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00452|binding|INFO|Releasing lport cc59ad05-3242-4d5f-8eec-a2480d285193 from this chassis (sb_readonly=0)
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00453|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 down in Southbound
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00454|binding|INFO|Removing iface tapcc59ad05-32 ovn-installed in OVS
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.622 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603622 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 31 03:20:09 np0005603622 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000068.scope: Consumed 15.194s CPU time.
Jan 31 03:20:09 np0005603622 systemd-machined[190406]: Machine qemu-44-instance-00000068 terminated.
Jan 31 03:20:09 np0005603622 kernel: tapcc59ad05-32: entered promiscuous mode
Jan 31 03:20:09 np0005603622 kernel: tapcc59ad05-32 (unregistering): left promiscuous mode
Jan 31 03:20:09 np0005603622 NetworkManager[49080]: <info>  [1769847609.7137] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00455|if_status|INFO|Dropped 1 log messages in last 245 seconds (most recently, 245 seconds ago) due to excessive rate
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00456|if_status|INFO|Not updating pb chassis for cc59ad05-3242-4d5f-8eec-a2480d285193 now as sb is readonly
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.716 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00457|binding|INFO|Releasing lport cc59ad05-3242-4d5f-8eec-a2480d285193 from this chassis (sb_readonly=1)
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00458|if_status|INFO|Dropped 28 log messages in last 178 seconds (most recently, 177 seconds ago) due to excessive rate
Jan 31 03:20:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:09Z|00459|if_status|INFO|Not setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 down as sb is readonly
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.725 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.734 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance destroyed successfully.#033[00m
Jan 31 03:20:09 np0005603622 nova_compute[221301]: 2026-01-31 08:20:09.734 221324 DEBUG nova.objects.instance [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:09.983 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:09.985 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:20:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:09.988 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:20:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:09.993 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e2048cbd-39a5-4074-8fac-36166940c875]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:09.995 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.003 221324 DEBUG nova.virt.libvirt.vif [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.003 221324 DEBUG nova.network.os_vif_util [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.005 221324 DEBUG nova.network.os_vif_util [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.005 221324 DEBUG os_vif [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.009 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.010 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc59ad05-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.015 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.020 221324 INFO os_vif [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.031 221324 DEBUG nova.virt.libvirt.driver [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start _get_guest_xml network_info=[{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.039 221324 WARNING nova.virt.libvirt.driver [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.045 221324 DEBUG nova.virt.libvirt.host [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.047 221324 DEBUG nova.virt.libvirt.host [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.051 221324 DEBUG nova.virt.libvirt.host [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.052 221324 DEBUG nova.virt.libvirt.host [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.054 221324 DEBUG nova.virt.libvirt.driver [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.055 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.056 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.056 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.056 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.057 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.057 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.057 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.058 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.058 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.059 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.059 221324 DEBUG nova.virt.hardware [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.059 221324 DEBUG nova.objects.instance [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:10 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[259746]: [NOTICE]   (259757) : haproxy version is 2.8.14-c23fe91
Jan 31 03:20:10 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[259746]: [NOTICE]   (259757) : path to executable is /usr/sbin/haproxy
Jan 31 03:20:10 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[259746]: [WARNING]  (259757) : Exiting Master process...
Jan 31 03:20:10 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[259746]: [ALERT]    (259757) : Current worker (259759) exited with code 143 (Terminated)
Jan 31 03:20:10 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[259746]: [WARNING]  (259757) : All workers exited. Exiting... (0)
Jan 31 03:20:10 np0005603622 systemd[1]: libpod-90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff.scope: Deactivated successfully.
Jan 31 03:20:10 np0005603622 podman[260236]: 2026-01-31 08:20:10.186410003 +0000 UTC m=+0.055628803 container died 90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:20:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff-userdata-shm.mount: Deactivated successfully.
Jan 31 03:20:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay-aeb948f695618a203697648edc29489960eca72615268c541c624e01a90aaf3a-merged.mount: Deactivated successfully.
Jan 31 03:20:10 np0005603622 podman[260236]: 2026-01-31 08:20:10.228252004 +0000 UTC m=+0.097470774 container cleanup 90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.241 221324 DEBUG oslo_concurrency.processutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:10 np0005603622 systemd[1]: libpod-conmon-90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff.scope: Deactivated successfully.
Jan 31 03:20:10 np0005603622 podman[260265]: 2026-01-31 08:20:10.29217138 +0000 UTC m=+0.045170851 container remove 90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.298 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef82974-6bc5-4bff-b35e-892ac03435a1]: (4, ('Sat Jan 31 08:20:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff)\n90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff\nSat Jan 31 08:20:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff)\n90a9feebbcc4193a6c74820914be5d27dcc83aecac9c05658798f5cef5ef52ff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.301 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5cacbfb3-8f1e-4a12-b464-254ddc40b4e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.303 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.305 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:10 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.312 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.315 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.319 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c14d7c-001d-4a88-867b-f45dbc3064cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.342 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f30dd4-51b7-4dc7-b8c8-86f8e78616e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.343 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f420537d-874f-44db-bb16-551dee5e4a27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.361 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d86be6da-bcdb-4dbc-b190-6ea150e332bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681749, 'reachable_time': 24687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260282, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.365 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:20:10 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:20:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:10.366 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[88f5b6e0-c342-4ffa-a39e-8a9e31759777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:10.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2481780665' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.685 221324 DEBUG oslo_concurrency.processutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.747 221324 DEBUG oslo_concurrency.processutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:10.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.868 221324 DEBUG nova.compute.manager [req-b937922d-28e1-4a82-8bdf-a0eeaec20d6a req-5f45098d-72cf-4553-bd3e-d033d85f5085 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.869 221324 DEBUG oslo_concurrency.lockutils [req-b937922d-28e1-4a82-8bdf-a0eeaec20d6a req-5f45098d-72cf-4553-bd3e-d033d85f5085 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.869 221324 DEBUG oslo_concurrency.lockutils [req-b937922d-28e1-4a82-8bdf-a0eeaec20d6a req-5f45098d-72cf-4553-bd3e-d033d85f5085 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.870 221324 DEBUG oslo_concurrency.lockutils [req-b937922d-28e1-4a82-8bdf-a0eeaec20d6a req-5f45098d-72cf-4553-bd3e-d033d85f5085 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.870 221324 DEBUG nova.compute.manager [req-b937922d-28e1-4a82-8bdf-a0eeaec20d6a req-5f45098d-72cf-4553-bd3e-d033d85f5085 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:10 np0005603622 nova_compute[221301]: 2026-01-31 08:20:10.871 221324 WARNING nova.compute.manager [req-b937922d-28e1-4a82-8bdf-a0eeaec20d6a req-5f45098d-72cf-4553-bd3e-d033d85f5085 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 31 03:20:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2719575440' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.266 221324 DEBUG oslo_concurrency.processutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.268 221324 DEBUG nova.virt.libvirt.vif [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.269 221324 DEBUG nova.network.os_vif_util [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.270 221324 DEBUG nova.network.os_vif_util [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.272 221324 DEBUG nova.objects.instance [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.453 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.634 221324 DEBUG nova.virt.libvirt.driver [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <uuid>d0c13002-57d9-4fad-8579-7343af29719d</uuid>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <name>instance-00000068</name>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestJSON-server-922415262</nova:name>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:20:10</nova:creationTime>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <nova:port uuid="cc59ad05-3242-4d5f-8eec-a2480d285193">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <entry name="serial">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <entry name="uuid">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk.config">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:b9:24:4f"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <target dev="tapcc59ad05-32"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/console.log" append="off"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:20:11 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:20:11 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:20:11 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:20:11 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.636 221324 DEBUG nova.virt.libvirt.driver [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.636 221324 DEBUG nova.virt.libvirt.driver [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.637 221324 DEBUG nova.virt.libvirt.vif [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.638 221324 DEBUG nova.network.os_vif_util [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.639 221324 DEBUG nova.network.os_vif_util [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.639 221324 DEBUG os_vif [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.640 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.641 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.641 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.646 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.646 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc59ad05-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.647 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc59ad05-32, col_values=(('external_ids', {'iface-id': 'cc59ad05-3242-4d5f-8eec-a2480d285193', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:24:4f', 'vm-uuid': 'd0c13002-57d9-4fad-8579-7343af29719d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.649 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:11 np0005603622 NetworkManager[49080]: <info>  [1769847611.6506] manager: (tapcc59ad05-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.652 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.657 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.659 221324 INFO os_vif [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')
Jan 31 03:20:11 np0005603622 podman[260349]: 2026-01-31 08:20:11.783208171 +0000 UTC m=+0.081554193 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible)
Jan 31 03:20:11 np0005603622 podman[260348]: 2026-01-31 08:20:11.832064511 +0000 UTC m=+0.130868506 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:20:11 np0005603622 kernel: tapcc59ad05-32: entered promiscuous mode
Jan 31 03:20:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:11Z|00460|binding|INFO|Claiming lport cc59ad05-3242-4d5f-8eec-a2480d285193 for this chassis.
Jan 31 03:20:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:11Z|00461|binding|INFO|cc59ad05-3242-4d5f-8eec-a2480d285193: Claiming fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.845 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:11 np0005603622 NetworkManager[49080]: <info>  [1769847611.8476] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Jan 31 03:20:11 np0005603622 systemd-udevd[260206]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:20:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:11Z|00462|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 ovn-installed in OVS
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.855 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:11 np0005603622 nova_compute[221301]: 2026-01-31 08:20:11.857 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:11 np0005603622 NetworkManager[49080]: <info>  [1769847611.8621] device (tapcc59ad05-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:20:11 np0005603622 NetworkManager[49080]: <info>  [1769847611.8628] device (tapcc59ad05-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:20:11 np0005603622 systemd-machined[190406]: New machine qemu-45-instance-00000068.
Jan 31 03:20:11 np0005603622 systemd[1]: Started Virtual Machine qemu-45-instance-00000068.
Jan 31 03:20:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:12Z|00463|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 up in Southbound
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.291 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.294 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.296 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.309 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[87c650a5-2ad1-4512-88b2-2beae0db457b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.310 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.313 221324 DEBUG nova.network.neutron [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Successfully updated port: 84df7d74-b3d1-4c52-9231-e0563e927d0f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.314 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.314 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4b799784-3a8b-4488-b9d3-83d8e50fb7f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.316 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[25b672ee-90ca-41d4-9bf4-d9484a1932cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.328 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9e23a9ce-0961-43e8-a45d-7308f49761a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.344 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e60842-bf96-4c30-b5f9-6948959db45a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.374 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b79aa375-19eb-48fc-9b46-c5fbcd028259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.384 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[20129869-42f3-4bf8-bf67-777fdde27e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 NetworkManager[49080]: <info>  [1769847612.3868] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/214)
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.419 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[255dd0bc-63be-4deb-83fd-220b209d4618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.423 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[139a8979-284a-4c0c-94e7-b892d15f1d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 NetworkManager[49080]: <info>  [1769847612.4487] device (tap1186b71b-00): carrier: link connected
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.454 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[87f683ec-4908-4454-8f1b-582316db8ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:12.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.472 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aded8b12-678c-4fa5-8842-0b3d8e0448f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686378, 'reachable_time': 28781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260473, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.494 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff456ad-23f5-4f99-85bb-d42fbe66a9d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 686378, 'tstamp': 686378}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260476, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.514 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "refresh_cache-095f5561-959f-440e-8e6e-2521170b775d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.515 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquired lock "refresh_cache-095f5561-959f-440e-8e6e-2521170b775d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.515 221324 DEBUG nova.network.neutron [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.524 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a6711e53-d2bf-4e7a-8a64-df54042df44c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686378, 'reachable_time': 28781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 260480, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.563 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fffa3050-629e-4ce5-9715-bc7395333995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.618 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b0765203-a430-4bd5-8b63-aea75475380f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.620 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.620 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.621 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:12 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.624 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:12 np0005603622 NetworkManager[49080]: <info>  [1769847612.6262] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.629 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.630 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:12 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:12Z|00464|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.632 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.633 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.635 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[529131a2-572d-4342-930a-742ee440441d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.636 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.636 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for d0c13002-57d9-4fad-8579-7343af29719d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:20:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:12.637 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.637 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847612.63585, d0c13002-57d9-4fad-8579-7343af29719d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.637 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.642 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.644 221324 DEBUG nova.compute.manager [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.648 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance rebooted successfully.#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.649 221324 DEBUG nova.compute.manager [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.709 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.714 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:12.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.903 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.904 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847612.6415248, d0c13002-57d9-4fad-8579-7343af29719d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.905 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Started (Lifecycle Event)#033[00m
Jan 31 03:20:12 np0005603622 nova_compute[221301]: 2026-01-31 08:20:12.974 221324 DEBUG nova.network.neutron [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.034 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.039 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:13 np0005603622 podman[260513]: 2026-01-31 08:20:13.062675208 +0000 UTC m=+0.062629573 container create 8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.084 221324 DEBUG oslo_concurrency.lockutils [None req-c02c30ed-1fb3-4d7b-b73d-bc318167cd21 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 13.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:13 np0005603622 systemd[1]: Started libpod-conmon-8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4.scope.
Jan 31 03:20:13 np0005603622 podman[260513]: 2026-01-31 08:20:13.03460644 +0000 UTC m=+0.034560865 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:20:13 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:20:13 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8faf65b777dc8f81760b30d637343933eb2ebc28bea71470f58b19eba434f369/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:20:13 np0005603622 podman[260513]: 2026-01-31 08:20:13.191567729 +0000 UTC m=+0.191522164 container init 8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:20:13 np0005603622 podman[260513]: 2026-01-31 08:20:13.199687899 +0000 UTC m=+0.199642284 container start 8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:20:13 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[260528]: [NOTICE]   (260532) : New worker (260534) forked
Jan 31 03:20:13 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[260528]: [NOTICE]   (260532) : Loading success.
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.305 221324 DEBUG nova.compute.manager [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.306 221324 DEBUG oslo_concurrency.lockutils [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.306 221324 DEBUG oslo_concurrency.lockutils [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.306 221324 DEBUG oslo_concurrency.lockutils [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.307 221324 DEBUG nova.compute.manager [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.307 221324 WARNING nova.compute.manager [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.308 221324 DEBUG nova.compute.manager [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received event network-changed-84df7d74-b3d1-4c52-9231-e0563e927d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.308 221324 DEBUG nova.compute.manager [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Refreshing instance network info cache due to event network-changed-84df7d74-b3d1-4c52-9231-e0563e927d0f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:20:13 np0005603622 nova_compute[221301]: 2026-01-31 08:20:13.308 221324 DEBUG oslo_concurrency.lockutils [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-095f5561-959f-440e-8e6e-2521170b775d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:20:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:14.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:20:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:14.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:15 np0005603622 nova_compute[221301]: 2026-01-31 08:20:15.718 221324 DEBUG nova.network.neutron [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Updating instance_info_cache with network_info: [{"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:15 np0005603622 nova_compute[221301]: 2026-01-31 08:20:15.781 221324 DEBUG nova.compute.manager [req-819447fa-81d3-4731-8718-06ebfad6bb80 req-f4eb66f9-56e0-4684-bd5e-f3273303d35b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:15 np0005603622 nova_compute[221301]: 2026-01-31 08:20:15.782 221324 DEBUG oslo_concurrency.lockutils [req-819447fa-81d3-4731-8718-06ebfad6bb80 req-f4eb66f9-56e0-4684-bd5e-f3273303d35b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:15 np0005603622 nova_compute[221301]: 2026-01-31 08:20:15.782 221324 DEBUG oslo_concurrency.lockutils [req-819447fa-81d3-4731-8718-06ebfad6bb80 req-f4eb66f9-56e0-4684-bd5e-f3273303d35b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:15 np0005603622 nova_compute[221301]: 2026-01-31 08:20:15.783 221324 DEBUG oslo_concurrency.lockutils [req-819447fa-81d3-4731-8718-06ebfad6bb80 req-f4eb66f9-56e0-4684-bd5e-f3273303d35b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:15 np0005603622 nova_compute[221301]: 2026-01-31 08:20:15.783 221324 DEBUG nova.compute.manager [req-819447fa-81d3-4731-8718-06ebfad6bb80 req-f4eb66f9-56e0-4684-bd5e-f3273303d35b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:15 np0005603622 nova_compute[221301]: 2026-01-31 08:20:15.783 221324 WARNING nova.compute.manager [req-819447fa-81d3-4731-8718-06ebfad6bb80 req-f4eb66f9-56e0-4684-bd5e-f3273303d35b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.037 221324 INFO nova.compute.manager [None req-c524cc48-ac31-4eeb-8804-10cb424bb5a8 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Get console output#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.081 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Releasing lock "refresh_cache-095f5561-959f-440e-8e6e-2521170b775d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.082 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Instance network_info: |[{"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.083 221324 DEBUG oslo_concurrency.lockutils [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-095f5561-959f-440e-8e6e-2521170b775d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.084 221324 DEBUG nova.network.neutron [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Refreshing network info cache for port 84df7d74-b3d1-4c52-9231-e0563e927d0f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.090 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Start _get_guest_xml network_info=[{"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '0864ca59-9877-4e6d-adfc-f0a3204ed8f8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.096 221324 WARNING nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.102 221324 DEBUG nova.virt.libvirt.host [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.103 221324 DEBUG nova.virt.libvirt.host [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.109 221324 DEBUG nova.virt.libvirt.host [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.110 221324 DEBUG nova.virt.libvirt.host [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.112 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.113 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:45Z,direct_url=<?>,disk_format='qcow2',id=0864ca59-9877-4e6d-adfc-f0a3204ed8f8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.114 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.114 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.115 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.116 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.116 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.117 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.117 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.118 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.118 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.119 221324 DEBUG nova.virt.hardware [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.124 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:16.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3996995373' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.628 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.659 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.665 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:16 np0005603622 nova_compute[221301]: 2026-01-31 08:20:16.693 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:16.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3135979524' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.584 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.919s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.587 221324 DEBUG nova.virt.libvirt.vif [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:19:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1301696160',display_name='tempest-ListServerFiltersTestJSON-instance-1301696160',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1301696160',id=107,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-308e6c6h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:20:04Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=095f5561-959f-440e-8e6e-2521170b775d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.588 221324 DEBUG nova.network.os_vif_util [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.589 221324 DEBUG nova.network.os_vif_util [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:e8:24,bridge_name='br-int',has_traffic_filtering=True,id=84df7d74-b3d1-4c52-9231-e0563e927d0f,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84df7d74-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.591 221324 DEBUG nova.objects.instance [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'pci_devices' on Instance uuid 095f5561-959f-440e-8e6e-2521170b775d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.633 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <uuid>095f5561-959f-440e-8e6e-2521170b775d</uuid>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <name>instance-0000006b</name>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1301696160</nova:name>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:20:16</nova:creationTime>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:user uuid="a80ca71875e8413caa2b52e679e1dd40">tempest-ListServerFiltersTestJSON-166541249-project-member</nova:user>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:project uuid="782758ebebe64580accb21a22280e02f">tempest-ListServerFiltersTestJSON-166541249</nova:project>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="0864ca59-9877-4e6d-adfc-f0a3204ed8f8"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <nova:port uuid="84df7d74-b3d1-4c52-9231-e0563e927d0f">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <entry name="serial">095f5561-959f-440e-8e6e-2521170b775d</entry>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <entry name="uuid">095f5561-959f-440e-8e6e-2521170b775d</entry>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/095f5561-959f-440e-8e6e-2521170b775d_disk">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/095f5561-959f-440e-8e6e-2521170b775d_disk.config">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:49:e8:24"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <target dev="tap84df7d74-b3"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/console.log" append="off"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:20:17 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:20:17 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:20:17 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:20:17 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.640 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Preparing to wait for external event network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.641 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.641 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.642 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.643 221324 DEBUG nova.virt.libvirt.vif [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:19:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1301696160',display_name='tempest-ListServerFiltersTestJSON-instance-1301696160',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1301696160',id=107,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-308e6c6h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:20:04Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=095f5561-959f-440e-8e6e-2521170b775d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.643 221324 DEBUG nova.network.os_vif_util [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.644 221324 DEBUG nova.network.os_vif_util [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:e8:24,bridge_name='br-int',has_traffic_filtering=True,id=84df7d74-b3d1-4c52-9231-e0563e927d0f,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84df7d74-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.644 221324 DEBUG os_vif [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:e8:24,bridge_name='br-int',has_traffic_filtering=True,id=84df7d74-b3d1-4c52-9231-e0563e927d0f,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84df7d74-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.645 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.646 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.646 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.653 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.654 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84df7d74-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.654 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap84df7d74-b3, col_values=(('external_ids', {'iface-id': '84df7d74-b3d1-4c52-9231-e0563e927d0f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:e8:24', 'vm-uuid': '095f5561-959f-440e-8e6e-2521170b775d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.656 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603622 NetworkManager[49080]: <info>  [1769847617.6577] manager: (tap84df7d74-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.661 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:17 np0005603622 nova_compute[221301]: 2026-01-31 08:20:17.665 221324 INFO os_vif [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:e8:24,bridge_name='br-int',has_traffic_filtering=True,id=84df7d74-b3d1-4c52-9231-e0563e927d0f,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84df7d74-b3')#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.116 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.117 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.117 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] No VIF found with MAC fa:16:3e:49:e8:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.118 221324 INFO nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Using config drive#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.156 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:18.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.721 221324 DEBUG nova.compute.manager [req-c354c8d5-fe6e-4955-ad64-2304bef8f00c req-602bbc75-d3e0-4a76-9faa-51b479516d70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.723 221324 DEBUG oslo_concurrency.lockutils [req-c354c8d5-fe6e-4955-ad64-2304bef8f00c req-602bbc75-d3e0-4a76-9faa-51b479516d70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.723 221324 DEBUG oslo_concurrency.lockutils [req-c354c8d5-fe6e-4955-ad64-2304bef8f00c req-602bbc75-d3e0-4a76-9faa-51b479516d70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.724 221324 DEBUG oslo_concurrency.lockutils [req-c354c8d5-fe6e-4955-ad64-2304bef8f00c req-602bbc75-d3e0-4a76-9faa-51b479516d70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.725 221324 DEBUG nova.compute.manager [req-c354c8d5-fe6e-4955-ad64-2304bef8f00c req-602bbc75-d3e0-4a76-9faa-51b479516d70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.725 221324 WARNING nova.compute.manager [req-c354c8d5-fe6e-4955-ad64-2304bef8f00c req-602bbc75-d3e0-4a76-9faa-51b479516d70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.
Jan 31 03:20:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:18.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:18 np0005603622 nova_compute[221301]: 2026-01-31 08:20:18.997 221324 INFO nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Creating config drive at /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/disk.config
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.004 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6znzmfoy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.138 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6znzmfoy" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.241 221324 DEBUG nova.storage.rbd_utils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] rbd image 095f5561-959f-440e-8e6e-2521170b775d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.247 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/disk.config 095f5561-959f-440e-8e6e-2521170b775d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.507 221324 DEBUG oslo_concurrency.processutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/disk.config 095f5561-959f-440e-8e6e-2521170b775d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.509 221324 INFO nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Deleting local config drive /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d/disk.config because it was imported into RBD.
Jan 31 03:20:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:19 np0005603622 kernel: tap84df7d74-b3: entered promiscuous mode
Jan 31 03:20:19 np0005603622 NetworkManager[49080]: <info>  [1769847619.5730] manager: (tap84df7d74-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Jan 31 03:20:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:19Z|00465|binding|INFO|Claiming lport 84df7d74-b3d1-4c52-9231-e0563e927d0f for this chassis.
Jan 31 03:20:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:19Z|00466|binding|INFO|84df7d74-b3d1-4c52-9231-e0563e927d0f: Claiming fa:16:3e:49:e8:24 10.100.0.9
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.576 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:19 np0005603622 systemd-udevd[260727]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:20:19 np0005603622 NetworkManager[49080]: <info>  [1769847619.6101] device (tap84df7d74-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:20:19 np0005603622 NetworkManager[49080]: <info>  [1769847619.6113] device (tap84df7d74-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:20:19 np0005603622 systemd-machined[190406]: New machine qemu-46-instance-0000006b.
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:19Z|00467|binding|INFO|Setting lport 84df7d74-b3d1-4c52-9231-e0563e927d0f ovn-installed in OVS
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.627 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:19 np0005603622 systemd[1]: Started Virtual Machine qemu-46-instance-0000006b.
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.832 221324 DEBUG nova.network.neutron [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Updated VIF entry in instance network info cache for port 84df7d74-b3d1-4c52-9231-e0563e927d0f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:20:19 np0005603622 nova_compute[221301]: 2026-01-31 08:20:19.832 221324 DEBUG nova.network.neutron [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Updating instance_info_cache with network_info: [{"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:19Z|00468|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.867 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:e8:24 10.100.0.9'], port_security=['fa:16:3e:49:e8:24 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '095f5561-959f-440e-8e6e-2521170b775d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=84df7d74-b3d1-4c52-9231-e0563e927d0f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.870 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 84df7d74-b3d1-4c52-9231-e0563e927d0f in datapath 81dd779a-d164-4109-911b-0834e390c815 bound to our chassis
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.872 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81dd779a-d164-4109-911b-0834e390c815
Jan 31 03:20:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:19Z|00469|binding|INFO|Setting lport 84df7d74-b3d1-4c52-9231-e0563e927d0f up in Southbound
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.884 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fa334659-ecd5-4d9d-8de9-f4dcfe62df14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.885 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81dd779a-d1 in ovnmeta-81dd779a-d164-4109-911b-0834e390c815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.887 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81dd779a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.887 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7e405bef-1ab8-4ddf-a9cc-c5331f7d64d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.889 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eeee29d9-e86f-4809-8211-35ec56e5ddc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.901 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5f7a3e-5a37-4edf-99a8-5706d152ddcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.925 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7177bdc6-d9e5-4eec-9567-e780e592a5ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.960 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0d2c9e-7032-437c-b187-67ca8eee79fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:19 np0005603622 NetworkManager[49080]: <info>  [1769847619.9681] manager: (tap81dd779a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/218)
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.969 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[48d1a9f0-9262-44e7-8971-2a90ac4ed6da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.995 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e4349099-686f-496a-945c-0a6c03f8a3e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:19.998 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6967d1b0-e258-42fe-a5cb-70c773f23960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 NetworkManager[49080]: <info>  [1769847620.0196] device (tap81dd779a-d0): carrier: link connected
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.028 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0fcdcd-fb16-4be5-a851-3f461c7832b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.043 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[47b61823-0e0b-43f9-9f94-69ae6047e448]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687135, 'reachable_time': 33181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260763, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.061 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4699e0-2e40-46bb-a54c-5cdf40932da2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:7ac2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687135, 'tstamp': 687135}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260764, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.079 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f406d819-6ffa-4cda-946e-64e884fbd736]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81dd779a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687135, 'reachable_time': 33181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 260765, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.111 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b12edafd-b6d3-4904-bdc8-dc1d4423d84d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.178 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[28323361-82f8-4dc7-a7b8-0ec8552e3e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.182 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.182 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.183 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81dd779a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:20 np0005603622 nova_compute[221301]: 2026-01-31 08:20:20.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:20 np0005603622 kernel: tap81dd779a-d0: entered promiscuous mode
Jan 31 03:20:20 np0005603622 NetworkManager[49080]: <info>  [1769847620.1888] manager: (tap81dd779a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.197 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81dd779a-d0, col_values=(('external_ids', {'iface-id': 'cd015007-f775-4d63-920f-2a0c657e4d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:20Z|00470|binding|INFO|Releasing lport cd015007-f775-4d63-920f-2a0c657e4d70 from this chassis (sb_readonly=0)
Jan 31 03:20:20 np0005603622 nova_compute[221301]: 2026-01-31 08:20:20.199 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.200 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81dd779a-d164-4109-911b-0834e390c815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81dd779a-d164-4109-911b-0834e390c815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.203 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe0dd70-c089-484c-87e1-e6e50c43b653]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.204 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-81dd779a-d164-4109-911b-0834e390c815
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/81dd779a-d164-4109-911b-0834e390c815.pid.haproxy
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 81dd779a-d164-4109-911b-0834e390c815
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:20:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:20.205 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'env', 'PROCESS_TAG=haproxy-81dd779a-d164-4109-911b-0834e390c815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81dd779a-d164-4109-911b-0834e390c815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:20:20 np0005603622 nova_compute[221301]: 2026-01-31 08:20:20.206 221324 DEBUG oslo_concurrency.lockutils [req-85bc214e-82b1-447c-9d8a-0e637df18993 req-9ec9fd58-b0fd-498c-84ce-59b42644a448 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-095f5561-959f-440e-8e6e-2521170b775d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:20 np0005603622 nova_compute[221301]: 2026-01-31 08:20:20.209 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:20.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:20 np0005603622 nova_compute[221301]: 2026-01-31 08:20:20.554 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847620.5537667, 095f5561-959f-440e-8e6e-2521170b775d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:20 np0005603622 nova_compute[221301]: 2026-01-31 08:20:20.556 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] VM Started (Lifecycle Event)#033[00m
Jan 31 03:20:20 np0005603622 podman[260839]: 2026-01-31 08:20:20.612204862 +0000 UTC m=+0.071470491 container create 680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:20:20 np0005603622 systemd[1]: Started libpod-conmon-680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21.scope.
Jan 31 03:20:20 np0005603622 podman[260839]: 2026-01-31 08:20:20.580728882 +0000 UTC m=+0.039994511 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:20:20 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:20:20 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/072d92298eecb34a9d9ade5e58836bb4b2ee475db47c5cb799224a33abf80e9d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:20:20 np0005603622 podman[260839]: 2026-01-31 08:20:20.72394725 +0000 UTC m=+0.183212870 container init 680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 03:20:20 np0005603622 podman[260839]: 2026-01-31 08:20:20.7331832 +0000 UTC m=+0.192448809 container start 680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:20:20 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [NOTICE]   (260858) : New worker (260860) forked
Jan 31 03:20:20 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [NOTICE]   (260858) : Loading success.
Jan 31 03:20:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:20.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.090 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.097 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847620.5540233, 095f5561-959f-440e-8e6e-2521170b775d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.098 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.518 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.524 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:21 np0005603622 nova_compute[221301]: 2026-01-31 08:20:21.894 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:20:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:22.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:22 np0005603622 nova_compute[221301]: 2026-01-31 08:20:22.657 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:22.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.485 221324 DEBUG nova.compute.manager [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received event network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.486 221324 DEBUG oslo_concurrency.lockutils [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.486 221324 DEBUG oslo_concurrency.lockutils [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.486 221324 DEBUG oslo_concurrency.lockutils [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.487 221324 DEBUG nova.compute.manager [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Processing event network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.487 221324 DEBUG nova.compute.manager [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received event network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.487 221324 DEBUG oslo_concurrency.lockutils [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.487 221324 DEBUG oslo_concurrency.lockutils [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.487 221324 DEBUG oslo_concurrency.lockutils [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.488 221324 DEBUG nova.compute.manager [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] No waiting events found dispatching network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.488 221324 WARNING nova.compute.manager [req-ce3d2346-f15f-4aee-ac63-80ba4402977a req-3c1acea4-f726-480a-8e8a-494d1e20f559 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received unexpected event network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.488 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.493 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847623.4930432, 095f5561-959f-440e-8e6e-2521170b775d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.493 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.495 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.499 221324 INFO nova.virt.libvirt.driver [-] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Instance spawned successfully.#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.499 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.597 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.602 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.632 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.633 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.634 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.634 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.635 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.635 221324 DEBUG nova.virt.libvirt.driver [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:20:23 np0005603622 nova_compute[221301]: 2026-01-31 08:20:23.738 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:20:24 np0005603622 nova_compute[221301]: 2026-01-31 08:20:24.248 221324 INFO nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Took 18.82 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:20:24 np0005603622 nova_compute[221301]: 2026-01-31 08:20:24.249 221324 DEBUG nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:24.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:24 np0005603622 nova_compute[221301]: 2026-01-31 08:20:24.613 221324 INFO nova.compute.manager [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Took 23.74 seconds to build instance.#033[00m
Jan 31 03:20:24 np0005603622 nova_compute[221301]: 2026-01-31 08:20:24.790 221324 DEBUG oslo_concurrency.lockutils [None req-352b59a0-a882-4e7f-a96a-678828d93e46 a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:24.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:25 np0005603622 nova_compute[221301]: 2026-01-31 08:20:25.067 221324 DEBUG oslo_concurrency.lockutils [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:25 np0005603622 nova_compute[221301]: 2026-01-31 08:20:25.068 221324 DEBUG oslo_concurrency.lockutils [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:25 np0005603622 nova_compute[221301]: 2026-01-31 08:20:25.068 221324 DEBUG nova.compute.manager [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:25 np0005603622 nova_compute[221301]: 2026-01-31 08:20:25.072 221324 DEBUG nova.compute.manager [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:20:25 np0005603622 nova_compute[221301]: 2026-01-31 08:20:25.073 221324 DEBUG nova.objects.instance [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'flavor' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:25 np0005603622 nova_compute[221301]: 2026-01-31 08:20:25.247 221324 DEBUG nova.virt.libvirt.driver [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:20:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:26.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:26 np0005603622 nova_compute[221301]: 2026-01-31 08:20:26.503 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:26.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:27 np0005603622 nova_compute[221301]: 2026-01-31 08:20:27.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:27 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:27Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:20:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:28.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:28.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:30.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:30.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:30.874 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:30.875 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:30.877 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:31 np0005603622 nova_compute[221301]: 2026-01-31 08:20:31.543 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:32 np0005603622 nova_compute[221301]: 2026-01-31 08:20:32.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:32.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:34.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:34 np0005603622 nova_compute[221301]: 2026-01-31 08:20:34.519 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:34.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:35 np0005603622 nova_compute[221301]: 2026-01-31 08:20:35.302 221324 DEBUG nova.virt.libvirt.driver [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 31 03:20:36 np0005603622 nova_compute[221301]: 2026-01-31 08:20:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:36 np0005603622 nova_compute[221301]: 2026-01-31 08:20:36.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:20:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:36.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:36 np0005603622 nova_compute[221301]: 2026-01-31 08:20:36.545 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:36.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:37Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:e8:24 10.100.0.9
Jan 31 03:20:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:37Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:e8:24 10.100.0.9
Jan 31 03:20:37 np0005603622 nova_compute[221301]: 2026-01-31 08:20:37.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:37 np0005603622 nova_compute[221301]: 2026-01-31 08:20:37.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:37 np0005603622 kernel: tapcc59ad05-32 (unregistering): left promiscuous mode
Jan 31 03:20:38 np0005603622 NetworkManager[49080]: <info>  [1769847638.0011] device (tapcc59ad05-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:20:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:38Z|00471|binding|INFO|Releasing lport cc59ad05-3242-4d5f-8eec-a2480d285193 from this chassis (sb_readonly=0)
Jan 31 03:20:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:38Z|00472|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 down in Southbound
Jan 31 03:20:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:38Z|00473|binding|INFO|Removing iface tapcc59ad05-32 ovn-installed in OVS
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.006 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.020 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:38 np0005603622 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 31 03:20:38 np0005603622 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000068.scope: Consumed 14.205s CPU time.
Jan 31 03:20:38 np0005603622 systemd-machined[190406]: Machine qemu-45-instance-00000068 terminated.
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.124 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.126 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.127 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.129 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a25b814e-f1f8-4bdd-a17f-dba253b2c0ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.129 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[260528]: [NOTICE]   (260532) : haproxy version is 2.8.14-c23fe91
Jan 31 03:20:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[260528]: [NOTICE]   (260532) : path to executable is /usr/sbin/haproxy
Jan 31 03:20:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[260528]: [WARNING]  (260532) : Exiting Master process...
Jan 31 03:20:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[260528]: [ALERT]    (260532) : Current worker (260534) exited with code 143 (Terminated)
Jan 31 03:20:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[260528]: [WARNING]  (260532) : All workers exited. Exiting... (0)
Jan 31 03:20:38 np0005603622 systemd[1]: libpod-8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4.scope: Deactivated successfully.
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.321 221324 INFO nova.virt.libvirt.driver [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance shutdown successfully after 13 seconds.#033[00m
Jan 31 03:20:38 np0005603622 podman[260894]: 2026-01-31 08:20:38.327168406 +0000 UTC m=+0.084632907 container died 8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.331 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance destroyed successfully.#033[00m
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.332 221324 DEBUG nova.objects.instance [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'numa_topology' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:38 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4-userdata-shm.mount: Deactivated successfully.
Jan 31 03:20:38 np0005603622 systemd[1]: var-lib-containers-storage-overlay-8faf65b777dc8f81760b30d637343933eb2ebc28bea71470f58b19eba434f369-merged.mount: Deactivated successfully.
Jan 31 03:20:38 np0005603622 podman[260894]: 2026-01-31 08:20:38.447812414 +0000 UTC m=+0.205276875 container cleanup 8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:20:38 np0005603622 systemd[1]: libpod-conmon-8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4.scope: Deactivated successfully.
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.481 221324 DEBUG nova.compute.manager [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:38.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:38 np0005603622 podman[260931]: 2026-01-31 08:20:38.512126411 +0000 UTC m=+0.049730734 container remove 8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.516 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c9cad8-b3d9-4849-ac1d-51d4b35876df]: (4, ('Sat Jan 31 08:20:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4)\n8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4\nSat Jan 31 08:20:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4)\n8edd0c4b408d3b99088223e77d3d3d6a9b2f0ca81b63692a940fbf549ccdd1f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.518 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bf549cdf-54a9-4e47-8018-996cb2f25f3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.519 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.556 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:38 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:20:38 np0005603622 nova_compute[221301]: 2026-01-31 08:20:38.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.570 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[52c324b6-03be-4c58-80d1-742f19999599]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.586 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[933de97c-d41a-4aab-a5a8-5739f8734292]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.588 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4eefc2-4c3e-444d-94ee-ab52a60e04f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.602 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca10ed4d-db0e-4c5b-b32f-a3f09d9fb7ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686371, 'reachable_time': 27057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260950, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.605 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:20:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:38.605 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[72cf6340-df92-4792-b9a4-704125f1f414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:38 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:20:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:38.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:39 np0005603622 nova_compute[221301]: 2026-01-31 08:20:39.121 221324 DEBUG oslo_concurrency.lockutils [None req-81e9649d-e769-41f2-a5e0-d5856cd64c23 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 14.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:39 np0005603622 nova_compute[221301]: 2026-01-31 08:20:39.840 221324 DEBUG nova.compute.manager [req-410eb928-585d-4042-bd7f-d2b42cf4bed0 req-b0d6d054-0732-4cc6-abd4-6a10197d9ef4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:39 np0005603622 nova_compute[221301]: 2026-01-31 08:20:39.841 221324 DEBUG oslo_concurrency.lockutils [req-410eb928-585d-4042-bd7f-d2b42cf4bed0 req-b0d6d054-0732-4cc6-abd4-6a10197d9ef4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:39 np0005603622 nova_compute[221301]: 2026-01-31 08:20:39.841 221324 DEBUG oslo_concurrency.lockutils [req-410eb928-585d-4042-bd7f-d2b42cf4bed0 req-b0d6d054-0732-4cc6-abd4-6a10197d9ef4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:39 np0005603622 nova_compute[221301]: 2026-01-31 08:20:39.841 221324 DEBUG oslo_concurrency.lockutils [req-410eb928-585d-4042-bd7f-d2b42cf4bed0 req-b0d6d054-0732-4cc6-abd4-6a10197d9ef4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:39 np0005603622 nova_compute[221301]: 2026-01-31 08:20:39.841 221324 DEBUG nova.compute.manager [req-410eb928-585d-4042-bd7f-d2b42cf4bed0 req-b0d6d054-0732-4cc6-abd4-6a10197d9ef4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:39 np0005603622 nova_compute[221301]: 2026-01-31 08:20:39.842 221324 WARNING nova.compute.manager [req-410eb928-585d-4042-bd7f-d2b42cf4bed0 req-b0d6d054-0732-4cc6-abd4-6a10197d9ef4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:20:40 np0005603622 nova_compute[221301]: 2026-01-31 08:20:40.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:40 np0005603622 nova_compute[221301]: 2026-01-31 08:20:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:40 np0005603622 nova_compute[221301]: 2026-01-31 08:20:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:40 np0005603622 nova_compute[221301]: 2026-01-31 08:20:40.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:20:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:40.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:40 np0005603622 nova_compute[221301]: 2026-01-31 08:20:40.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:40.514 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:40.516 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:20:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e263 e263: 3 total, 3 up, 3 in
Jan 31 03:20:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:40.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.504 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.505 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.505 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.549 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e264 e264: 3 total, 3 up, 3 in
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.693 221324 DEBUG nova.objects.instance [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'flavor' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.699 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.700 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.700 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.701 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:41 np0005603622 nova_compute[221301]: 2026-01-31 08:20:41.805 221324 DEBUG oslo_concurrency.lockutils [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:20:42 np0005603622 podman[260952]: 2026-01-31 08:20:42.19936242 +0000 UTC m=+0.102429097 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:20:42 np0005603622 nova_compute[221301]: 2026-01-31 08:20:42.236 221324 DEBUG nova.compute.manager [req-ff708b52-6a20-4c51-93cd-d9ea9c70949b req-ac4c4991-5428-4715-9424-6080708992cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:42 np0005603622 nova_compute[221301]: 2026-01-31 08:20:42.237 221324 DEBUG oslo_concurrency.lockutils [req-ff708b52-6a20-4c51-93cd-d9ea9c70949b req-ac4c4991-5428-4715-9424-6080708992cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:42 np0005603622 nova_compute[221301]: 2026-01-31 08:20:42.237 221324 DEBUG oslo_concurrency.lockutils [req-ff708b52-6a20-4c51-93cd-d9ea9c70949b req-ac4c4991-5428-4715-9424-6080708992cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:42 np0005603622 nova_compute[221301]: 2026-01-31 08:20:42.237 221324 DEBUG oslo_concurrency.lockutils [req-ff708b52-6a20-4c51-93cd-d9ea9c70949b req-ac4c4991-5428-4715-9424-6080708992cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:42 np0005603622 nova_compute[221301]: 2026-01-31 08:20:42.237 221324 DEBUG nova.compute.manager [req-ff708b52-6a20-4c51-93cd-d9ea9c70949b req-ac4c4991-5428-4715-9424-6080708992cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:42 np0005603622 nova_compute[221301]: 2026-01-31 08:20:42.237 221324 WARNING nova.compute.manager [req-ff708b52-6a20-4c51-93cd-d9ea9c70949b req-ac4c4991-5428-4715-9424-6080708992cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 31 03:20:42 np0005603622 podman[260951]: 2026-01-31 08:20:42.244202572 +0000 UTC m=+0.149785877 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:20:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:42.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:42 np0005603622 nova_compute[221301]: 2026-01-31 08:20:42.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e265 e265: 3 total, 3 up, 3 in
Jan 31 03:20:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:42.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:44.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.596602) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644596967, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 1687, "num_deletes": 256, "total_data_size": 3810794, "memory_usage": 3845064, "flush_reason": "Manual Compaction"}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644623597, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2491355, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 47150, "largest_seqno": 48832, "table_properties": {"data_size": 2484388, "index_size": 3974, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15083, "raw_average_key_size": 19, "raw_value_size": 2470159, "raw_average_value_size": 3271, "num_data_blocks": 175, "num_entries": 755, "num_filter_entries": 755, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847505, "oldest_key_time": 1769847505, "file_creation_time": 1769847644, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 27112 microseconds, and 16143 cpu microseconds.
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.623706) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2491355 bytes OK
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.623741) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.625349) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.625380) EVENT_LOG_v1 {"time_micros": 1769847644625368, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.625418) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 3803073, prev total WAL file size 3803073, number of live WAL files 2.
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.626683) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353036' seq:72057594037927935, type:22 .. '6C6F676D0031373538' seq:0, type:0; will stop at (end)
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2432KB)], [90(11MB)]
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644626745, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 14058277, "oldest_snapshot_seqno": -1}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7439 keys, 13922609 bytes, temperature: kUnknown
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644721240, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 13922609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13869574, "index_size": 33339, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18629, "raw_key_size": 191338, "raw_average_key_size": 25, "raw_value_size": 13733821, "raw_average_value_size": 1846, "num_data_blocks": 1328, "num_entries": 7439, "num_filter_entries": 7439, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847644, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.721664) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 13922609 bytes
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.723212) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 148.6 rd, 147.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 11.0 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(11.2) write-amplify(5.6) OK, records in: 7968, records dropped: 529 output_compression: NoCompression
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.723236) EVENT_LOG_v1 {"time_micros": 1769847644723225, "job": 56, "event": "compaction_finished", "compaction_time_micros": 94623, "compaction_time_cpu_micros": 56016, "output_level": 6, "num_output_files": 1, "total_output_size": 13922609, "num_input_records": 7968, "num_output_records": 7439, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644723619, "job": 56, "event": "table_file_deletion", "file_number": 92}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847644724800, "job": 56, "event": "table_file_deletion", "file_number": 90}
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.626598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.724861) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.725073) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.725091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.725093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:44.725096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:44.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:46.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:46 np0005603622 nova_compute[221301]: 2026-01-31 08:20:46.552 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:46.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:47.518 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:47 np0005603622 nova_compute[221301]: 2026-01-31 08:20:47.668 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.299 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.430 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.430 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.431 221324 DEBUG oslo_concurrency.lockutils [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.432 221324 DEBUG nova.network.neutron [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.433 221324 DEBUG nova.objects.instance [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'info_cache' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.435 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.436 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.436 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:48.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.705 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.706 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.707 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.707 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:20:48 np0005603622 nova_compute[221301]: 2026-01-31 08:20:48.707 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:20:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:48.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:20:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:20:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1032153723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.164 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.299 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.300 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.303 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000006b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.304 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000006b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:20:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 e266: 3 total, 3 up, 3 in
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.500 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.501 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4359MB free_disk=20.694782257080078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.502 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.502 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:20:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.732 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance d0c13002-57d9-4fad-8579-7343af29719d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.733 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 095f5561-959f-440e-8e6e-2521170b775d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.733 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.733 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 03:20:49 np0005603622 nova_compute[221301]: 2026-01-31 08:20:49.750 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 31 03:20:50 np0005603622 nova_compute[221301]: 2026-01-31 08:20:50.163 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:20:50 np0005603622 nova_compute[221301]: 2026-01-31 08:20:50.164 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:20:50 np0005603622 nova_compute[221301]: 2026-01-31 08:20:50.189 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 31 03:20:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:50Z|00474|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:20:50 np0005603622 nova_compute[221301]: 2026-01-31 08:20:50.235 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:20:50 np0005603622 nova_compute[221301]: 2026-01-31 08:20:50.321 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:20:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:50.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.554 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.602 221324 DEBUG nova.network.neutron [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.682 221324 DEBUG oslo_concurrency.lockutils [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.737 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance destroyed successfully.
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.738 221324 DEBUG nova.objects.instance [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'numa_topology' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.894 221324 DEBUG nova.objects.instance [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.929 221324 DEBUG nova.virt.libvirt.vif [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.930 221324 DEBUG nova.network.os_vif_util [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.930 221324 DEBUG nova.network.os_vif_util [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.931 221324 DEBUG os_vif [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.933 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.933 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc59ad05-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.935 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.940 221324 INFO os_vif [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.951 221324 DEBUG nova.virt.libvirt.driver [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start _get_guest_xml network_info=[{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.956 221324 WARNING nova.virt.libvirt.driver [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.962 221324 DEBUG nova.virt.libvirt.host [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.964 221324 DEBUG nova.virt.libvirt.host [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.970 221324 DEBUG nova.virt.libvirt.host [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.970 221324 DEBUG nova.virt.libvirt.host [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.972 221324 DEBUG nova.virt.libvirt.driver [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.973 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.974 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.974 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.974 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.975 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.975 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.976 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.976 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.977 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.977 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.978 221324 DEBUG nova.virt.hardware [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:20:51 np0005603622 nova_compute[221301]: 2026-01-31 08:20:51.979 221324 DEBUG nova.objects.instance [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:20:52 np0005603622 nova_compute[221301]: 2026-01-31 08:20:52.071 221324 DEBUG oslo_concurrency.processutils [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:20:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:52.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:52.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:20:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1203466207' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:20:52 np0005603622 nova_compute[221301]: 2026-01-31 08:20:52.971 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.650s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:52 np0005603622 nova_compute[221301]: 2026-01-31 08:20:52.979 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.017 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.096 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.097 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.245 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847638.2439437, d0c13002-57d9-4fad-8579-7343af29719d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.246 221324 INFO nova.compute.manager [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:20:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2554497092' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.327 221324 DEBUG oslo_concurrency.processutils [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.256s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.375 221324 DEBUG nova.compute.manager [None req-6db3cef1-4604-4d82-bc0d-a4a44c833f86 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.382 221324 DEBUG oslo_concurrency.processutils [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:20:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:20:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4064076568' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.911 221324 DEBUG oslo_concurrency.processutils [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.913 221324 DEBUG nova.virt.libvirt.vif [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.914 221324 DEBUG nova.network.os_vif_util [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.916 221324 DEBUG nova.network.os_vif_util [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.918 221324 DEBUG nova.objects.instance [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.970 221324 DEBUG nova.virt.libvirt.driver [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <uuid>d0c13002-57d9-4fad-8579-7343af29719d</uuid>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <name>instance-00000068</name>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestJSON-server-922415262</nova:name>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:20:51</nova:creationTime>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <nova:port uuid="cc59ad05-3242-4d5f-8eec-a2480d285193">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <entry name="serial">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <entry name="uuid">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk.config">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:b9:24:4f"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <target dev="tapcc59ad05-32"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/console.log" append="off"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:20:53 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:20:53 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:20:53 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:20:53 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.972 221324 DEBUG nova.virt.libvirt.driver [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.972 221324 DEBUG nova.virt.libvirt.driver [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.973 221324 DEBUG nova.virt.libvirt.vif [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.973 221324 DEBUG nova.network.os_vif_util [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.974 221324 DEBUG nova.network.os_vif_util [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.974 221324 DEBUG os_vif [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.975 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.975 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.976 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.978 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.978 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc59ad05-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.979 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc59ad05-32, col_values=(('external_ids', {'iface-id': 'cc59ad05-3242-4d5f-8eec-a2480d285193', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:24:4f', 'vm-uuid': 'd0c13002-57d9-4fad-8579-7343af29719d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.980 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:53 np0005603622 NetworkManager[49080]: <info>  [1769847653.9815] manager: (tapcc59ad05-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.982 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.988 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:53 np0005603622 nova_compute[221301]: 2026-01-31 08:20:53.989 221324 INFO os_vif [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:20:54 np0005603622 kernel: tapcc59ad05-32: entered promiscuous mode
Jan 31 03:20:54 np0005603622 NetworkManager[49080]: <info>  [1769847654.0768] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Jan 31 03:20:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:54Z|00475|binding|INFO|Claiming lport cc59ad05-3242-4d5f-8eec-a2480d285193 for this chassis.
Jan 31 03:20:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:54Z|00476|binding|INFO|cc59ad05-3242-4d5f-8eec-a2480d285193: Claiming fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.078 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:54Z|00477|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 ovn-installed in OVS
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.090 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603622 systemd-udevd[261119]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:20:54 np0005603622 systemd-machined[190406]: New machine qemu-47-instance-00000068.
Jan 31 03:20:54 np0005603622 NetworkManager[49080]: <info>  [1769847654.1167] device (tapcc59ad05-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:20:54 np0005603622 NetworkManager[49080]: <info>  [1769847654.1172] device (tapcc59ad05-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:20:54 np0005603622 systemd[1]: Started Virtual Machine qemu-47-instance-00000068.
Jan 31 03:20:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:54Z|00478|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 up in Southbound
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.183 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '7', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.186 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.188 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.201 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d84c882f-d9a9-4141-861d-d7f4756aa7da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.202 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.205 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.206 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f749da-5859-4840-a46a-c38e9e532c11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.207 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4daec676-1de4-4423-951f-dc2a698c1d3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.224 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d49db61e-e98e-401b-a93a-d2cc1cc63c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.240 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15f83449-99a9-4bc3-9aad-a0812e96f153]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.271 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2f06c306-e710-4466-8b7d-dae7eedad1bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 NetworkManager[49080]: <info>  [1769847654.2788] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/222)
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.278 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6c316ca3-3ca6-4de7-a2ef-6ee630bb8452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.310 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd37ba3-c677-4dac-aaa2-fc424b1e74df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.315 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e302a3c9-bd37-400f-a89e-1a18ecbbc63b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 NetworkManager[49080]: <info>  [1769847654.3326] device (tap1186b71b-00): carrier: link connected
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.338 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[459ff513-216c-4f28-9aca-24f99b4c957e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.341553) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654341618, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 367, "num_deletes": 252, "total_data_size": 255724, "memory_usage": 263392, "flush_reason": "Manual Compaction"}
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.354 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e1466f-2538-4141-a801-58a6165173fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690567, 'reachable_time': 38785, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261153, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654363384, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 167472, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48838, "largest_seqno": 49199, "table_properties": {"data_size": 165314, "index_size": 322, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6317, "raw_average_key_size": 20, "raw_value_size": 160767, "raw_average_value_size": 527, "num_data_blocks": 14, "num_entries": 305, "num_filter_entries": 305, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847644, "oldest_key_time": 1769847644, "file_creation_time": 1769847654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 21876 microseconds, and 1299 cpu microseconds.
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.363436) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 167472 bytes OK
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.363461) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.365051) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.365066) EVENT_LOG_v1 {"time_micros": 1769847654365061, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.365084) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 253251, prev total WAL file size 253251, number of live WAL files 2.
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.365621) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353032' seq:72057594037927935, type:22 .. '6D6772737461740031373534' seq:0, type:0; will stop at (end)
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(163KB)], [93(13MB)]
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654365685, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14090081, "oldest_snapshot_seqno": -1}
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.394 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2d96c09d-d6f2-4e88-9f6a-ae639a18223e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 690567, 'tstamp': 690567}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261154, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.409 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e30e1207-a600-441e-a9b1-74062ddc114d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690567, 'reachable_time': 38785, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261155, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.438 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a27333de-7835-4e22-9d50-8da5b8ef9cc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7227 keys, 10248404 bytes, temperature: kUnknown
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654450259, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 10248404, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10201596, "index_size": 27624, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18117, "raw_key_size": 187146, "raw_average_key_size": 25, "raw_value_size": 10074290, "raw_average_value_size": 1393, "num_data_blocks": 1089, "num_entries": 7227, "num_filter_entries": 7227, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.450593) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 10248404 bytes
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.454216) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.3 rd, 120.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 13.3 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(145.3) write-amplify(61.2) OK, records in: 7744, records dropped: 517 output_compression: NoCompression
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.454259) EVENT_LOG_v1 {"time_micros": 1769847654454242, "job": 58, "event": "compaction_finished", "compaction_time_micros": 84742, "compaction_time_cpu_micros": 22963, "output_level": 6, "num_output_files": 1, "total_output_size": 10248404, "num_input_records": 7744, "num_output_records": 7227, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654454480, "job": 58, "event": "table_file_deletion", "file_number": 95}
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847654456046, "job": 58, "event": "table_file_deletion", "file_number": 93}
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.365544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.456132) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.456142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.456145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.456148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:20:54.456150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.504 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5954e46f-0db6-4b28-927e-b3c870b731c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.506 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.507 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.508 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.510 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603622 NetworkManager[49080]: <info>  [1769847654.5117] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 31 03:20:54 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.514 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.517 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.519 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:20:54Z|00479|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.527 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.529 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:20:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:54.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.531 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[571f301f-2cec-49cb-b83f-fc654754aabc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.531 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:20:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:20:54.532 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:20:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.739 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847654.7377129, d0c13002-57d9-4fad-8579-7343af29719d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.741 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.747 221324 DEBUG nova.compute.manager [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.752 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance rebooted successfully.#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.752 221324 DEBUG nova.compute.manager [None req-ad336634-8dd1-48c5-b267-3c91e126c722 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.874 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.879 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:54.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:54 np0005603622 podman[261229]: 2026-01-31 08:20:54.905606795 +0000 UTC m=+0.067034461 container create 21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:20:54 np0005603622 systemd[1]: Started libpod-conmon-21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac.scope.
Jan 31 03:20:54 np0005603622 podman[261229]: 2026-01-31 08:20:54.865755209 +0000 UTC m=+0.027182855 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:20:54 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.972 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.973 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847654.7400591, d0c13002-57d9-4fad-8579-7343af29719d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.974 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Started (Lifecycle Event)#033[00m
Jan 31 03:20:54 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6472c5397f28d4fab5e1c0104774708b60ae5bfb9f0d9587f13f06b9c072799/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:20:54 np0005603622 nova_compute[221301]: 2026-01-31 08:20:54.984 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:20:55 np0005603622 podman[261229]: 2026-01-31 08:20:55.000153689 +0000 UTC m=+0.161581335 container init 21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:20:55 np0005603622 podman[261229]: 2026-01-31 08:20:55.006903011 +0000 UTC m=+0.168330647 container start 21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:20:55 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [NOTICE]   (261248) : New worker (261250) forked
Jan 31 03:20:55 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [NOTICE]   (261248) : Loading success.
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.073 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.077 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.700 221324 DEBUG nova.compute.manager [req-6376e17b-a92e-4cce-a4b5-b526160ca692 req-b53565b0-c3c2-4447-a424-aa3d1192fb69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.701 221324 DEBUG oslo_concurrency.lockutils [req-6376e17b-a92e-4cce-a4b5-b526160ca692 req-b53565b0-c3c2-4447-a424-aa3d1192fb69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.701 221324 DEBUG oslo_concurrency.lockutils [req-6376e17b-a92e-4cce-a4b5-b526160ca692 req-b53565b0-c3c2-4447-a424-aa3d1192fb69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.701 221324 DEBUG oslo_concurrency.lockutils [req-6376e17b-a92e-4cce-a4b5-b526160ca692 req-b53565b0-c3c2-4447-a424-aa3d1192fb69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.701 221324 DEBUG nova.compute.manager [req-6376e17b-a92e-4cce-a4b5-b526160ca692 req-b53565b0-c3c2-4447-a424-aa3d1192fb69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:55 np0005603622 nova_compute[221301]: 2026-01-31 08:20:55.701 221324 WARNING nova.compute.manager [req-6376e17b-a92e-4cce-a4b5-b526160ca692 req-b53565b0-c3c2-4447-a424-aa3d1192fb69 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:20:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:20:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:56.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:20:56 np0005603622 nova_compute[221301]: 2026-01-31 08:20:56.558 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:56.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:57 np0005603622 nova_compute[221301]: 2026-01-31 08:20:57.900 221324 DEBUG nova.compute.manager [req-bbb25b84-07d4-4d85-918b-a70beba12573 req-531d81cc-2284-4636-bbc7-967a2a11bd4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:20:57 np0005603622 nova_compute[221301]: 2026-01-31 08:20:57.901 221324 DEBUG oslo_concurrency.lockutils [req-bbb25b84-07d4-4d85-918b-a70beba12573 req-531d81cc-2284-4636-bbc7-967a2a11bd4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:20:57 np0005603622 nova_compute[221301]: 2026-01-31 08:20:57.902 221324 DEBUG oslo_concurrency.lockutils [req-bbb25b84-07d4-4d85-918b-a70beba12573 req-531d81cc-2284-4636-bbc7-967a2a11bd4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:20:57 np0005603622 nova_compute[221301]: 2026-01-31 08:20:57.904 221324 DEBUG oslo_concurrency.lockutils [req-bbb25b84-07d4-4d85-918b-a70beba12573 req-531d81cc-2284-4636-bbc7-967a2a11bd4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:20:57 np0005603622 nova_compute[221301]: 2026-01-31 08:20:57.906 221324 DEBUG nova.compute.manager [req-bbb25b84-07d4-4d85-918b-a70beba12573 req-531d81cc-2284-4636-bbc7-967a2a11bd4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:20:57 np0005603622 nova_compute[221301]: 2026-01-31 08:20:57.907 221324 WARNING nova.compute.manager [req-bbb25b84-07d4-4d85-918b-a70beba12573 req-531d81cc-2284-4636-bbc7-967a2a11bd4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:20:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:20:58.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:20:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:20:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:20:58.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:20:58 np0005603622 nova_compute[221301]: 2026-01-31 08:20:58.982 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:20:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:00.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:00.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:01 np0005603622 nova_compute[221301]: 2026-01-31 08:21:01.563 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:02.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:02.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:03 np0005603622 nova_compute[221301]: 2026-01-31 08:21:03.682 221324 INFO nova.compute.manager [None req-02439b9b-39a4-4f9a-9b0e-24f1d55d588f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Pausing#033[00m
Jan 31 03:21:03 np0005603622 nova_compute[221301]: 2026-01-31 08:21:03.685 221324 DEBUG nova.objects.instance [None req-02439b9b-39a4-4f9a-9b0e-24f1d55d588f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'flavor' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:03 np0005603622 nova_compute[221301]: 2026-01-31 08:21:03.875 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847663.8747, d0c13002-57d9-4fad-8579-7343af29719d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:21:03 np0005603622 nova_compute[221301]: 2026-01-31 08:21:03.875 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:21:03 np0005603622 nova_compute[221301]: 2026-01-31 08:21:03.877 221324 DEBUG nova.compute.manager [None req-02439b9b-39a4-4f9a-9b0e-24f1d55d588f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:03 np0005603622 nova_compute[221301]: 2026-01-31 08:21:03.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:04 np0005603622 nova_compute[221301]: 2026-01-31 08:21:04.030 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:04 np0005603622 nova_compute[221301]: 2026-01-31 08:21:04.034 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:21:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:04.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:04.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:06 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 03:21:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:06.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:06 np0005603622 nova_compute[221301]: 2026-01-31 08:21:06.567 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:06.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:08.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).paxos(paxos updating c 4017..4705) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 2.557593346s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 03:21:08 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-1[81564]: 2026-01-31T08:21:08.728+0000 7f65ea1de640 -1 mon.compute-1@2(peon).paxos(paxos updating c 4017..4705) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 2.557593346s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for kv_commit, latency = 6.593651295s
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for kv_sync, latency = 6.593651772s
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.594028473s, txc = 0x55bb28600300
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for submit_transact, latency = 6.372465611s
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for throttle_transact, latency = 6.368119240s
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for submit_transact, latency = 6.383985043s
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for throttle_transact, latency = 6.382652283s
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.641544819s, txc = 0x55bb27a06300
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.640821457s, txc = 0x55bb279d7200
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.636855125s, txc = 0x55bb28709800
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.634720325s, txc = 0x55bb27876000
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.455379009s, txc = 0x55bb27a07200
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.454532146s, txc = 0x55bb278c2300
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.441769123s, txc = 0x55bb278ec300
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.435131073s, txc = 0x55bb28708600
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.434397697s, txc = 0x55bb28708000
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.433858871s, txc = 0x55bb278ed800
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.432523251s, txc = 0x55bb29692600
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.440075397s, txc = 0x55bb292c9500
Jan 31 03:21:08 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.429066181s, txc = 0x55bb28600600
Jan 31 03:21:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:08.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:08 np0005603622 nova_compute[221301]: 2026-01-31 08:21:08.988 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:09 np0005603622 nova_compute[221301]: 2026-01-31 08:21:09.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:09 np0005603622 nova_compute[221301]: 2026-01-31 08:21:09.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:21:09 np0005603622 nova_compute[221301]: 2026-01-31 08:21:09.670 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.030 221324 INFO nova.compute.manager [None req-e7870010-0f1e-4a29-a82c-11759ed73c08 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Unpausing#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.031 221324 DEBUG nova.objects.instance [None req-e7870010-0f1e-4a29-a82c-11759ed73c08 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'flavor' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.234 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.262 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847670.2619681, d0c13002-57d9-4fad-8579-7343af29719d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.262 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:21:10 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.270 221324 DEBUG nova.virt.libvirt.guest [None req-e7870010-0f1e-4a29-a82c-11759ed73c08 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.270 221324 DEBUG nova.compute.manager [None req-e7870010-0f1e-4a29-a82c-11759ed73c08 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.442 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.448 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.452 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid d0c13002-57d9-4fad-8579-7343af29719d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.453 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 095f5561-959f-440e-8e6e-2521170b775d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.453 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.454 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "d0c13002-57d9-4fad-8579-7343af29719d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.454 221324 INFO nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.455 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "d0c13002-57d9-4fad-8579-7343af29719d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.455 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.456 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "095f5561-959f-440e-8e6e-2521170b775d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:10.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:10 np0005603622 nova_compute[221301]: 2026-01-31 08:21:10.626 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "095f5561-959f-440e-8e6e-2521170b775d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:10.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:11 np0005603622 nova_compute[221301]: 2026-01-31 08:21:11.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:12.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:12.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:13 np0005603622 podman[261261]: 2026-01-31 08:21:13.176650279 +0000 UTC m=+0.083564418 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:21:13 np0005603622 podman[261260]: 2026-01-31 08:21:13.211964933 +0000 UTC m=+0.121321378 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:21:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:13 np0005603622 nova_compute[221301]: 2026-01-31 08:21:13.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:14.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:14.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:15 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:15Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:21:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:16.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:16 np0005603622 nova_compute[221301]: 2026-01-31 08:21:16.570 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:16.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:18.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:18.586 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:21:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:18.588 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:21:18 np0005603622 nova_compute[221301]: 2026-01-31 08:21:18.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:18.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:18 np0005603622 nova_compute[221301]: 2026-01-31 08:21:18.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:21:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:21:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:20.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:20.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:21 np0005603622 nova_compute[221301]: 2026-01-31 08:21:21.575 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:22.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:22.591 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:22 np0005603622 nova_compute[221301]: 2026-01-31 08:21:22.903 221324 DEBUG oslo_concurrency.lockutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:22 np0005603622 nova_compute[221301]: 2026-01-31 08:21:22.904 221324 DEBUG oslo_concurrency.lockutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:22 np0005603622 nova_compute[221301]: 2026-01-31 08:21:22.905 221324 INFO nova.compute.manager [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Rebooting instance#033[00m
Jan 31 03:21:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:22.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:22 np0005603622 nova_compute[221301]: 2026-01-31 08:21:22.948 221324 DEBUG oslo_concurrency.lockutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:21:22 np0005603622 nova_compute[221301]: 2026-01-31 08:21:22.948 221324 DEBUG oslo_concurrency.lockutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:21:22 np0005603622 nova_compute[221301]: 2026-01-31 08:21:22.949 221324 DEBUG nova.network.neutron [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:21:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:23 np0005603622 nova_compute[221301]: 2026-01-31 08:21:23.995 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.053 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.054 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.054 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.055 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.055 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.056 221324 INFO nova.compute.manager [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Terminating instance#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.057 221324 DEBUG nova.compute.manager [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:21:24 np0005603622 kernel: tap84df7d74-b3 (unregistering): left promiscuous mode
Jan 31 03:21:24 np0005603622 NetworkManager[49080]: <info>  [1769847684.1537] device (tap84df7d74-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.160 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:24Z|00480|binding|INFO|Releasing lport 84df7d74-b3d1-4c52-9231-e0563e927d0f from this chassis (sb_readonly=0)
Jan 31 03:21:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:24Z|00481|binding|INFO|Setting lport 84df7d74-b3d1-4c52-9231-e0563e927d0f down in Southbound
Jan 31 03:21:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:24Z|00482|binding|INFO|Removing iface tap84df7d74-b3 ovn-installed in OVS
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.174 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603622 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Jan 31 03:21:24 np0005603622 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006b.scope: Consumed 16.403s CPU time.
Jan 31 03:21:24 np0005603622 systemd-machined[190406]: Machine qemu-46-instance-0000006b terminated.
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.284 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:e8:24 10.100.0.9'], port_security=['fa:16:3e:49:e8:24 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '095f5561-959f-440e-8e6e-2521170b775d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81dd779a-d164-4109-911b-0834e390c815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782758ebebe64580accb21a22280e02f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16d4de5b-4914-4656-8fe1-e7d7abed377f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f936838-8680-43ea-b7b8-c96b02e037d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=84df7d74-b3d1-4c52-9231-e0563e927d0f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.286 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 84df7d74-b3d1-4c52-9231-e0563e927d0f in datapath 81dd779a-d164-4109-911b-0834e390c815 unbound from our chassis#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.288 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81dd779a-d164-4109-911b-0834e390c815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.292 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[487ba110-d7f7-4357-86cf-b38650127da6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.293 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81dd779a-d164-4109-911b-0834e390c815 namespace which is not needed anymore#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.298 221324 INFO nova.virt.libvirt.driver [-] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Instance destroyed successfully.#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.298 221324 DEBUG nova.objects.instance [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lazy-loading 'resources' on Instance uuid 095f5561-959f-440e-8e6e-2521170b775d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:24 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [NOTICE]   (260858) : haproxy version is 2.8.14-c23fe91
Jan 31 03:21:24 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [NOTICE]   (260858) : path to executable is /usr/sbin/haproxy
Jan 31 03:21:24 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [WARNING]  (260858) : Exiting Master process...
Jan 31 03:21:24 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [WARNING]  (260858) : Exiting Master process...
Jan 31 03:21:24 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [ALERT]    (260858) : Current worker (260860) exited with code 143 (Terminated)
Jan 31 03:21:24 np0005603622 neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815[260854]: [WARNING]  (260858) : All workers exited. Exiting... (0)
Jan 31 03:21:24 np0005603622 systemd[1]: libpod-680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21.scope: Deactivated successfully.
Jan 31 03:21:24 np0005603622 podman[261474]: 2026-01-31 08:21:24.494985087 +0000 UTC m=+0.067966317 container died 680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:21:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21-userdata-shm.mount: Deactivated successfully.
Jan 31 03:21:24 np0005603622 systemd[1]: var-lib-containers-storage-overlay-072d92298eecb34a9d9ade5e58836bb4b2ee475db47c5cb799224a33abf80e9d-merged.mount: Deactivated successfully.
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.558 221324 DEBUG nova.virt.libvirt.vif [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1301696160',display_name='tempest-ListServerFiltersTestJSON-instance-1301696160',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1301696160',id=107,image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:20:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782758ebebe64580accb21a22280e02f',ramdisk_id='',reservation_id='r-308e6c6h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='0864ca59-9877-4e6d-adfc-f0a3204ed8f8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-166541249',owner_user_name='tempest-ListServerFiltersTestJSON-166541249-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:20:24Z,user_data=None,user_id='a80ca71875e8413caa2b52e679e1dd40',uuid=095f5561-959f-440e-8e6e-2521170b775d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.559 221324 DEBUG nova.network.os_vif_util [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converting VIF {"id": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "address": "fa:16:3e:49:e8:24", "network": {"id": "81dd779a-d164-4109-911b-0834e390c815", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1493193016-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782758ebebe64580accb21a22280e02f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84df7d74-b3", "ovs_interfaceid": "84df7d74-b3d1-4c52-9231-e0563e927d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.560 221324 DEBUG nova.network.os_vif_util [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:e8:24,bridge_name='br-int',has_traffic_filtering=True,id=84df7d74-b3d1-4c52-9231-e0563e927d0f,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84df7d74-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.562 221324 DEBUG os_vif [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:e8:24,bridge_name='br-int',has_traffic_filtering=True,id=84df7d74-b3d1-4c52-9231-e0563e927d0f,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84df7d74-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:21:24 np0005603622 podman[261474]: 2026-01-31 08:21:24.565120451 +0000 UTC m=+0.138101721 container cleanup 680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.567 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84df7d74-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:24.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.573 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.578 221324 INFO os_vif [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:e8:24,bridge_name='br-int',has_traffic_filtering=True,id=84df7d74-b3d1-4c52-9231-e0563e927d0f,network=Network(81dd779a-d164-4109-911b-0834e390c815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84df7d74-b3')#033[00m
Jan 31 03:21:24 np0005603622 systemd[1]: libpod-conmon-680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21.scope: Deactivated successfully.
Jan 31 03:21:24 np0005603622 podman[261502]: 2026-01-31 08:21:24.654544717 +0000 UTC m=+0.063161227 container remove 680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.659 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3f24c3-d750-4465-98cf-67044cea1b4a]: (4, ('Sat Jan 31 08:21:24 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815 (680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21)\n680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21\nSat Jan 31 08:21:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-81dd779a-d164-4109-911b-0834e390c815 (680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21)\n680d302850d81f356033dabdd554b6b8088a09652131d95e23f1b0487f171d21\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.661 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[de752181-da2c-4e5a-9b71-e255b3832070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.662 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81dd779a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603622 kernel: tap81dd779a-d0: left promiscuous mode
Jan 31 03:21:24 np0005603622 nova_compute[221301]: 2026-01-31 08:21:24.669 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.675 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e64b62-b4e8-4fa5-89ca-70119f2f1228]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.692 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9df2ece9-9bbf-42b2-bb3c-efe7f9919ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.694 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0fcf8b19-95f6-406d-a9de-7414c3dbb6d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.709 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[25e347cc-302d-45b6-b038-ea58bbe7469f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687129, 'reachable_time': 29830, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261534, 'error': None, 'target': 'ovnmeta-81dd779a-d164-4109-911b-0834e390c815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.713 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81dd779a-d164-4109-911b-0834e390c815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:21:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:24.713 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[bdedff5c-21c3-4b10-a405-11feaaced435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:24 np0005603622 systemd[1]: run-netns-ovnmeta\x2d81dd779a\x2dd164\x2d4109\x2d911b\x2d0834e390c815.mount: Deactivated successfully.
Jan 31 03:21:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:24.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.177 221324 DEBUG nova.compute.manager [req-ada8b49c-d677-47a5-b506-7ee319eed1f6 req-c122eac6-2186-4184-a4fc-59869a1d7fba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received event network-vif-unplugged-84df7d74-b3d1-4c52-9231-e0563e927d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.177 221324 DEBUG oslo_concurrency.lockutils [req-ada8b49c-d677-47a5-b506-7ee319eed1f6 req-c122eac6-2186-4184-a4fc-59869a1d7fba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.179 221324 DEBUG oslo_concurrency.lockutils [req-ada8b49c-d677-47a5-b506-7ee319eed1f6 req-c122eac6-2186-4184-a4fc-59869a1d7fba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.179 221324 DEBUG oslo_concurrency.lockutils [req-ada8b49c-d677-47a5-b506-7ee319eed1f6 req-c122eac6-2186-4184-a4fc-59869a1d7fba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.180 221324 DEBUG nova.compute.manager [req-ada8b49c-d677-47a5-b506-7ee319eed1f6 req-c122eac6-2186-4184-a4fc-59869a1d7fba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] No waiting events found dispatching network-vif-unplugged-84df7d74-b3d1-4c52-9231-e0563e927d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.181 221324 DEBUG nova.compute.manager [req-ada8b49c-d677-47a5-b506-7ee319eed1f6 req-c122eac6-2186-4184-a4fc-59869a1d7fba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received event network-vif-unplugged-84df7d74-b3d1-4c52-9231-e0563e927d0f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.816 221324 INFO nova.virt.libvirt.driver [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Deleting instance files /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d_del#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.817 221324 INFO nova.virt.libvirt.driver [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Deletion of /var/lib/nova/instances/095f5561-959f-440e-8e6e-2521170b775d_del complete#033[00m
Jan 31 03:21:25 np0005603622 nova_compute[221301]: 2026-01-31 08:21:25.906 221324 DEBUG nova.network.neutron [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.021 221324 INFO nova.compute.manager [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Took 1.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.022 221324 DEBUG oslo.service.loopingcall [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.023 221324 DEBUG nova.compute.manager [-] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.023 221324 DEBUG nova.network.neutron [-] [instance: 095f5561-959f-440e-8e6e-2521170b775d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.071 221324 DEBUG oslo_concurrency.lockutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.074 221324 DEBUG nova.compute.manager [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:26.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:26 np0005603622 kernel: tapcc59ad05-32 (unregistering): left promiscuous mode
Jan 31 03:21:26 np0005603622 NetworkManager[49080]: <info>  [1769847686.8075] device (tapcc59ad05-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:21:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:26Z|00483|binding|INFO|Releasing lport cc59ad05-3242-4d5f-8eec-a2480d285193 from this chassis (sb_readonly=0)
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.870 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:26Z|00484|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 down in Southbound
Jan 31 03:21:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:26Z|00485|binding|INFO|Removing iface tapcc59ad05-32 ovn-installed in OVS
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.873 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:26 np0005603622 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 31 03:21:26 np0005603622 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000068.scope: Consumed 15.498s CPU time.
Jan 31 03:21:26 np0005603622 systemd-machined[190406]: Machine qemu-47-instance-00000068 terminated.
Jan 31 03:21:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:26.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:26 np0005603622 NetworkManager[49080]: <info>  [1769847686.9513] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Jan 31 03:21:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:26.951 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:21:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:26.954 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:21:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:26.956 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:21:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:26.958 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[803edfb7-264d-4285-870f-b69a4f5639bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:26.959 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.968 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance destroyed successfully.#033[00m
Jan 31 03:21:26 np0005603622 nova_compute[221301]: 2026-01-31 08:21:26.969 221324 DEBUG nova.objects.instance [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.063 221324 DEBUG nova.virt.libvirt.vif [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:21:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.064 221324 DEBUG nova.network.os_vif_util [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.065 221324 DEBUG nova.network.os_vif_util [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.066 221324 DEBUG os_vif [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.069 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.069 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc59ad05-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.073 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.075 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.077 221324 INFO os_vif [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:21:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.089 221324 DEBUG nova.virt.libvirt.driver [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Start _get_guest_xml network_info=[{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:21:27 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [NOTICE]   (261248) : haproxy version is 2.8.14-c23fe91
Jan 31 03:21:27 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [NOTICE]   (261248) : path to executable is /usr/sbin/haproxy
Jan 31 03:21:27 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [WARNING]  (261248) : Exiting Master process...
Jan 31 03:21:27 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [WARNING]  (261248) : Exiting Master process...
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.096 221324 WARNING nova.virt.libvirt.driver [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:21:27 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [ALERT]    (261248) : Current worker (261250) exited with code 143 (Terminated)
Jan 31 03:21:27 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261244]: [WARNING]  (261248) : All workers exited. Exiting... (0)
Jan 31 03:21:27 np0005603622 systemd[1]: libpod-21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac.scope: Deactivated successfully.
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.104 221324 DEBUG nova.virt.libvirt.host [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.106 221324 DEBUG nova.virt.libvirt.host [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:21:27 np0005603622 podman[261619]: 2026-01-31 08:21:27.10824416 +0000 UTC m=+0.052739875 container died 21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.112 221324 DEBUG nova.virt.libvirt.host [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.113 221324 DEBUG nova.virt.libvirt.host [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.115 221324 DEBUG nova.virt.libvirt.driver [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.115 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.116 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.116 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.116 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.117 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.117 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.118 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.118 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.118 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.119 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.119 221324 DEBUG nova.virt.hardware [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.119 221324 DEBUG nova.objects.instance [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:27 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac-userdata-shm.mount: Deactivated successfully.
Jan 31 03:21:27 np0005603622 systemd[1]: var-lib-containers-storage-overlay-b6472c5397f28d4fab5e1c0104774708b60ae5bfb9f0d9587f13f06b9c072799-merged.mount: Deactivated successfully.
Jan 31 03:21:27 np0005603622 podman[261619]: 2026-01-31 08:21:27.147267104 +0000 UTC m=+0.091762809 container cleanup 21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:21:27 np0005603622 systemd[1]: libpod-conmon-21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac.scope: Deactivated successfully.
Jan 31 03:21:27 np0005603622 podman[261645]: 2026-01-31 08:21:27.221487849 +0000 UTC m=+0.053723952 container remove 21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.225 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[503acd51-de12-4403-a1f6-8cbecb88ebe0]: (4, ('Sat Jan 31 08:21:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac)\n21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac\nSat Jan 31 08:21:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac)\n21815ce62e516180ceb968d95c4c57fe7ac6864628af5b629e9c47b53e42c9ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.227 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88f42f3c-643d-4cfa-a85a-4170ceaf6866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.228 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.230 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:27 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.241 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.242 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f89588ea-6d11-42d8-a2cb-0ceb402d128a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.250 221324 DEBUG oslo_concurrency.processutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.262 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c6342c-7656-4c07-b5b5-00cc53e92778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.264 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[da7e65b3-f1a6-4e6e-bd9b-50abbef06212]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.279 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7eb7d5-9953-450e-9d74-2c3bcb28cd46]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690560, 'reachable_time': 42000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261659, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:27 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.282 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:21:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:27.282 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[610d5ba5-4a61-41bd-a403-e678526c9f97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:21:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/301901232' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.724 221324 DEBUG oslo_concurrency.processutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.777 221324 DEBUG oslo_concurrency.processutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.804 221324 DEBUG nova.compute.manager [req-67ee0576-9faa-4d58-be18-898982616aa9 req-9ec4b01a-a285-4ffa-b86d-cc552dcec21e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received event network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.805 221324 DEBUG oslo_concurrency.lockutils [req-67ee0576-9faa-4d58-be18-898982616aa9 req-9ec4b01a-a285-4ffa-b86d-cc552dcec21e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "095f5561-959f-440e-8e6e-2521170b775d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.806 221324 DEBUG oslo_concurrency.lockutils [req-67ee0576-9faa-4d58-be18-898982616aa9 req-9ec4b01a-a285-4ffa-b86d-cc552dcec21e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.807 221324 DEBUG oslo_concurrency.lockutils [req-67ee0576-9faa-4d58-be18-898982616aa9 req-9ec4b01a-a285-4ffa-b86d-cc552dcec21e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.807 221324 DEBUG nova.compute.manager [req-67ee0576-9faa-4d58-be18-898982616aa9 req-9ec4b01a-a285-4ffa-b86d-cc552dcec21e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] No waiting events found dispatching network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:27 np0005603622 nova_compute[221301]: 2026-01-31 08:21:27.808 221324 WARNING nova.compute.manager [req-67ee0576-9faa-4d58-be18-898982616aa9 req-9ec4b01a-a285-4ffa-b86d-cc552dcec21e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received unexpected event network-vif-plugged-84df7d74-b3d1-4c52-9231-e0563e927d0f for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.160 221324 DEBUG nova.network.neutron [-] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:21:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:21:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/644906341' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.317 221324 DEBUG oslo_concurrency.processutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.539s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.319 221324 DEBUG nova.virt.libvirt.vif [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:21:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.319 221324 DEBUG nova.network.os_vif_util [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.320 221324 DEBUG nova.network.os_vif_util [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.322 221324 DEBUG nova.objects.instance [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.575 221324 INFO nova.compute.manager [-] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Took 2.55 seconds to deallocate network for instance.#033[00m
Jan 31 03:21:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:28.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.581 221324 DEBUG nova.virt.libvirt.driver [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <uuid>d0c13002-57d9-4fad-8579-7343af29719d</uuid>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <name>instance-00000068</name>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestJSON-server-922415262</nova:name>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:21:27</nova:creationTime>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <nova:port uuid="cc59ad05-3242-4d5f-8eec-a2480d285193">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <entry name="serial">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <entry name="uuid">d0c13002-57d9-4fad-8579-7343af29719d</entry>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/d0c13002-57d9-4fad-8579-7343af29719d_disk.config">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:b9:24:4f"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <target dev="tapcc59ad05-32"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/console.log" append="off"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:21:28 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:21:28 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:21:28 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:21:28 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.583 221324 DEBUG nova.virt.libvirt.driver [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.583 221324 DEBUG nova.virt.libvirt.driver [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.584 221324 DEBUG nova.virt.libvirt.vif [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:21:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.584 221324 DEBUG nova.network.os_vif_util [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.585 221324 DEBUG nova.network.os_vif_util [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.585 221324 DEBUG os_vif [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.586 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.587 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.588 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.590 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.590 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc59ad05-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.591 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc59ad05-32, col_values=(('external_ids', {'iface-id': 'cc59ad05-3242-4d5f-8eec-a2480d285193', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:24:4f', 'vm-uuid': 'd0c13002-57d9-4fad-8579-7343af29719d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:28 np0005603622 NetworkManager[49080]: <info>  [1769847688.6207] manager: (tapcc59ad05-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.619 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.624 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.630 221324 INFO os_vif [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:21:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:28 np0005603622 kernel: tapcc59ad05-32: entered promiscuous mode
Jan 31 03:21:28 np0005603622 NetworkManager[49080]: <info>  [1769847688.7116] manager: (tapcc59ad05-32): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.713 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:28Z|00486|binding|INFO|Claiming lport cc59ad05-3242-4d5f-8eec-a2480d285193 for this chassis.
Jan 31 03:21:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:28Z|00487|binding|INFO|cc59ad05-3242-4d5f-8eec-a2480d285193: Claiming fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:21:28 np0005603622 systemd-udevd[261660]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:21:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:28Z|00488|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 ovn-installed in OVS
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.720 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:28 np0005603622 nova_compute[221301]: 2026-01-31 08:21:28.722 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:28 np0005603622 NetworkManager[49080]: <info>  [1769847688.7285] device (tapcc59ad05-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:21:28 np0005603622 NetworkManager[49080]: <info>  [1769847688.7290] device (tapcc59ad05-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:21:28 np0005603622 systemd-machined[190406]: New machine qemu-48-instance-00000068.
Jan 31 03:21:28 np0005603622 systemd[1]: Started Virtual Machine qemu-48-instance-00000068.
Jan 31 03:21:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:28.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.036 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.036 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.115 221324 DEBUG oslo_concurrency.processutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:29 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:29Z|00489|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 up in Southbound
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.131 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '9', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.133 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.134 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.142 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74360a66-fe16-488b-9024-feb3203e6c7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.143 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.144 221324 DEBUG nova.compute.manager [req-98210ae8-860a-44a4-aaa8-2e59c401545c req-cfdfbfdb-f5ad-4942-8cf2-799c4a2c7143 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.144 221324 DEBUG oslo_concurrency.lockutils [req-98210ae8-860a-44a4-aaa8-2e59c401545c req-cfdfbfdb-f5ad-4942-8cf2-799c4a2c7143 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.145 221324 DEBUG oslo_concurrency.lockutils [req-98210ae8-860a-44a4-aaa8-2e59c401545c req-cfdfbfdb-f5ad-4942-8cf2-799c4a2c7143 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.145 221324 DEBUG oslo_concurrency.lockutils [req-98210ae8-860a-44a4-aaa8-2e59c401545c req-cfdfbfdb-f5ad-4942-8cf2-799c4a2c7143 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.145 221324 DEBUG nova.compute.manager [req-98210ae8-860a-44a4-aaa8-2e59c401545c req-cfdfbfdb-f5ad-4942-8cf2-799c4a2c7143 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.145 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.145 221324 WARNING nova.compute.manager [req-98210ae8-860a-44a4-aaa8-2e59c401545c req-cfdfbfdb-f5ad-4942-8cf2-799c4a2c7143 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.145 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f35afc-fb6d-49af-8d95-0620f538ef4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.146 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[864b6096-1cb9-47a6-a113-bef6c66ecb3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.160 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[20cd0093-352d-44d3-93ec-968654ce4cb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.174 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7169c9-b5cc-4e2c-9c43-95ebc4371471]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.206 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7d6b65-f911-4e42-8a37-27e4299bd920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 NetworkManager[49080]: <info>  [1769847689.2164] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/227)
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.215 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ec043c7c-f3cb-4830-9bfa-141cfee5ee12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.240 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[814dfea5-856b-43ea-be9f-21ca37e5321a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.243 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef69cd1-8ff8-4b72-b03a-74ed4037561b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.248 221324 DEBUG nova.compute.manager [req-6e133eca-de6c-44dc-a32b-c2edc24764e1 req-6a227ca7-a985-4a39-b23b-28598339791a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Received event network-vif-deleted-84df7d74-b3d1-4c52-9231-e0563e927d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:29 np0005603622 NetworkManager[49080]: <info>  [1769847689.2604] device (tap1186b71b-00): carrier: link connected
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.264 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8937cc3c-ede7-46ad-b2ac-206668a83377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.281 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9c494e4e-a60e-4562-9bf0-40e4ba4ad958]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694060, 'reachable_time': 33846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261832, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.296 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8079f11f-2cfc-4694-9501-bbed1fc28b94]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694060, 'tstamp': 694060}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261834, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.318 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[302f149d-21a1-4a51-9967-72679b2b918e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694060, 'reachable_time': 33846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261835, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.347 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[815bb43b-6095-44a8-bfdd-37e0a5faa290]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.348 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for d0c13002-57d9-4fad-8579-7343af29719d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.348 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847689.3463864, d0c13002-57d9-4fad-8579-7343af29719d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.352 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.354 221324 DEBUG nova.compute.manager [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.374 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance rebooted successfully.#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.375 221324 DEBUG nova.compute.manager [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.404 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e61a7cc9-1516-47ca-9609-31e0b77944e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.405 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.406 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.406 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:29 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:21:29 np0005603622 NetworkManager[49080]: <info>  [1769847689.4091] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.411 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:21:29 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:29Z|00490|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.421 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.422 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8715fe49-fdc6-4102-a392-4c72d5c543d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.423 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:21:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:29.425 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:21:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1826583249' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.574 221324 DEBUG oslo_concurrency.processutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.580 221324 DEBUG nova.compute.provider_tree [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.606 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:29 np0005603622 nova_compute[221301]: 2026-01-31 08:21:29.609 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:21:29 np0005603622 podman[261870]: 2026-01-31 08:21:29.850376611 +0000 UTC m=+0.105187061 container create 9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:21:29 np0005603622 podman[261870]: 2026-01-31 08:21:29.764015549 +0000 UTC m=+0.018825989 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:21:29 np0005603622 systemd[1]: Started libpod-conmon-9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce.scope.
Jan 31 03:21:29 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:21:29 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ff984f57afd331ab59c545eaa389ae03fcaee328da66176fcb964ccbed5def2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:21:29 np0005603622 podman[261870]: 2026-01-31 08:21:29.952667984 +0000 UTC m=+0.207478444 container init 9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:21:29 np0005603622 podman[261870]: 2026-01-31 08:21:29.957830204 +0000 UTC m=+0.212640624 container start 9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:21:29 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261884]: [NOTICE]   (261888) : New worker (261890) forked
Jan 31 03:21:29 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261884]: [NOTICE]   (261888) : Loading success.
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.001 221324 DEBUG oslo_concurrency.lockutils [None req-8ea412ac-1976-4c9f-b05a-d46154f0bfe9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.006 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847689.3464966, d0c13002-57d9-4fad-8579-7343af29719d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.007 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Started (Lifecycle Event)#033[00m
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.010 221324 DEBUG nova.scheduler.client.report [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.283 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.287 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.442 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:30.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:30 np0005603622 nova_compute[221301]: 2026-01-31 08:21:30.774 221324 INFO nova.scheduler.client.report [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Deleted allocations for instance 095f5561-959f-440e-8e6e-2521170b775d#033[00m
Jan 31 03:21:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:30.875 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:30.876 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:21:30.876 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:30.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.259 221324 DEBUG oslo_concurrency.lockutils [None req-c8230dfa-dbbc-4d6d-95c8-0eb386d4838b a80ca71875e8413caa2b52e679e1dd40 782758ebebe64580accb21a22280e02f - - default default] Lock "095f5561-959f-440e-8e6e-2521170b775d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.362 221324 DEBUG nova.compute.manager [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.363 221324 DEBUG oslo_concurrency.lockutils [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.363 221324 DEBUG oslo_concurrency.lockutils [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.363 221324 DEBUG oslo_concurrency.lockutils [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.363 221324 DEBUG nova.compute.manager [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.364 221324 WARNING nova.compute.manager [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.364 221324 DEBUG nova.compute.manager [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.364 221324 DEBUG oslo_concurrency.lockutils [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.364 221324 DEBUG oslo_concurrency.lockutils [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.365 221324 DEBUG oslo_concurrency.lockutils [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.365 221324 DEBUG nova.compute.manager [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.365 221324 WARNING nova.compute.manager [req-24a21b58-305d-49e1-9002-4cff63c4720e req-5b6c2fbc-63cf-4bae-8467-964a20e63a7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:21:31 np0005603622 nova_compute[221301]: 2026-01-31 08:21:31.581 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:32.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:32.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:33 np0005603622 nova_compute[221301]: 2026-01-31 08:21:33.676 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:33 np0005603622 nova_compute[221301]: 2026-01-31 08:21:33.810 221324 DEBUG nova.compute.manager [req-724f583e-8569-4fcd-9261-33507de40cbd req-88619404-a235-4e78-b900-85e051af20b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:21:33 np0005603622 nova_compute[221301]: 2026-01-31 08:21:33.814 221324 DEBUG oslo_concurrency.lockutils [req-724f583e-8569-4fcd-9261-33507de40cbd req-88619404-a235-4e78-b900-85e051af20b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:33 np0005603622 nova_compute[221301]: 2026-01-31 08:21:33.814 221324 DEBUG oslo_concurrency.lockutils [req-724f583e-8569-4fcd-9261-33507de40cbd req-88619404-a235-4e78-b900-85e051af20b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:33 np0005603622 nova_compute[221301]: 2026-01-31 08:21:33.815 221324 DEBUG oslo_concurrency.lockutils [req-724f583e-8569-4fcd-9261-33507de40cbd req-88619404-a235-4e78-b900-85e051af20b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:33 np0005603622 nova_compute[221301]: 2026-01-31 08:21:33.816 221324 DEBUG nova.compute.manager [req-724f583e-8569-4fcd-9261-33507de40cbd req-88619404-a235-4e78-b900-85e051af20b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:21:33 np0005603622 nova_compute[221301]: 2026-01-31 08:21:33.816 221324 WARNING nova.compute.manager [req-724f583e-8569-4fcd-9261-33507de40cbd req-88619404-a235-4e78-b900-85e051af20b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:21:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:34.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:34.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:36 np0005603622 nova_compute[221301]: 2026-01-31 08:21:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:36 np0005603622 nova_compute[221301]: 2026-01-31 08:21:36.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:36 np0005603622 nova_compute[221301]: 2026-01-31 08:21:36.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:21:36 np0005603622 nova_compute[221301]: 2026-01-31 08:21:36.584 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:36.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:36.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:38 np0005603622 nova_compute[221301]: 2026-01-31 08:21:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:38.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:38 np0005603622 nova_compute[221301]: 2026-01-31 08:21:38.678 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:21:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:38.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:21:39 np0005603622 nova_compute[221301]: 2026-01-31 08:21:39.296 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847684.2942464, 095f5561-959f-440e-8e6e-2521170b775d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:21:39 np0005603622 nova_compute[221301]: 2026-01-31 08:21:39.297 221324 INFO nova.compute.manager [-] [instance: 095f5561-959f-440e-8e6e-2521170b775d] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:21:39 np0005603622 nova_compute[221301]: 2026-01-31 08:21:39.514 221324 DEBUG nova.compute.manager [None req-75cebf3b-52e2-4bd0-8d0d-c80b5b12c205 - - - - - -] [instance: 095f5561-959f-440e-8e6e-2521170b775d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:21:40 np0005603622 nova_compute[221301]: 2026-01-31 08:21:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:40.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:40.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:41 np0005603622 nova_compute[221301]: 2026-01-31 08:21:41.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:41 np0005603622 nova_compute[221301]: 2026-01-31 08:21:41.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:41 np0005603622 nova_compute[221301]: 2026-01-31 08:21:41.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:21:41 np0005603622 nova_compute[221301]: 2026-01-31 08:21:41.532 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:21:41 np0005603622 nova_compute[221301]: 2026-01-31 08:21:41.587 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:42 np0005603622 nova_compute[221301]: 2026-01-31 08:21:42.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:42Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:24:4f 10.100.0.4
Jan 31 03:21:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:42.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:42.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:43 np0005603622 nova_compute[221301]: 2026-01-31 08:21:43.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:43 np0005603622 nova_compute[221301]: 2026-01-31 08:21:43.537 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:43 np0005603622 nova_compute[221301]: 2026-01-31 08:21:43.537 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:43 np0005603622 nova_compute[221301]: 2026-01-31 08:21:43.537 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:43 np0005603622 nova_compute[221301]: 2026-01-31 08:21:43.537 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:21:43 np0005603622 nova_compute[221301]: 2026-01-31 08:21:43.538 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:43 np0005603622 nova_compute[221301]: 2026-01-31 08:21:43.681 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:21:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/302466727' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.004 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:44 np0005603622 podman[261922]: 2026-01-31 08:21:44.197121093 +0000 UTC m=+0.150337102 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:21:44 np0005603622 podman[261923]: 2026-01-31 08:21:44.197043941 +0000 UTC m=+0.141613376 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.254 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.255 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.495 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.497 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4355MB free_disk=20.784988403320312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.498 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.499 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:21:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:44.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.851 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance d0c13002-57d9-4fad-8579-7343af29719d actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.852 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.852 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:21:44 np0005603622 nova_compute[221301]: 2026-01-31 08:21:44.910 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:21:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:44.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:21:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/565660977' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:21:45 np0005603622 nova_compute[221301]: 2026-01-31 08:21:45.407 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:21:45 np0005603622 nova_compute[221301]: 2026-01-31 08:21:45.415 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:21:45 np0005603622 nova_compute[221301]: 2026-01-31 08:21:45.830 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:21:46 np0005603622 nova_compute[221301]: 2026-01-31 08:21:46.029 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:21:46 np0005603622 nova_compute[221301]: 2026-01-31 08:21:46.030 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:21:46 np0005603622 nova_compute[221301]: 2026-01-31 08:21:46.592 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:46.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:46.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:47 np0005603622 nova_compute[221301]: 2026-01-31 08:21:47.030 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:21:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:21:47Z|00491|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:21:47 np0005603622 nova_compute[221301]: 2026-01-31 08:21:47.159 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:48.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:48 np0005603622 nova_compute[221301]: 2026-01-31 08:21:48.685 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:48.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:21:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:50.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:21:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:50.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:51 np0005603622 nova_compute[221301]: 2026-01-31 08:21:51.596 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:52.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:52.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:53 np0005603622 nova_compute[221301]: 2026-01-31 08:21:53.688 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:54.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:54.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:56 np0005603622 nova_compute[221301]: 2026-01-31 08:21:56.599 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:56.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:56.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.062806) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718062930, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 869, "num_deletes": 251, "total_data_size": 1675372, "memory_usage": 1698720, "flush_reason": "Manual Compaction"}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718076344, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 1105091, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49204, "largest_seqno": 50068, "table_properties": {"data_size": 1100998, "index_size": 1809, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9498, "raw_average_key_size": 19, "raw_value_size": 1092719, "raw_average_value_size": 2295, "num_data_blocks": 79, "num_entries": 476, "num_filter_entries": 476, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847654, "oldest_key_time": 1769847654, "file_creation_time": 1769847718, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 13569 microseconds, and 5158 cpu microseconds.
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.076393) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 1105091 bytes OK
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.076420) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.078399) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.078413) EVENT_LOG_v1 {"time_micros": 1769847718078407, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.078433) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 1670910, prev total WAL file size 1670910, number of live WAL files 2.
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.079122) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(1079KB)], [96(10008KB)]
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718079205, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 11353495, "oldest_snapshot_seqno": -1}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 7186 keys, 9512235 bytes, temperature: kUnknown
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718178620, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 9512235, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9466428, "index_size": 26751, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17989, "raw_key_size": 187074, "raw_average_key_size": 26, "raw_value_size": 9340490, "raw_average_value_size": 1299, "num_data_blocks": 1047, "num_entries": 7186, "num_filter_entries": 7186, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847718, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.178939) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 9512235 bytes
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.180779) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.1 rd, 95.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 9.8 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(18.9) write-amplify(8.6) OK, records in: 7703, records dropped: 517 output_compression: NoCompression
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.180801) EVENT_LOG_v1 {"time_micros": 1769847718180791, "job": 60, "event": "compaction_finished", "compaction_time_micros": 99537, "compaction_time_cpu_micros": 40775, "output_level": 6, "num_output_files": 1, "total_output_size": 9512235, "num_input_records": 7703, "num_output_records": 7186, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718181050, "job": 60, "event": "table_file_deletion", "file_number": 98}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847718182108, "job": 60, "event": "table_file_deletion", "file_number": 96}
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.078870) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.182346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.182357) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.182361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.182364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:21:58.182367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:21:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:21:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:21:58.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:21:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:21:58 np0005603622 nova_compute[221301]: 2026-01-31 08:21:58.732 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:21:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:21:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:21:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:21:58.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:00.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:00.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:01 np0005603622 nova_compute[221301]: 2026-01-31 08:22:01.601 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 34K writes, 134K keys, 34K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.04 MB/s#012Cumulative WAL: 34K writes, 12K syncs, 2.82 writes per sync, written: 0.13 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5710 writes, 21K keys, 5710 commit groups, 1.0 writes per commit group, ingest: 21.07 MB, 0.04 MB/s#012Interval WAL: 5711 writes, 2324 syncs, 2.46 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:22:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:02.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:02.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:03 np0005603622 nova_compute[221301]: 2026-01-31 08:22:03.787 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:04.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:04.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:06 np0005603622 nova_compute[221301]: 2026-01-31 08:22:06.603 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:06.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:06.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:07 np0005603622 nova_compute[221301]: 2026-01-31 08:22:07.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:08.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:08 np0005603622 nova_compute[221301]: 2026-01-31 08:22:08.789 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:09.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:10.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:11.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:11 np0005603622 nova_compute[221301]: 2026-01-31 08:22:11.606 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:12.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:13.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:13 np0005603622 nova_compute[221301]: 2026-01-31 08:22:13.835 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:14.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:15.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:15 np0005603622 podman[261994]: 2026-01-31 08:22:15.158250926 +0000 UTC m=+0.072493309 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:22:15 np0005603622 podman[261993]: 2026-01-31 08:22:15.180568519 +0000 UTC m=+0.101237215 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:22:16 np0005603622 nova_compute[221301]: 2026-01-31 08:22:16.609 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:16.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:17.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:18.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:18 np0005603622 nova_compute[221301]: 2026-01-31 08:22:18.845 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:19.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:19.493 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:19 np0005603622 nova_compute[221301]: 2026-01-31 08:22:19.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:19.495 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:22:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:20.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:21.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:21 np0005603622 nova_compute[221301]: 2026-01-31 08:22:21.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:22.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:22.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:23.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:23 np0005603622 nova_compute[221301]: 2026-01-31 08:22:23.489 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:23 np0005603622 nova_compute[221301]: 2026-01-31 08:22:23.847 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:24.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:25.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:26 np0005603622 nova_compute[221301]: 2026-01-31 08:22:26.615 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:26.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:27.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3600.0 total, 600.0 interval
Cumulative writes: 9914 writes, 50K keys, 9914 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s
Cumulative WAL: 9914 writes, 9914 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1558 writes, 7892 keys, 1558 commit groups, 1.0 writes per commit group, ingest: 15.40 MB, 0.03 MB/s
Interval WAL: 1558 writes, 1558 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     49.7      1.24              0.19        30    0.041       0      0       0.0       0.0
  L6      1/0    9.07 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   4.4     99.1     83.0      3.28              0.86        29    0.113    174K    16K       0.0       0.0
 Sum      1/0    9.07 MB   0.0      0.3     0.1      0.3       0.3      0.1       0.0   5.4     71.9     73.9      4.52              1.05        59    0.077    174K    16K       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     82.0     79.7      1.02              0.34        14    0.073     52K   3633       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0     99.1     83.0      3.28              0.86        29    0.113    174K    16K       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     49.8      1.24              0.19        29    0.043       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 3600.0 total, 600.0 interval
Flush(GB): cumulative 0.060, interval 0.010
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.33 GB write, 0.09 MB/s write, 0.32 GB read, 0.09 MB/s read, 4.5 seconds
Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 1.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 36.28 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000421 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(2081,34.96 MB,11.4995%) FilterBlock(59,493.67 KB,0.158586%) IndexBlock(59,859.45 KB,0.276089%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 31 03:22:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:28.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:28 np0005603622 nova_compute[221301]: 2026-01-31 08:22:28.905 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:29.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:22:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:22:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:30.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:30.876 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:30.877 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:30.878 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:31.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:31 np0005603622 nova_compute[221301]: 2026-01-31 08:22:31.618 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:32 np0005603622 nova_compute[221301]: 2026-01-31 08:22:32.572 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:32 np0005603622 nova_compute[221301]: 2026-01-31 08:22:32.572 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:32.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:32 np0005603622 nova_compute[221301]: 2026-01-31 08:22:32.753 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:22:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:33.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:33 np0005603622 nova_compute[221301]: 2026-01-31 08:22:33.095 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:33 np0005603622 nova_compute[221301]: 2026-01-31 08:22:33.096 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:33 np0005603622 nova_compute[221301]: 2026-01-31 08:22:33.105 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:22:33 np0005603622 nova_compute[221301]: 2026-01-31 08:22:33.106 221324 INFO nova.compute.claims [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:22:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:33 np0005603622 nova_compute[221301]: 2026-01-31 08:22:33.911 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:33 np0005603622 nova_compute[221301]: 2026-01-31 08:22:33.935 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:22:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/312419286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:22:34 np0005603622 nova_compute[221301]: 2026-01-31 08:22:34.398 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:34 np0005603622 nova_compute[221301]: 2026-01-31 08:22:34.405 221324 DEBUG nova.compute.provider_tree [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:22:34 np0005603622 nova_compute[221301]: 2026-01-31 08:22:34.430 221324 DEBUG nova.scheduler.client.report [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:22:34 np0005603622 nova_compute[221301]: 2026-01-31 08:22:34.596 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.500s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:34 np0005603622 nova_compute[221301]: 2026-01-31 08:22:34.597 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:22:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:34.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:34 np0005603622 nova_compute[221301]: 2026-01-31 08:22:34.815 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:22:34 np0005603622 nova_compute[221301]: 2026-01-31 08:22:34.816 221324 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:22:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:35.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:35 np0005603622 nova_compute[221301]: 2026-01-31 08:22:35.216 221324 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:22:35 np0005603622 nova_compute[221301]: 2026-01-31 08:22:35.393 221324 DEBUG nova.policy [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c086a82bd0384612a78981006889df41', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96de645f38844180b404d1a7cf7dd460', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:22:35 np0005603622 nova_compute[221301]: 2026-01-31 08:22:35.530 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:22:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:36Z|00492|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.156 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.474 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.476 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.477 221324 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Creating image(s)#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.516 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.563 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.595 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.600 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.627 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.671 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.673 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.675 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.676 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:36.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.714 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:36 np0005603622 nova_compute[221301]: 2026-01-31 08:22:36.721 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7d560595-056b-4b0e-a456-a73fcc457613_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:37 np0005603622 nova_compute[221301]: 2026-01-31 08:22:37.217 221324 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Successfully created port: edff8c3b-f880-4e97-ac56-580e4793ca67 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:22:38 np0005603622 nova_compute[221301]: 2026-01-31 08:22:38.041 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7d560595-056b-4b0e-a456-a73fcc457613_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:38 np0005603622 nova_compute[221301]: 2026-01-31 08:22:38.150 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] resizing rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:22:38 np0005603622 nova_compute[221301]: 2026-01-31 08:22:38.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:38 np0005603622 nova_compute[221301]: 2026-01-31 08:22:38.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:38 np0005603622 nova_compute[221301]: 2026-01-31 08:22:38.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:22:38 np0005603622 nova_compute[221301]: 2026-01-31 08:22:38.535 221324 DEBUG nova.objects.instance [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'migration_context' on Instance uuid 7d560595-056b-4b0e-a456-a73fcc457613 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:38.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:22:38 np0005603622 nova_compute[221301]: 2026-01-31 08:22:38.914 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:39 np0005603622 nova_compute[221301]: 2026-01-31 08:22:39.084 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:22:39 np0005603622 nova_compute[221301]: 2026-01-31 08:22:39.085 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Ensure instance console log exists: /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:22:39 np0005603622 nova_compute[221301]: 2026-01-31 08:22:39.086 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:39 np0005603622 nova_compute[221301]: 2026-01-31 08:22:39.087 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:39 np0005603622 nova_compute[221301]: 2026-01-31 08:22:39.087 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:40.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:41.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.183 221324 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Successfully updated port: edff8c3b-f880-4e97-ac56-580e4793ca67 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.347 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "refresh_cache-7d560595-056b-4b0e-a456-a73fcc457613" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.348 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquired lock "refresh_cache-7d560595-056b-4b0e-a456-a73fcc457613" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.348 221324 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.401 221324 DEBUG nova.compute.manager [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received event network-changed-edff8c3b-f880-4e97-ac56-580e4793ca67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.402 221324 DEBUG nova.compute.manager [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Refreshing instance network info cache due to event network-changed-edff8c3b-f880-4e97-ac56-580e4793ca67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.403 221324 DEBUG oslo_concurrency.lockutils [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7d560595-056b-4b0e-a456-a73fcc457613" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:22:41 np0005603622 nova_compute[221301]: 2026-01-31 08:22:41.622 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:42 np0005603622 nova_compute[221301]: 2026-01-31 08:22:42.175 221324 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:22:42 np0005603622 nova_compute[221301]: 2026-01-31 08:22:42.399 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:42 np0005603622 nova_compute[221301]: 2026-01-31 08:22:42.400 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:22:42 np0005603622 nova_compute[221301]: 2026-01-31 08:22:42.400 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:22:42 np0005603622 nova_compute[221301]: 2026-01-31 08:22:42.672 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:22:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:42.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:43.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:43 np0005603622 nova_compute[221301]: 2026-01-31 08:22:43.651 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:22:43 np0005603622 nova_compute[221301]: 2026-01-31 08:22:43.652 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:22:43 np0005603622 nova_compute[221301]: 2026-01-31 08:22:43.652 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:22:43 np0005603622 nova_compute[221301]: 2026-01-31 08:22:43.652 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:43 np0005603622 nova_compute[221301]: 2026-01-31 08:22:43.935 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:44.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:45.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:46 np0005603622 podman[262528]: 2026-01-31 08:22:46.187985197 +0000 UTC m=+0.105953762 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:22:46 np0005603622 podman[262527]: 2026-01-31 08:22:46.231248676 +0000 UTC m=+0.149277843 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.267 221324 DEBUG nova.network.neutron [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Updating instance_info_cache with network_info: [{"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.401 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Releasing lock "refresh_cache-7d560595-056b-4b0e-a456-a73fcc457613" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.402 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Instance network_info: |[{"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.403 221324 DEBUG oslo_concurrency.lockutils [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7d560595-056b-4b0e-a456-a73fcc457613" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.403 221324 DEBUG nova.network.neutron [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Refreshing network info cache for port edff8c3b-f880-4e97-ac56-580e4793ca67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.407 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Start _get_guest_xml network_info=[{"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.413 221324 WARNING nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.435 221324 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.436 221324 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.444 221324 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.445 221324 DEBUG nova.virt.libvirt.host [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.447 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.447 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.448 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.448 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.448 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.448 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.448 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.449 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.449 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.449 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.449 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.449 221324 DEBUG nova.virt.hardware [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.453 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:46 np0005603622 nova_compute[221301]: 2026-01-31 08:22:46.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:46.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:47.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2930764358' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.106 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.653s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.141 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.145 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:22:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3164631048' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.632 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.636 221324 DEBUG nova.virt.libvirt.vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:22:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-147995892',display_name='tempest-tempest.common.compute-instance-147995892-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-147995892-1',id=113,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-rktr52i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-Multiple
CreateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:35Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=7d560595-056b-4b0e-a456-a73fcc457613,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.637 221324 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.638 221324 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:7b:8e,bridge_name='br-int',has_traffic_filtering=True,id=edff8c3b-f880-4e97-ac56-580e4793ca67,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedff8c3b-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.640 221324 DEBUG nova.objects.instance [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7d560595-056b-4b0e-a456-a73fcc457613 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.688 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <uuid>7d560595-056b-4b0e-a456-a73fcc457613</uuid>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <name>instance-00000071</name>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <nova:name>tempest-tempest.common.compute-instance-147995892-1</nova:name>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:22:46</nova:creationTime>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:user uuid="c086a82bd0384612a78981006889df41">tempest-MultipleCreateTestJSON-174245429-project-member</nova:user>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:project uuid="96de645f38844180b404d1a7cf7dd460">tempest-MultipleCreateTestJSON-174245429</nova:project>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <nova:port uuid="edff8c3b-f880-4e97-ac56-580e4793ca67">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <entry name="serial">7d560595-056b-4b0e-a456-a73fcc457613</entry>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <entry name="uuid">7d560595-056b-4b0e-a456-a73fcc457613</entry>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/7d560595-056b-4b0e-a456-a73fcc457613_disk">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/7d560595-056b-4b0e-a456-a73fcc457613_disk.config">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:a0:7b:8e"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <target dev="tapedff8c3b-f8"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/console.log" append="off"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:22:47 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:22:47 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:22:47 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:22:47 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.689 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Preparing to wait for external event network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.689 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.689 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.690 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.690 221324 DEBUG nova.virt.libvirt.vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:22:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-147995892',display_name='tempest-tempest.common.compute-instance-147995892-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-147995892-1',id=113,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-rktr52i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempes
t-MultipleCreateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:22:35Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=7d560595-056b-4b0e-a456-a73fcc457613,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.691 221324 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.691 221324 DEBUG nova.network.os_vif_util [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:7b:8e,bridge_name='br-int',has_traffic_filtering=True,id=edff8c3b-f880-4e97-ac56-580e4793ca67,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedff8c3b-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.692 221324 DEBUG os_vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:7b:8e,bridge_name='br-int',has_traffic_filtering=True,id=edff8c3b-f880-4e97-ac56-580e4793ca67,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedff8c3b-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.692 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.693 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.694 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.703 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.704 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapedff8c3b-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.705 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapedff8c3b-f8, col_values=(('external_ids', {'iface-id': 'edff8c3b-f880-4e97-ac56-580e4793ca67', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:7b:8e', 'vm-uuid': '7d560595-056b-4b0e-a456-a73fcc457613'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.707 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:47 np0005603622 NetworkManager[49080]: <info>  [1769847767.7113] manager: (tapedff8c3b-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.712 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.716 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.717 221324 INFO os_vif [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:7b:8e,bridge_name='br-int',has_traffic_filtering=True,id=edff8c3b-f880-4e97-ac56-580e4793ca67,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedff8c3b-f8')#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.910 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.911 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.912 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No VIF found with MAC fa:16:3e:a0:7b:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.913 221324 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Using config drive#033[00m
Jan 31 03:22:47 np0005603622 nova_compute[221301]: 2026-01-31 08:22:47.989 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:48.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:49.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.171 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.202 221324 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Creating config drive at /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/disk.config#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.208 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpay2wn46l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.242 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.243 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.244 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.245 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.245 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.284 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.285 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.285 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.286 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.287 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.351 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpay2wn46l" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.387 221324 DEBUG nova.storage.rbd_utils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 7d560595-056b-4b0e-a456-a73fcc457613_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.393 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/disk.config 7d560595-056b-4b0e-a456-a73fcc457613_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.568 221324 DEBUG oslo_concurrency.processutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/disk.config 7d560595-056b-4b0e-a456-a73fcc457613_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.569 221324 INFO nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Deleting local config drive /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613/disk.config because it was imported into RBD.#033[00m
Jan 31 03:22:49 np0005603622 kernel: tapedff8c3b-f8: entered promiscuous mode
Jan 31 03:22:49 np0005603622 NetworkManager[49080]: <info>  [1769847769.6394] manager: (tapedff8c3b-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Jan 31 03:22:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:49Z|00493|binding|INFO|Claiming lport edff8c3b-f880-4e97-ac56-580e4793ca67 for this chassis.
Jan 31 03:22:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:49Z|00494|binding|INFO|edff8c3b-f880-4e97-ac56-580e4793ca67: Claiming fa:16:3e:a0:7b:8e 10.100.0.14
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.644 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:49Z|00495|binding|INFO|Setting lport edff8c3b-f880-4e97-ac56-580e4793ca67 ovn-installed in OVS
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:49 np0005603622 systemd-machined[190406]: New machine qemu-49-instance-00000071.
Jan 31 03:22:49 np0005603622 systemd-udevd[262724]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:22:49 np0005603622 systemd[1]: Started Virtual Machine qemu-49-instance-00000071.
Jan 31 03:22:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:49Z|00496|binding|INFO|Setting lport edff8c3b-f880-4e97-ac56-580e4793ca67 up in Southbound
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.695 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:7b:8e 10.100.0.14'], port_security=['fa:16:3e:a0:7b:8e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7d560595-056b-4b0e-a456-a73fcc457613', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=edff8c3b-f880-4e97-ac56-580e4793ca67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.698 139793 INFO neutron.agent.ovn.metadata.agent [-] Port edff8c3b-f880-4e97-ac56-580e4793ca67 in datapath bd2feb18-e01d-4084-b50c-13511157dde4 bound to our chassis#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.703 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd2feb18-e01d-4084-b50c-13511157dde4#033[00m
Jan 31 03:22:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:22:49 np0005603622 NetworkManager[49080]: <info>  [1769847769.7060] device (tapedff8c3b-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:22:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1169984765' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:22:49 np0005603622 NetworkManager[49080]: <info>  [1769847769.7091] device (tapedff8c3b-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.727 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[292a79ab-b984-445d-a3c5-d5c199d5b50b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.730 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd2feb18-e1 in ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.734 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd2feb18-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.734 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.734 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eac8e8c9-0cf4-47de-bb46-55c0bb2aa421]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.737 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8526649-9735-43be-ae7e-bc4c26b03836]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.752 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[03642158-a762-4117-9cec-63d5b517359e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.768 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ee8ec906-4970-4c13-a0f6-9e810531719a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.815 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bce89c0c-8505-4bea-8b84-9cd9f09c2915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 systemd-udevd[262727]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.822 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c06f501b-e9cf-4aa0-8312-9f1223f1f819]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 NetworkManager[49080]: <info>  [1769847769.8255] manager: (tapbd2feb18-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/231)
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.852 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a09774-92e3-435f-aaeb-1108fb7ec39b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.857 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f309ceea-1c08-46e5-8d01-b1de52351a8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 NetworkManager[49080]: <info>  [1769847769.8818] device (tapbd2feb18-e0): carrier: link connected
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.897 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.897 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.893 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[10005141-5a07-476a-9f82-a32d50b0421e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.906 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000071 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:22:49 np0005603622 nova_compute[221301]: 2026-01-31 08:22:49.907 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000071 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.914 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e19ffc-bc03-4b12-9209-ab8158df5d6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 702122, 'reachable_time': 30248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262759, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.937 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[631ec705-e068-4e89-883b-64ffe711b4b5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:e8ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 702122, 'tstamp': 702122}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262760, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.956 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e6937210-4dbd-425b-8057-28cad48ed957]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 702122, 'reachable_time': 30248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262761, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:49.990 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6d94af-e376-40e9-bfb2-6f4b78bee9ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.041 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8a503d-f350-4c20-a373-98306cac7141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.043 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.043 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.043 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd2feb18-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:50 np0005603622 kernel: tapbd2feb18-e0: entered promiscuous mode
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.045 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:50 np0005603622 NetworkManager[49080]: <info>  [1769847770.0474] manager: (tapbd2feb18-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.051 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd2feb18-e0, col_values=(('external_ids', {'iface-id': '7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:50Z|00497|binding|INFO|Releasing lport 7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b from this chassis (sb_readonly=0)
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.055 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.058 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8e61af-2a6a-4c31-a1af-33775c65155f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.059 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:22:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:50.060 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'env', 'PROCESS_TAG=haproxy-bd2feb18-e01d-4084-b50c-13511157dde4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd2feb18-e01d-4084-b50c-13511157dde4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.136 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.138 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4338MB free_disk=20.83382797241211GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.138 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.138 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.300 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847770.2993119, 7d560595-056b-4b0e-a456-a73fcc457613 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.301 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] VM Started (Lifecycle Event)#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.349 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.351 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance d0c13002-57d9-4fad-8579-7343af29719d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.351 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 7d560595-056b-4b0e-a456-a73fcc457613 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.352 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.352 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.361 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847770.3019001, 7d560595-056b-4b0e-a456-a73fcc457613 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.362 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:22:50 np0005603622 podman[262835]: 2026-01-31 08:22:50.416982608 +0000 UTC m=+0.062773167 container create 267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.422 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:22:50 np0005603622 systemd[1]: Started libpod-conmon-267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79.scope.
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.472 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:50 np0005603622 podman[262835]: 2026-01-31 08:22:50.380231745 +0000 UTC m=+0.026022304 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.479 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:22:50 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:22:50 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f0a6f6de277d605b6eddb1c1cc40f616014f6e73f176c8ce269ec7d51197383/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:22:50 np0005603622 podman[262835]: 2026-01-31 08:22:50.510749 +0000 UTC m=+0.156539579 container init 267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:22:50 np0005603622 podman[262835]: 2026-01-31 08:22:50.516930407 +0000 UTC m=+0.162720966 container start 267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:22:50 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [NOTICE]   (262855) : New worker (262858) forked
Jan 31 03:22:50 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [NOTICE]   (262855) : Loading success.
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.698 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:22:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:50.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:22:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/887492209' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.904 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.912 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:22:50 np0005603622 nova_compute[221301]: 2026-01-31 08:22:50.985 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:22:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:51.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.226 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.226 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.276 221324 DEBUG nova.network.neutron [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Updated VIF entry in instance network info cache for port edff8c3b-f880-4e97-ac56-580e4793ca67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.277 221324 DEBUG nova.network.neutron [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Updating instance_info_cache with network_info: [{"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.602 221324 DEBUG oslo_concurrency.lockutils [req-52d7cb4a-8a01-4a8f-9c7c-51c699ee8508 req-fdf14b41-8caa-49a2-9262-df60cb8b1dee fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7d560595-056b-4b0e-a456-a73fcc457613" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.627 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.642 221324 DEBUG nova.compute.manager [req-99ba5666-2d18-43cb-bd4b-b37e19b66b3a req-f8be932f-e8cf-4575-bf1f-f04a6f7cbc64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received event network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.643 221324 DEBUG oslo_concurrency.lockutils [req-99ba5666-2d18-43cb-bd4b-b37e19b66b3a req-f8be932f-e8cf-4575-bf1f-f04a6f7cbc64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.644 221324 DEBUG oslo_concurrency.lockutils [req-99ba5666-2d18-43cb-bd4b-b37e19b66b3a req-f8be932f-e8cf-4575-bf1f-f04a6f7cbc64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.644 221324 DEBUG oslo_concurrency.lockutils [req-99ba5666-2d18-43cb-bd4b-b37e19b66b3a req-f8be932f-e8cf-4575-bf1f-f04a6f7cbc64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.644 221324 DEBUG nova.compute.manager [req-99ba5666-2d18-43cb-bd4b-b37e19b66b3a req-f8be932f-e8cf-4575-bf1f-f04a6f7cbc64 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Processing event network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.645 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.652 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847771.6522243, 7d560595-056b-4b0e-a456-a73fcc457613 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.653 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.657 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.661 221324 INFO nova.virt.libvirt.driver [-] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Instance spawned successfully.#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.662 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.711 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.720 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.725 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.727 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.727 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.728 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.728 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.729 221324 DEBUG nova.virt.libvirt.driver [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.771 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.865 221324 INFO nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Took 15.39 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.865 221324 DEBUG nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:22:51 np0005603622 nova_compute[221301]: 2026-01-31 08:22:51.989 221324 INFO nova.compute.manager [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Took 18.92 seconds to build instance.#033[00m
Jan 31 03:22:52 np0005603622 nova_compute[221301]: 2026-01-31 08:22:52.041 221324 DEBUG oslo_concurrency.lockutils [None req-b0df54dc-2823-4479-847d-323ab337b5d0 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:52 np0005603622 nova_compute[221301]: 2026-01-31 08:22:52.219 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:52 np0005603622 nova_compute[221301]: 2026-01-31 08:22:52.220 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:22:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:52.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:52 np0005603622 nova_compute[221301]: 2026-01-31 08:22:52.707 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:53.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:53 np0005603622 nova_compute[221301]: 2026-01-31 08:22:53.803 221324 DEBUG nova.compute.manager [req-16f899e4-63d5-492e-9769-b9d57c4d4d8b req-4f252922-9f17-45ae-b24c-964ddfb489b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received event network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:53 np0005603622 nova_compute[221301]: 2026-01-31 08:22:53.804 221324 DEBUG oslo_concurrency.lockutils [req-16f899e4-63d5-492e-9769-b9d57c4d4d8b req-4f252922-9f17-45ae-b24c-964ddfb489b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:53 np0005603622 nova_compute[221301]: 2026-01-31 08:22:53.804 221324 DEBUG oslo_concurrency.lockutils [req-16f899e4-63d5-492e-9769-b9d57c4d4d8b req-4f252922-9f17-45ae-b24c-964ddfb489b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:53 np0005603622 nova_compute[221301]: 2026-01-31 08:22:53.805 221324 DEBUG oslo_concurrency.lockutils [req-16f899e4-63d5-492e-9769-b9d57c4d4d8b req-4f252922-9f17-45ae-b24c-964ddfb489b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:53 np0005603622 nova_compute[221301]: 2026-01-31 08:22:53.805 221324 DEBUG nova.compute.manager [req-16f899e4-63d5-492e-9769-b9d57c4d4d8b req-4f252922-9f17-45ae-b24c-964ddfb489b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] No waiting events found dispatching network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:22:53 np0005603622 nova_compute[221301]: 2026-01-31 08:22:53.805 221324 WARNING nova.compute.manager [req-16f899e4-63d5-492e-9769-b9d57c4d4d8b req-4f252922-9f17-45ae-b24c-964ddfb489b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received unexpected event network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:22:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:54.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:55.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:56 np0005603622 nova_compute[221301]: 2026-01-31 08:22:56.631 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:56.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:57.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.338 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.339 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.340 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.341 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.341 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.343 221324 INFO nova.compute.manager [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Terminating instance#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.345 221324 DEBUG nova.compute.manager [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:22:57 np0005603622 kernel: tapedff8c3b-f8 (unregistering): left promiscuous mode
Jan 31 03:22:57 np0005603622 NetworkManager[49080]: <info>  [1769847777.4191] device (tapedff8c3b-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:22:57 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:57Z|00498|binding|INFO|Releasing lport edff8c3b-f880-4e97-ac56-580e4793ca67 from this chassis (sb_readonly=0)
Jan 31 03:22:57 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:57Z|00499|binding|INFO|Setting lport edff8c3b-f880-4e97-ac56-580e4793ca67 down in Southbound
Jan 31 03:22:57 np0005603622 ovn_controller[130109]: 2026-01-31T08:22:57Z|00500|binding|INFO|Removing iface tapedff8c3b-f8 ovn-installed in OVS
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.473 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:7b:8e 10.100.0.14'], port_security=['fa:16:3e:a0:7b:8e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7d560595-056b-4b0e-a456-a73fcc457613', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=edff8c3b-f880-4e97-ac56-580e4793ca67) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.476 139793 INFO neutron.agent.ovn.metadata.agent [-] Port edff8c3b-f880-4e97-ac56-580e4793ca67 in datapath bd2feb18-e01d-4084-b50c-13511157dde4 unbound from our chassis#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.478 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd2feb18-e01d-4084-b50c-13511157dde4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.480 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d2b22a-ff33-4d5a-adaf-2bd600a22d64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.480 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace which is not needed anymore#033[00m
Jan 31 03:22:57 np0005603622 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000071.scope: Deactivated successfully.
Jan 31 03:22:57 np0005603622 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000071.scope: Consumed 6.361s CPU time.
Jan 31 03:22:57 np0005603622 systemd-machined[190406]: Machine qemu-49-instance-00000071 terminated.
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.588 221324 INFO nova.virt.libvirt.driver [-] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Instance destroyed successfully.#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.590 221324 DEBUG nova.objects.instance [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'resources' on Instance uuid 7d560595-056b-4b0e-a456-a73fcc457613 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.618 221324 DEBUG nova.virt.libvirt.vif [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:22:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-147995892',display_name='tempest-tempest.common.compute-instance-147995892-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-147995892-1',id=113,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:22:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-rktr52i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-MultipleCreateTestJSON-174245429-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:22:51Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=7d560595-056b-4b0e-a456-a73fcc457613,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.620 221324 DEBUG nova.network.os_vif_util [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "edff8c3b-f880-4e97-ac56-580e4793ca67", "address": "fa:16:3e:a0:7b:8e", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedff8c3b-f8", "ovs_interfaceid": "edff8c3b-f880-4e97-ac56-580e4793ca67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.621 221324 DEBUG nova.network.os_vif_util [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:7b:8e,bridge_name='br-int',has_traffic_filtering=True,id=edff8c3b-f880-4e97-ac56-580e4793ca67,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedff8c3b-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.622 221324 DEBUG os_vif [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:7b:8e,bridge_name='br-int',has_traffic_filtering=True,id=edff8c3b-f880-4e97-ac56-580e4793ca67,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedff8c3b-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.625 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.625 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapedff8c3b-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.633 221324 INFO os_vif [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:7b:8e,bridge_name='br-int',has_traffic_filtering=True,id=edff8c3b-f880-4e97-ac56-580e4793ca67,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedff8c3b-f8')#033[00m
Jan 31 03:22:57 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [NOTICE]   (262855) : haproxy version is 2.8.14-c23fe91
Jan 31 03:22:57 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [NOTICE]   (262855) : path to executable is /usr/sbin/haproxy
Jan 31 03:22:57 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [WARNING]  (262855) : Exiting Master process...
Jan 31 03:22:57 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [WARNING]  (262855) : Exiting Master process...
Jan 31 03:22:57 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [ALERT]    (262855) : Current worker (262858) exited with code 143 (Terminated)
Jan 31 03:22:57 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[262851]: [WARNING]  (262855) : All workers exited. Exiting... (0)
Jan 31 03:22:57 np0005603622 systemd[1]: libpod-267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79.scope: Deactivated successfully.
Jan 31 03:22:57 np0005603622 conmon[262851]: conmon 267b1c5458519c3c3334 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79.scope/container/memory.events
Jan 31 03:22:57 np0005603622 podman[262911]: 2026-01-31 08:22:57.651493689 +0000 UTC m=+0.067853625 container died 267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:22:57 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79-userdata-shm.mount: Deactivated successfully.
Jan 31 03:22:57 np0005603622 systemd[1]: var-lib-containers-storage-overlay-5f0a6f6de277d605b6eddb1c1cc40f616014f6e73f176c8ce269ec7d51197383-merged.mount: Deactivated successfully.
Jan 31 03:22:57 np0005603622 podman[262911]: 2026-01-31 08:22:57.714449941 +0000 UTC m=+0.130809857 container cleanup 267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:22:57 np0005603622 systemd[1]: libpod-conmon-267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79.scope: Deactivated successfully.
Jan 31 03:22:57 np0005603622 podman[262967]: 2026-01-31 08:22:57.817758558 +0000 UTC m=+0.081554903 container remove 267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.824 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[14dd5a1e-46fc-4356-b113-3df8d67ecdd4]: (4, ('Sat Jan 31 08:22:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79)\n267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79\nSat Jan 31 08:22:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79)\n267b1c5458519c3c333478b32d7da636a2725a6536c1ba947d3b37e6525eab79\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.827 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d8945167-362c-4155-afc2-a92685821bf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.829 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.832 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603622 kernel: tapbd2feb18-e0: left promiscuous mode
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.843 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.848 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9f1412d0-2632-4ba1-8f32-bb9da8858b6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.867 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[021fcb60-09cf-4ff4-8fc0-cd190f6c92d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.870 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0209d5-3b45-49cb-97b7-ecbb46d07d39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.888 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[08cfbadf-9b37-4c2f-b8bc-23e04984a6b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 702115, 'reachable_time': 43461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262982, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 systemd[1]: run-netns-ovnmeta\x2dbd2feb18\x2de01d\x2d4084\x2db50c\x2d13511157dde4.mount: Deactivated successfully.
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.892 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:22:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:22:57.893 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[ae495d59-be09-4511-bbe7-b97b39b1d00c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.907 221324 DEBUG nova.compute.manager [req-135680ca-56e4-4b89-94c3-4f843f4eef06 req-c2355ddd-c326-497e-b3a3-f2c4a5cadeca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received event network-vif-unplugged-edff8c3b-f880-4e97-ac56-580e4793ca67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.908 221324 DEBUG oslo_concurrency.lockutils [req-135680ca-56e4-4b89-94c3-4f843f4eef06 req-c2355ddd-c326-497e-b3a3-f2c4a5cadeca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.908 221324 DEBUG oslo_concurrency.lockutils [req-135680ca-56e4-4b89-94c3-4f843f4eef06 req-c2355ddd-c326-497e-b3a3-f2c4a5cadeca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.908 221324 DEBUG oslo_concurrency.lockutils [req-135680ca-56e4-4b89-94c3-4f843f4eef06 req-c2355ddd-c326-497e-b3a3-f2c4a5cadeca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.909 221324 DEBUG nova.compute.manager [req-135680ca-56e4-4b89-94c3-4f843f4eef06 req-c2355ddd-c326-497e-b3a3-f2c4a5cadeca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] No waiting events found dispatching network-vif-unplugged-edff8c3b-f880-4e97-ac56-580e4793ca67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:22:57 np0005603622 nova_compute[221301]: 2026-01-31 08:22:57.909 221324 DEBUG nova.compute.manager [req-135680ca-56e4-4b89-94c3-4f843f4eef06 req-c2355ddd-c326-497e-b3a3-f2c4a5cadeca fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received event network-vif-unplugged-edff8c3b-f880-4e97-ac56-580e4793ca67 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:22:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:22:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:22:58.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:22:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:22:58 np0005603622 nova_compute[221301]: 2026-01-31 08:22:58.742 221324 INFO nova.virt.libvirt.driver [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Deleting instance files /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613_del#033[00m
Jan 31 03:22:58 np0005603622 nova_compute[221301]: 2026-01-31 08:22:58.743 221324 INFO nova.virt.libvirt.driver [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Deletion of /var/lib/nova/instances/7d560595-056b-4b0e-a456-a73fcc457613_del complete#033[00m
Jan 31 03:22:58 np0005603622 nova_compute[221301]: 2026-01-31 08:22:58.885 221324 INFO nova.compute.manager [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Took 1.54 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:22:58 np0005603622 nova_compute[221301]: 2026-01-31 08:22:58.886 221324 DEBUG oslo.service.loopingcall [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:22:58 np0005603622 nova_compute[221301]: 2026-01-31 08:22:58.886 221324 DEBUG nova.compute.manager [-] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:22:58 np0005603622 nova_compute[221301]: 2026-01-31 08:22:58.886 221324 DEBUG nova.network.neutron [-] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:22:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:22:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:22:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:22:59.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:22:59 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 03:22:59 np0005603622 nova_compute[221301]: 2026-01-31 08:22:59.943 221324 DEBUG nova.network.neutron [-] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:22:59 np0005603622 nova_compute[221301]: 2026-01-31 08:22:59.980 221324 INFO nova.compute.manager [-] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Took 1.09 seconds to deallocate network for instance.#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.059 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.059 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.078 221324 DEBUG nova.compute.manager [req-2ebcd512-ccbc-4fd7-8fa3-e9ae5cb01436 req-0099fb82-9774-4d31-bf91-51a280163639 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received event network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.078 221324 DEBUG oslo_concurrency.lockutils [req-2ebcd512-ccbc-4fd7-8fa3-e9ae5cb01436 req-0099fb82-9774-4d31-bf91-51a280163639 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7d560595-056b-4b0e-a456-a73fcc457613-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.079 221324 DEBUG oslo_concurrency.lockutils [req-2ebcd512-ccbc-4fd7-8fa3-e9ae5cb01436 req-0099fb82-9774-4d31-bf91-51a280163639 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.079 221324 DEBUG oslo_concurrency.lockutils [req-2ebcd512-ccbc-4fd7-8fa3-e9ae5cb01436 req-0099fb82-9774-4d31-bf91-51a280163639 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.080 221324 DEBUG nova.compute.manager [req-2ebcd512-ccbc-4fd7-8fa3-e9ae5cb01436 req-0099fb82-9774-4d31-bf91-51a280163639 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] No waiting events found dispatching network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.080 221324 WARNING nova.compute.manager [req-2ebcd512-ccbc-4fd7-8fa3-e9ae5cb01436 req-0099fb82-9774-4d31-bf91-51a280163639 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received unexpected event network-vif-plugged-edff8c3b-f880-4e97-ac56-580e4793ca67 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.080 221324 DEBUG nova.compute.manager [req-2ebcd512-ccbc-4fd7-8fa3-e9ae5cb01436 req-0099fb82-9774-4d31-bf91-51a280163639 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Received event network-vif-deleted-edff8c3b-f880-4e97-ac56-580e4793ca67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.186 221324 DEBUG oslo_concurrency.processutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/654283632' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.640 221324 DEBUG oslo_concurrency.processutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.648 221324 DEBUG nova.compute.provider_tree [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.683 221324 DEBUG nova.scheduler.client.report [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:00.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.763 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.832 221324 INFO nova.scheduler.client.report [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Deleted allocations for instance 7d560595-056b-4b0e-a456-a73fcc457613#033[00m
Jan 31 03:23:00 np0005603622 nova_compute[221301]: 2026-01-31 08:23:00.970 221324 DEBUG oslo_concurrency.lockutils [None req-5d282b9e-36f2-4dd7-8203-c559f5b62ef4 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "7d560595-056b-4b0e-a456-a73fcc457613" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:01.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:01 np0005603622 nova_compute[221301]: 2026-01-31 08:23:01.633 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:02 np0005603622 nova_compute[221301]: 2026-01-31 08:23:02.641 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:02.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:03.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:04.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:05.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:06 np0005603622 nova_compute[221301]: 2026-01-31 08:23:06.635 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:06.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:07.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:07 np0005603622 nova_compute[221301]: 2026-01-31 08:23:07.645 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:08.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:09.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:10.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:11.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.184 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.185 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.236 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.463 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.464 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.472 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.473 221324 INFO nova.compute.claims [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.637 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:11 np0005603622 nova_compute[221301]: 2026-01-31 08:23:11.908 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.384 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.393 221324 DEBUG nova.compute.provider_tree [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.449 221324 DEBUG nova.scheduler.client.report [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.546 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.547 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.586 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847777.5852075, 7d560595-056b-4b0e-a456-a73fcc457613 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.586 221324 INFO nova.compute.manager [-] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] VM Stopped (Lifecycle Event)
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.651 221324 DEBUG nova.compute.manager [None req-4688ff24-f474-4875-a1e7-cd45a17fac62 - - - - - -] [instance: 7d560595-056b-4b0e-a456-a73fcc457613] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.722 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.723 221324 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:23:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:12.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.832 221324 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:23:12 np0005603622 nova_compute[221301]: 2026-01-31 08:23:12.878 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:23:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:13.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.179 221324 DEBUG nova.policy [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c086a82bd0384612a78981006889df41', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96de645f38844180b404d1a7cf7dd460', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.188 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.190 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.191 221324 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Creating image(s)
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.234 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.282 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.346 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.354 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.443 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.445 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.446 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.447 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.484 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:23:13 np0005603622 nova_compute[221301]: 2026-01-31 08:23:13.490 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 04eaec43-2346-4011-a231-cd69f33a0457_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:23:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:14.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:15.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:15 np0005603622 nova_compute[221301]: 2026-01-31 08:23:15.134 221324 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Successfully created port: 75e04f4b-0653-4f2c-8afb-7c8536d903d8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:23:16 np0005603622 nova_compute[221301]: 2026-01-31 08:23:16.642 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:23:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:16.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:17.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:17 np0005603622 podman[263124]: 2026-01-31 08:23:17.150870035 +0000 UTC m=+0.069798017 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 03:23:17 np0005603622 podman[263123]: 2026-01-31 08:23:17.206711166 +0000 UTC m=+0.124044405 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.309 221324 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Successfully updated port: 75e04f4b-0653-4f2c-8afb-7c8536d903d8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.387 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "refresh_cache-04eaec43-2346-4011-a231-cd69f33a0457" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.388 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquired lock "refresh_cache-04eaec43-2346-4011-a231-cd69f33a0457" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.389 221324 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.554 221324 DEBUG nova.compute.manager [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received event network-changed-75e04f4b-0653-4f2c-8afb-7c8536d903d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.555 221324 DEBUG nova.compute.manager [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Refreshing instance network info cache due to event network-changed-75e04f4b-0653-4f2c-8afb-7c8536d903d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.556 221324 DEBUG oslo_concurrency.lockutils [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-04eaec43-2346-4011-a231-cd69f33a0457" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:23:17 np0005603622 nova_compute[221301]: 2026-01-31 08:23:17.707 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:23:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e267 e267: 3 total, 3 up, 3 in
Jan 31 03:23:18 np0005603622 nova_compute[221301]: 2026-01-31 08:23:18.613 221324 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:23:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:18.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:19.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:20.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.019 221324 DEBUG nova.network.neutron [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Updating instance_info_cache with network_info: [{"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.099 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Releasing lock "refresh_cache-04eaec43-2346-4011-a231-cd69f33a0457" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.099 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Instance network_info: |[{"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.100 221324 DEBUG oslo_concurrency.lockutils [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-04eaec43-2346-4011-a231-cd69f33a0457" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.101 221324 DEBUG nova.network.neutron [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Refreshing network info cache for port 75e04f4b-0653-4f2c-8afb-7c8536d903d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:23:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:21.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.207 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 04eaec43-2346-4011-a231-cd69f33a0457_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 7.717s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.327 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] resizing rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:23:21 np0005603622 nova_compute[221301]: 2026-01-31 08:23:21.644 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.329 221324 DEBUG nova.objects.instance [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'migration_context' on Instance uuid 04eaec43-2346-4011-a231-cd69f33a0457 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.355 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.356 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Ensure instance console log exists: /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.357 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.357 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.358 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.362 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Start _get_guest_xml network_info=[{"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.368 221324 WARNING nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.378 221324 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.379 221324 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.384 221324 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.385 221324 DEBUG nova.virt.libvirt.host [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.387 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.387 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.388 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.388 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.388 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.389 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.389 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.389 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.390 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.390 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.390 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.390 221324 DEBUG nova.virt.hardware [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.394 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:22 np0005603622 nova_compute[221301]: 2026-01-31 08:23:22.710 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:22.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:23:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2700342929' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:23:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:23:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:23.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.141 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.746s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.178 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.186 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:23:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3833097749' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.648 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.652 221324 DEBUG nova.virt.libvirt.vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1654578525',display_name='tempest-MultipleCreateTestJSON-server-1654578525-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1654578525-1',id=116,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-o4skbs7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-MultipleCre
ateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:12Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=04eaec43-2346-4011-a231-cd69f33a0457,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.653 221324 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.654 221324 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:9e:47,bridge_name='br-int',has_traffic_filtering=True,id=75e04f4b-0653-4f2c-8afb-7c8536d903d8,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75e04f4b-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.657 221324 DEBUG nova.objects.instance [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'pci_devices' on Instance uuid 04eaec43-2346-4011-a231-cd69f33a0457 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.698 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <uuid>04eaec43-2346-4011-a231-cd69f33a0457</uuid>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <name>instance-00000074</name>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <nova:name>tempest-MultipleCreateTestJSON-server-1654578525-1</nova:name>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:23:22</nova:creationTime>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:user uuid="c086a82bd0384612a78981006889df41">tempest-MultipleCreateTestJSON-174245429-project-member</nova:user>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:project uuid="96de645f38844180b404d1a7cf7dd460">tempest-MultipleCreateTestJSON-174245429</nova:project>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <nova:port uuid="75e04f4b-0653-4f2c-8afb-7c8536d903d8">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <entry name="serial">04eaec43-2346-4011-a231-cd69f33a0457</entry>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <entry name="uuid">04eaec43-2346-4011-a231-cd69f33a0457</entry>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/04eaec43-2346-4011-a231-cd69f33a0457_disk">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/04eaec43-2346-4011-a231-cd69f33a0457_disk.config">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:cc:9e:47"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <target dev="tap75e04f4b-06"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/console.log" append="off"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:23:23 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:23:23 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:23:23 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:23:23 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.701 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Preparing to wait for external event network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.702 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.702 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.703 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.704 221324 DEBUG nova.virt.libvirt.vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1654578525',display_name='tempest-MultipleCreateTestJSON-server-1654578525-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1654578525-1',id=116,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-o4skbs7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-M
ultipleCreateTestJSON-174245429-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:12Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=04eaec43-2346-4011-a231-cd69f33a0457,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.705 221324 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.706 221324 DEBUG nova.network.os_vif_util [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:9e:47,bridge_name='br-int',has_traffic_filtering=True,id=75e04f4b-0653-4f2c-8afb-7c8536d903d8,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75e04f4b-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.707 221324 DEBUG os_vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:9e:47,bridge_name='br-int',has_traffic_filtering=True,id=75e04f4b-0653-4f2c-8afb-7c8536d903d8,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75e04f4b-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.708 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.709 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.710 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:23:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.724 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.724 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75e04f4b-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.726 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75e04f4b-06, col_values=(('external_ids', {'iface-id': '75e04f4b-0653-4f2c-8afb-7c8536d903d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:9e:47', 'vm-uuid': '04eaec43-2346-4011-a231-cd69f33a0457'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.761 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.765 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:23:23 np0005603622 NetworkManager[49080]: <info>  [1769847803.7653] manager: (tap75e04f4b-06): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.776 221324 INFO os_vif [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:9e:47,bridge_name='br-int',has_traffic_filtering=True,id=75e04f4b-0653-4f2c-8afb-7c8536d903d8,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75e04f4b-06')#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.934 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.937 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.938 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] No VIF found with MAC fa:16:3e:cc:9e:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:23:23 np0005603622 nova_compute[221301]: 2026-01-31 08:23:23.939 221324 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Using config drive#033[00m
Jan 31 03:23:24 np0005603622 nova_compute[221301]: 2026-01-31 08:23:24.162 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:24 np0005603622 nova_compute[221301]: 2026-01-31 08:23:24.686 221324 DEBUG nova.network.neutron [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Updated VIF entry in instance network info cache for port 75e04f4b-0653-4f2c-8afb-7c8536d903d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:23:24 np0005603622 nova_compute[221301]: 2026-01-31 08:23:24.687 221324 DEBUG nova.network.neutron [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Updating instance_info_cache with network_info: [{"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:24 np0005603622 nova_compute[221301]: 2026-01-31 08:23:24.743 221324 DEBUG oslo_concurrency.lockutils [req-f4dd8e29-be12-46af-96ae-ba50ffdf50b2 req-2131f98e-0d49-4e28-a421-8f7177b18425 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-04eaec43-2346-4011-a231-cd69f33a0457" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:23:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:24.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:23:24 np0005603622 nova_compute[221301]: 2026-01-31 08:23:24.911 221324 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Creating config drive at /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/disk.config#033[00m
Jan 31 03:23:24 np0005603622 nova_compute[221301]: 2026-01-31 08:23:24.915 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx8b5p2h_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:25 np0005603622 nova_compute[221301]: 2026-01-31 08:23:25.052 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpx8b5p2h_" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:25 np0005603622 nova_compute[221301]: 2026-01-31 08:23:25.086 221324 DEBUG nova.storage.rbd_utils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] rbd image 04eaec43-2346-4011-a231-cd69f33a0457_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:23:25 np0005603622 nova_compute[221301]: 2026-01-31 08:23:25.092 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/disk.config 04eaec43-2346-4011-a231-cd69f33a0457_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:25.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e268 e268: 3 total, 3 up, 3 in
Jan 31 03:23:25 np0005603622 nova_compute[221301]: 2026-01-31 08:23:25.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.364 221324 DEBUG oslo_concurrency.processutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/disk.config 04eaec43-2346-4011-a231-cd69f33a0457_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.272s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.366 221324 INFO nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Deleting local config drive /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457/disk.config because it was imported into RBD.#033[00m
Jan 31 03:23:26 np0005603622 kernel: tap75e04f4b-06: entered promiscuous mode
Jan 31 03:23:26 np0005603622 NetworkManager[49080]: <info>  [1769847806.4389] manager: (tap75e04f4b-06): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Jan 31 03:23:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:26Z|00501|binding|INFO|Claiming lport 75e04f4b-0653-4f2c-8afb-7c8536d903d8 for this chassis.
Jan 31 03:23:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:26Z|00502|binding|INFO|75e04f4b-0653-4f2c-8afb-7c8536d903d8: Claiming fa:16:3e:cc:9e:47 10.100.0.7
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:26Z|00503|binding|INFO|Setting lport 75e04f4b-0653-4f2c-8afb-7c8536d903d8 ovn-installed in OVS
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.451 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.454 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:26Z|00504|binding|INFO|Setting lport 75e04f4b-0653-4f2c-8afb-7c8536d903d8 up in Southbound
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.462 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:9e:47 10.100.0.7'], port_security=['fa:16:3e:cc:9e:47 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '04eaec43-2346-4011-a231-cd69f33a0457', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=75e04f4b-0653-4f2c-8afb-7c8536d903d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.465 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 75e04f4b-0653-4f2c-8afb-7c8536d903d8 in datapath bd2feb18-e01d-4084-b50c-13511157dde4 bound to our chassis#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.468 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd2feb18-e01d-4084-b50c-13511157dde4#033[00m
Jan 31 03:23:26 np0005603622 systemd-machined[190406]: New machine qemu-50-instance-00000074.
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.487 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4362deba-88e5-4802-8103-036c095d7112]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 systemd[1]: Started Virtual Machine qemu-50-instance-00000074.
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.489 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd2feb18-e1 in ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.494 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd2feb18-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.494 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c698f87e-1517-423c-b644-c9fe701b5775]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.496 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0b586d3b-08a6-4271-867c-8d68673184fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 systemd-udevd[263378]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.522 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[b12fc3ff-2f2f-4876-910c-18e5843a0260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 NetworkManager[49080]: <info>  [1769847806.5319] device (tap75e04f4b-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:23:26 np0005603622 NetworkManager[49080]: <info>  [1769847806.5340] device (tap75e04f4b-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.548 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d02c53cc-493b-4ad8-9b34-5454c46388ed]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.594 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b723298c-412a-42cc-a19f-f69f753ef2a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.602 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[912b1eb3-4421-4c0d-adb3-fbf96989047e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 NetworkManager[49080]: <info>  [1769847806.6041] manager: (tapbd2feb18-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/235)
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.641 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[26ac911e-52c5-453f-adc6-5a6897424b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.646 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.651 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[387fddaa-98e5-4233-95b0-3cd453090850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 NetworkManager[49080]: <info>  [1769847806.6734] device (tapbd2feb18-e0): carrier: link connected
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.679 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7da485-4256-4830-a3b1-1a7cbc63e416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.702 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3e175fee-fad3-425f-9c80-b4b93a467f32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705801, 'reachable_time': 29949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263408, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.721 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e3f2f0-1c05-4c37-8ef7-c5991d47a4db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:e8ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705801, 'tstamp': 705801}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263409, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.745 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b30ca2-49e8-4886-a9af-6870f887cebf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd2feb18-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:e8:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705801, 'reachable_time': 29949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 263410, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:26.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.764 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.767 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.786 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba8f923-5bc1-435e-8393-6551a8b4a145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.857 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c0f96b-1ef0-4dfb-8458-a04b63ad597c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.859 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.860 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.861 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd2feb18-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.864 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 NetworkManager[49080]: <info>  [1769847806.8655] manager: (tapbd2feb18-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 31 03:23:26 np0005603622 kernel: tapbd2feb18-e0: entered promiscuous mode
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.867 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.869 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd2feb18-e0, col_values=(('external_ids', {'iface-id': '7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.870 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:26Z|00505|binding|INFO|Releasing lport 7b95dd4c-16d2-4ff4-9598-b3ff910c3f1b from this chassis (sb_readonly=0)
Jan 31 03:23:26 np0005603622 nova_compute[221301]: 2026-01-31 08:23:26.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.880 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.881 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[01adba17-4840-42c5-93f4-d4b9d2f9d1bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.882 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/bd2feb18-e01d-4084-b50c-13511157dde4.pid.haproxy
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID bd2feb18-e01d-4084-b50c-13511157dde4
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:23:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:26.883 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'env', 'PROCESS_TAG=haproxy-bd2feb18-e01d-4084-b50c-13511157dde4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd2feb18-e01d-4084-b50c-13511157dde4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:23:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e269 e269: 3 total, 3 up, 3 in
Jan 31 03:23:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:27.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:27 np0005603622 podman[263449]: 2026-01-31 08:23:27.293824629 +0000 UTC m=+0.089757874 container create 7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:23:27 np0005603622 podman[263449]: 2026-01-31 08:23:27.240030033 +0000 UTC m=+0.035963298 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:23:27 np0005603622 systemd[1]: Started libpod-conmon-7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b.scope.
Jan 31 03:23:27 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:23:27 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bcd045494b5456c794fb8672f10200fd968c14949e3b85b2e067abe32b6205e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:27 np0005603622 podman[263449]: 2026-01-31 08:23:27.406617941 +0000 UTC m=+0.202551226 container init 7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 03:23:27 np0005603622 podman[263449]: 2026-01-31 08:23:27.415054307 +0000 UTC m=+0.210987592 container start 7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:23:27 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [NOTICE]   (263479) : New worker (263481) forked
Jan 31 03:23:27 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [NOTICE]   (263479) : Loading success.
Jan 31 03:23:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:27.490 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:23:27 np0005603622 nova_compute[221301]: 2026-01-31 08:23:27.967 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847807.9655843, 04eaec43-2346-4011-a231-cd69f33a0457 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:27 np0005603622 nova_compute[221301]: 2026-01-31 08:23:27.967 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] VM Started (Lifecycle Event)#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:27.999 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.009 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847807.9667015, 04eaec43-2346-4011-a231-cd69f33a0457 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.010 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.016 221324 DEBUG nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received event network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.016 221324 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.017 221324 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.017 221324 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.017 221324 DEBUG nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Processing event network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.018 221324 DEBUG nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received event network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.018 221324 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.018 221324 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.019 221324 DEBUG oslo_concurrency.lockutils [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.019 221324 DEBUG nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] No waiting events found dispatching network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.019 221324 WARNING nova.compute.manager [req-5f251805-c3a8-4dc2-a4b4-bdd98350ac5e req-aa99e2e2-3794-4dab-a1e8-f0d767686f9f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received unexpected event network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.020 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.027 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.032 221324 INFO nova.virt.libvirt.driver [-] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Instance spawned successfully.#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.033 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.080 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.086 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.087 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.087 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.088 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.088 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.088 221324 DEBUG nova.virt.libvirt.driver [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.092 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847808.025869, 04eaec43-2346-4011-a231-cd69f33a0457 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.093 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.136 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.141 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.175 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.181 221324 INFO nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Took 14.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.182 221324 DEBUG nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.287 221324 INFO nova.compute.manager [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Took 16.87 seconds to build instance.#033[00m
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.332 221324 DEBUG oslo_concurrency.lockutils [None req-c3ab3215-7a30-4b59-a29d-a1c865c28ed5 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:28.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:28 np0005603622 nova_compute[221301]: 2026-01-31 08:23:28.810 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:29.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:30.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:30.878 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:30.879 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:30.880 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:31.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.294 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.295 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.295 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.295 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.296 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.297 221324 INFO nova.compute.manager [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Terminating instance#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.298 221324 DEBUG nova.compute.manager [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.649 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603622 kernel: tap75e04f4b-06 (unregistering): left promiscuous mode
Jan 31 03:23:31 np0005603622 NetworkManager[49080]: <info>  [1769847811.8580] device (tap75e04f4b-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:23:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:31Z|00506|binding|INFO|Releasing lport 75e04f4b-0653-4f2c-8afb-7c8536d903d8 from this chassis (sb_readonly=0)
Jan 31 03:23:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:31Z|00507|binding|INFO|Setting lport 75e04f4b-0653-4f2c-8afb-7c8536d903d8 down in Southbound
Jan 31 03:23:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:31Z|00508|binding|INFO|Removing iface tap75e04f4b-06 ovn-installed in OVS
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.868 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.871 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:31.877 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:9e:47 10.100.0.7'], port_security=['fa:16:3e:cc:9e:47 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '04eaec43-2346-4011-a231-cd69f33a0457', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd2feb18-e01d-4084-b50c-13511157dde4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96de645f38844180b404d1a7cf7dd460', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6f7abf9c-ddb4-47da-9619-41273b5c7231', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6f1841d-a97e-4124-981b-627c1dc4d00d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=75e04f4b-0653-4f2c-8afb-7c8536d903d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:31.879 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 75e04f4b-0653-4f2c-8afb-7c8536d903d8 in datapath bd2feb18-e01d-4084-b50c-13511157dde4 unbound from our chassis#033[00m
Jan 31 03:23:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:31.882 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd2feb18-e01d-4084-b50c-13511157dde4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:23:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:31.883 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[342ef003-48b0-492f-885e-3aaae9145c2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:31.884 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 namespace which is not needed anymore#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603622 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000074.scope: Deactivated successfully.
Jan 31 03:23:31 np0005603622 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000074.scope: Consumed 4.139s CPU time.
Jan 31 03:23:31 np0005603622 systemd-machined[190406]: Machine qemu-50-instance-00000074 terminated.
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.946 221324 INFO nova.virt.libvirt.driver [-] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Instance destroyed successfully.#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.947 221324 DEBUG nova.objects.instance [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lazy-loading 'resources' on Instance uuid 04eaec43-2346-4011-a231-cd69f33a0457 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.961 221324 DEBUG nova.virt.libvirt.vif [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:23:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1654578525',display_name='tempest-MultipleCreateTestJSON-server-1654578525-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1654578525-1',id=116,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:23:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96de645f38844180b404d1a7cf7dd460',ramdisk_id='',reservation_id='r-o4skbs7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-174245429',owner_user_name='tempest-MultipleCreateTestJSON-174245429-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:23:28Z,user_data=None,user_id='c086a82bd0384612a78981006889df41',uuid=04eaec43-2346-4011-a231-cd69f33a0457,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.962 221324 DEBUG nova.network.os_vif_util [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converting VIF {"id": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "address": "fa:16:3e:cc:9e:47", "network": {"id": "bd2feb18-e01d-4084-b50c-13511157dde4", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1698880459-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96de645f38844180b404d1a7cf7dd460", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75e04f4b-06", "ovs_interfaceid": "75e04f4b-0653-4f2c-8afb-7c8536d903d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.963 221324 DEBUG nova.network.os_vif_util [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:9e:47,bridge_name='br-int',has_traffic_filtering=True,id=75e04f4b-0653-4f2c-8afb-7c8536d903d8,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75e04f4b-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.964 221324 DEBUG os_vif [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:9e:47,bridge_name='br-int',has_traffic_filtering=True,id=75e04f4b-0653-4f2c-8afb-7c8536d903d8,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75e04f4b-06') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.967 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.967 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75e04f4b-06, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.971 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.973 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:23:31 np0005603622 nova_compute[221301]: 2026-01-31 08:23:31.977 221324 INFO os_vif [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:9e:47,bridge_name='br-int',has_traffic_filtering=True,id=75e04f4b-0653-4f2c-8afb-7c8536d903d8,network=Network(bd2feb18-e01d-4084-b50c-13511157dde4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75e04f4b-06')#033[00m
Jan 31 03:23:32 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [NOTICE]   (263479) : haproxy version is 2.8.14-c23fe91
Jan 31 03:23:32 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [NOTICE]   (263479) : path to executable is /usr/sbin/haproxy
Jan 31 03:23:32 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [WARNING]  (263479) : Exiting Master process...
Jan 31 03:23:32 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [WARNING]  (263479) : Exiting Master process...
Jan 31 03:23:32 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [ALERT]    (263479) : Current worker (263481) exited with code 143 (Terminated)
Jan 31 03:23:32 np0005603622 neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4[263474]: [WARNING]  (263479) : All workers exited. Exiting... (0)
Jan 31 03:23:32 np0005603622 systemd[1]: libpod-7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b.scope: Deactivated successfully.
Jan 31 03:23:32 np0005603622 podman[263560]: 2026-01-31 08:23:32.076597231 +0000 UTC m=+0.069905441 container died 7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:23:32 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b-userdata-shm.mount: Deactivated successfully.
Jan 31 03:23:32 np0005603622 systemd[1]: var-lib-containers-storage-overlay-9bcd045494b5456c794fb8672f10200fd968c14949e3b85b2e067abe32b6205e-merged.mount: Deactivated successfully.
Jan 31 03:23:32 np0005603622 podman[263560]: 2026-01-31 08:23:32.188875659 +0000 UTC m=+0.182183839 container cleanup 7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:23:32 np0005603622 systemd[1]: libpod-conmon-7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b.scope: Deactivated successfully.
Jan 31 03:23:32 np0005603622 podman[263596]: 2026-01-31 08:23:32.304223079 +0000 UTC m=+0.091256294 container remove 7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.309 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e35aec-a71d-49ef-a624-134a7f598cc0]: (4, ('Sat Jan 31 08:23:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b)\n7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b\nSat Jan 31 08:23:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 (7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b)\n7c9a9dccbfff4bf112333d3eeb21de0e3be444ed5c983e7701a3b4201f1ca53b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.312 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0015ebc1-36c9-4858-920b-b6fc959a7e84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.313 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd2feb18-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:32 np0005603622 nova_compute[221301]: 2026-01-31 08:23:32.343 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:32 np0005603622 kernel: tapbd2feb18-e0: left promiscuous mode
Jan 31 03:23:32 np0005603622 nova_compute[221301]: 2026-01-31 08:23:32.354 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.358 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a01ce27-a679-4a8c-bcc7-0b6d7967a059]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.376 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0ffe6e-354c-4774-9ea2-783217c8a8b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.378 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b9da8ace-09e5-439f-8781-b3d90b43e060]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.395 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d820c4-3d3a-4f74-9185-50e9593d45cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705792, 'reachable_time': 39761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263610, 'error': None, 'target': 'ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.398 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd2feb18-e01d-4084-b50c-13511157dde4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:23:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:32.398 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[33d74f46-7210-4009-8541-7ee4bf348490]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:32 np0005603622 systemd[1]: run-netns-ovnmeta\x2dbd2feb18\x2de01d\x2d4084\x2db50c\x2d13511157dde4.mount: Deactivated successfully.
Jan 31 03:23:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.004000108s ======
Jan 31 03:23:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:32.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000108s
Jan 31 03:23:32 np0005603622 nova_compute[221301]: 2026-01-31 08:23:32.914 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:33.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:33 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 31 03:23:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:33.493 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:33 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 31 03:23:33 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 31 03:23:34 np0005603622 nova_compute[221301]: 2026-01-31 08:23:34.235 221324 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:34 np0005603622 nova_compute[221301]: 2026-01-31 08:23:34.236 221324 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:34 np0005603622 nova_compute[221301]: 2026-01-31 08:23:34.236 221324 DEBUG nova.network.neutron [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:23:34 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 31 03:23:34 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 31 03:23:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:34.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:35.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e270 e270: 3 total, 3 up, 3 in
Jan 31 03:23:36 np0005603622 nova_compute[221301]: 2026-01-31 08:23:36.310 221324 DEBUG nova.network.neutron [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:36 np0005603622 nova_compute[221301]: 2026-01-31 08:23:36.349 221324 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:36 np0005603622 nova_compute[221301]: 2026-01-31 08:23:36.520 221324 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:23:36 np0005603622 nova_compute[221301]: 2026-01-31 08:23:36.521 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Creating file /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/1634625eab114797a9970d9da3eaf229.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:23:36 np0005603622 nova_compute[221301]: 2026-01-31 08:23:36.521 221324 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/1634625eab114797a9970d9da3eaf229.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:36 np0005603622 nova_compute[221301]: 2026-01-31 08:23:36.652 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:36.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:36 np0005603622 nova_compute[221301]: 2026-01-31 08:23:36.970 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.039 221324 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/1634625eab114797a9970d9da3eaf229.tmp" returned: 1 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.041 221324 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d/1634625eab114797a9970d9da3eaf229.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.041 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Creating directory /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.042 221324 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:37.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.248 221324 DEBUG oslo_concurrency.processutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/d0c13002-57d9-4fad-8579-7343af29719d" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.254 221324 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.315 221324 DEBUG nova.compute.manager [req-500485a3-4c06-4ade-bd6c-68f2a83fd792 req-719c768a-221f-40c1-a111-8eca99ecccc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received event network-vif-unplugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.315 221324 DEBUG oslo_concurrency.lockutils [req-500485a3-4c06-4ade-bd6c-68f2a83fd792 req-719c768a-221f-40c1-a111-8eca99ecccc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.316 221324 DEBUG oslo_concurrency.lockutils [req-500485a3-4c06-4ade-bd6c-68f2a83fd792 req-719c768a-221f-40c1-a111-8eca99ecccc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.316 221324 DEBUG oslo_concurrency.lockutils [req-500485a3-4c06-4ade-bd6c-68f2a83fd792 req-719c768a-221f-40c1-a111-8eca99ecccc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.316 221324 DEBUG nova.compute.manager [req-500485a3-4c06-4ade-bd6c-68f2a83fd792 req-719c768a-221f-40c1-a111-8eca99ecccc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] No waiting events found dispatching network-vif-unplugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.316 221324 DEBUG nova.compute.manager [req-500485a3-4c06-4ade-bd6c-68f2a83fd792 req-719c768a-221f-40c1-a111-8eca99ecccc6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received event network-vif-unplugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:23:37 np0005603622 nova_compute[221301]: 2026-01-31 08:23:37.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:37 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.217 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.217 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.245 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.377 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.378 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.392 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.393 221324 INFO nova.compute.claims [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:23:38 np0005603622 nova_compute[221301]: 2026-01-31 08:23:38.632 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:38.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:38 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:23:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3114816272' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.079 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.089 221324 DEBUG nova.compute.provider_tree [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.132 221324 DEBUG nova.scheduler.client.report [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:39.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.172 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.174 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.273 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.274 221324 DEBUG nova.network.neutron [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.330 221324 INFO nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.362 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.453 221324 INFO nova.virt.block_device [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Booting with volume-backed-image 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16 at /dev/vda#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.532 221324 DEBUG nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received event network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.533 221324 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "04eaec43-2346-4011-a231-cd69f33a0457-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.534 221324 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.534 221324 DEBUG oslo_concurrency.lockutils [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.535 221324 DEBUG nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] No waiting events found dispatching network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.535 221324 WARNING nova.compute.manager [req-0d14e95c-1498-4a53-80a0-d442c4ca5bb0 req-0c4749fc-2c00-49ac-90ca-8ae9cd1ecd9c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received unexpected event network-vif-plugged-75e04f4b-0653-4f2c-8afb-7c8536d903d8 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:23:39 np0005603622 podman[263910]: 2026-01-31 08:23:39.792021282 +0000 UTC m=+0.057386044 container create 4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_noether, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 03:23:39 np0005603622 podman[263910]: 2026-01-31 08:23:39.763807993 +0000 UTC m=+0.029172845 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:23:39 np0005603622 systemd[1]: Started libpod-conmon-4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158.scope.
Jan 31 03:23:39 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:23:39 np0005603622 podman[263910]: 2026-01-31 08:23:39.90579689 +0000 UTC m=+0.171161742 container init 4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_noether, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True)
Jan 31 03:23:39 np0005603622 podman[263910]: 2026-01-31 08:23:39.915016778 +0000 UTC m=+0.180381580 container start 4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_noether, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS)
Jan 31 03:23:39 np0005603622 podman[263910]: 2026-01-31 08:23:39.921105242 +0000 UTC m=+0.186470044 container attach 4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_noether, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:23:39 np0005603622 recursing_noether[263926]: 167 167
Jan 31 03:23:39 np0005603622 systemd[1]: libpod-4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158.scope: Deactivated successfully.
Jan 31 03:23:39 np0005603622 conmon[263926]: conmon 4e0ff13e2aab53aa4a42 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158.scope/container/memory.events
Jan 31 03:23:39 np0005603622 podman[263910]: 2026-01-31 08:23:39.923127756 +0000 UTC m=+0.188492548 container died 4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_noether, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 31 03:23:39 np0005603622 nova_compute[221301]: 2026-01-31 08:23:39.975 221324 DEBUG nova.policy [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f4d66dd0b7ff443cbcdb6e2c9f5c4c8c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cf024d54545b4af882a87c721105742a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:23:39 np0005603622 systemd[1]: var-lib-containers-storage-overlay-5186615e5092186a395ee6fec84c922bf450c28b2444ac6e3942f0423b7ebb0f-merged.mount: Deactivated successfully.
Jan 31 03:23:40 np0005603622 podman[263910]: 2026-01-31 08:23:40.126683948 +0000 UTC m=+0.392048710 container remove 4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_noether, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 31 03:23:40 np0005603622 systemd[1]: libpod-conmon-4e0ff13e2aab53aa4a4205f21d30fec91530c2071eab0215299c6ced16685158.scope: Deactivated successfully.
Jan 31 03:23:40 np0005603622 podman[263950]: 2026-01-31 08:23:40.320877087 +0000 UTC m=+0.053703035 container create 06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_jennings, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 31 03:23:40 np0005603622 systemd[1]: Started libpod-conmon-06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba.scope.
Jan 31 03:23:40 np0005603622 podman[263950]: 2026-01-31 08:23:40.295490244 +0000 UTC m=+0.028316232 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:23:40 np0005603622 nova_compute[221301]: 2026-01-31 08:23:40.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:40 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:23:40 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d667061909368b723a656668e8d3f3ca234b9b14ec320447cbfa7b810dc1255c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:40 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d667061909368b723a656668e8d3f3ca234b9b14ec320447cbfa7b810dc1255c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:40 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d667061909368b723a656668e8d3f3ca234b9b14ec320447cbfa7b810dc1255c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:40 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d667061909368b723a656668e8d3f3ca234b9b14ec320447cbfa7b810dc1255c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 03:23:40 np0005603622 podman[263950]: 2026-01-31 08:23:40.434016738 +0000 UTC m=+0.166842716 container init 06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_jennings, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 03:23:40 np0005603622 podman[263950]: 2026-01-31 08:23:40.443821782 +0000 UTC m=+0.176647730 container start 06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_jennings, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 31 03:23:40 np0005603622 podman[263950]: 2026-01-31 08:23:40.449606197 +0000 UTC m=+0.182432175 container attach 06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_jennings, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 03:23:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:40.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:41.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:41 np0005603622 nova_compute[221301]: 2026-01-31 08:23:41.291 221324 INFO nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance shutdown successfully after 4 seconds.#033[00m
Jan 31 03:23:41 np0005603622 nova_compute[221301]: 2026-01-31 08:23:41.338 221324 DEBUG nova.network.neutron [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Successfully created port: 14f2f6f4-afdf-41af-8c0f-05467025eed7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:23:41 np0005603622 nova_compute[221301]: 2026-01-31 08:23:41.654 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]: [
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:    {
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "available": false,
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "ceph_device": false,
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "lsm_data": {},
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "lvs": [],
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "path": "/dev/sr0",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "rejected_reasons": [
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "Insufficient space (<5GB)",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "Has a FileSystem"
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        ],
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        "sys_api": {
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "actuators": null,
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "device_nodes": "sr0",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "devname": "sr0",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "human_readable_size": "482.00 KB",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "id_bus": "ata",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "model": "QEMU DVD-ROM",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "nr_requests": "2",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "parent": "/dev/sr0",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "partitions": {},
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "path": "/dev/sr0",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "removable": "1",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "rev": "2.5+",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "ro": "0",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "rotational": "1",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "sas_address": "",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "sas_device_handle": "",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "scheduler_mode": "mq-deadline",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "sectors": 0,
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "sectorsize": "2048",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "size": 493568.0,
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "support_discard": "2048",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "type": "disk",
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:            "vendor": "QEMU"
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:        }
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]:    }
Jan 31 03:23:41 np0005603622 agitated_jennings[263966]: ]
Jan 31 03:23:41 np0005603622 systemd[1]: libpod-06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba.scope: Deactivated successfully.
Jan 31 03:23:41 np0005603622 systemd[1]: libpod-06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba.scope: Consumed 1.274s CPU time.
Jan 31 03:23:41 np0005603622 conmon[263966]: conmon 06c1911f400bb038287f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba.scope/container/memory.events
Jan 31 03:23:41 np0005603622 podman[265189]: 2026-01-31 08:23:41.784487379 +0000 UTC m=+0.040713775 container died 06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_jennings, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 03:23:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay-d667061909368b723a656668e8d3f3ca234b9b14ec320447cbfa7b810dc1255c-merged.mount: Deactivated successfully.
Jan 31 03:23:41 np0005603622 podman[265189]: 2026-01-31 08:23:41.890003756 +0000 UTC m=+0.146230172 container remove 06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_jennings, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 31 03:23:41 np0005603622 systemd[1]: libpod-conmon-06c1911f400bb038287f638f593cd6776b4fafb8a9c582c2b407c61f7ef22eba.scope: Deactivated successfully.
Jan 31 03:23:41 np0005603622 nova_compute[221301]: 2026-01-31 08:23:41.972 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:42 np0005603622 kernel: tapcc59ad05-32 (unregistering): left promiscuous mode
Jan 31 03:23:42 np0005603622 NetworkManager[49080]: <info>  [1769847822.0125] device (tapcc59ad05-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.021 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:42Z|00509|binding|INFO|Releasing lport cc59ad05-3242-4d5f-8eec-a2480d285193 from this chassis (sb_readonly=0)
Jan 31 03:23:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:42Z|00510|binding|INFO|Setting lport cc59ad05-3242-4d5f-8eec-a2480d285193 down in Southbound
Jan 31 03:23:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:23:42Z|00511|binding|INFO|Removing iface tapcc59ad05-32 ovn-installed in OVS
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.025 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:42 np0005603622 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 31 03:23:42 np0005603622 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000068.scope: Consumed 20.172s CPU time.
Jan 31 03:23:42 np0005603622 systemd-machined[190406]: Machine qemu-48-instance-00000068 terminated.
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.136 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance destroyed successfully.#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.136 221324 INFO nova.virt.libvirt.driver [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Instance destroyed successfully.#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.138 221324 DEBUG nova.virt.libvirt.vif [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:19:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:23:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:b9:24:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.139 221324 DEBUG nova.network.os_vif_util [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:b9:24:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.140 221324 DEBUG nova.network.os_vif_util [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.141 221324 DEBUG os_vif [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.143 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.143 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc59ad05-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.145 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.147 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.153 221324 INFO os_vif [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.160 221324 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.161 221324 DEBUG nova.virt.libvirt.driver [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.399 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:23:42 np0005603622 nova_compute[221301]: 2026-01-31 08:23:42.399 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:23:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:42.785 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:24:4f 10.100.0.4'], port_security=['fa:16:3e:b9:24:4f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd0c13002-57d9-4fad-8579-7343af29719d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '10', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=cc59ad05-3242-4d5f-8eec-a2480d285193) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:23:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:42.787 139793 INFO neutron.agent.ovn.metadata.agent [-] Port cc59ad05-3242-4d5f-8eec-a2480d285193 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:23:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:42.789 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:23:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:42.791 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4f910f1c-8b9b-46be-bb1e-f72f88e9ee7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:42.791 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:23:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:42 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261884]: [NOTICE]   (261888) : haproxy version is 2.8.14-c23fe91
Jan 31 03:23:42 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261884]: [NOTICE]   (261888) : path to executable is /usr/sbin/haproxy
Jan 31 03:23:42 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261884]: [WARNING]  (261888) : Exiting Master process...
Jan 31 03:23:42 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261884]: [ALERT]    (261888) : Current worker (261890) exited with code 143 (Terminated)
Jan 31 03:23:42 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[261884]: [WARNING]  (261888) : All workers exited. Exiting... (0)
Jan 31 03:23:42 np0005603622 systemd[1]: libpod-9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce.scope: Deactivated successfully.
Jan 31 03:23:42 np0005603622 podman[265240]: 2026-01-31 08:23:42.976118331 +0000 UTC m=+0.071428521 container died 9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:23:43 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce-userdata-shm.mount: Deactivated successfully.
Jan 31 03:23:43 np0005603622 systemd[1]: var-lib-containers-storage-overlay-4ff984f57afd331ab59c545eaa389ae03fcaee328da66176fcb964ccbed5def2-merged.mount: Deactivated successfully.
Jan 31 03:23:43 np0005603622 podman[265240]: 2026-01-31 08:23:43.038522839 +0000 UTC m=+0.133832989 container cleanup 9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:23:43 np0005603622 systemd[1]: libpod-conmon-9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce.scope: Deactivated successfully.
Jan 31 03:23:43 np0005603622 podman[265271]: 2026-01-31 08:23:43.145886325 +0000 UTC m=+0.078043479 container remove 9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.152 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ab42136f-645d-4407-b69a-f28b17dc2d01]: (4, ('Sat Jan 31 08:23:42 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce)\n9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce\nSat Jan 31 08:23:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce)\n9e080349f20311e62ca207319449bce15c51c4c1f22ea05828827cda0db80fce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.156 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c46d74c5-7611-4a30-bde2-3a946ff25b79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.158 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.161 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:43 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.171 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:43.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.176 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d911ab50-685b-4b2a-bbe4-2599cc50fabf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.192 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[603feb71-849b-4204-8e0d-1729cb0eac6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.194 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2542ab-bd25-48fd-9e7f-97ae4a009d2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.219 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cc80f1be-0a36-4606-8b5a-5e7db3705f5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694054, 'reachable_time': 44165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265286, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:43 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.222 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:23:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:23:43.223 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d8cf72-119f-4841-ac7e-31bcb076b277]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.324 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.325 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.325 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.326 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.326 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.326 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:23:43 np0005603622 nova_compute[221301]: 2026-01-31 08:23:43.349 221324 DEBUG neutronclient.v2_0.client [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port cc59ad05-3242-4d5f-8eec-a2480d285193 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:23:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:44 np0005603622 nova_compute[221301]: 2026-01-31 08:23:44.558 221324 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:44 np0005603622 nova_compute[221301]: 2026-01-31 08:23:44.559 221324 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:44 np0005603622 nova_compute[221301]: 2026-01-31 08:23:44.560 221324 DEBUG oslo_concurrency.lockutils [None req-1a56dd6e-0568-401c-abc9-20db950ccb84 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:45.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.388 221324 DEBUG nova.compute.manager [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.388 221324 DEBUG oslo_concurrency.lockutils [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.390 221324 DEBUG oslo_concurrency.lockutils [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.390 221324 DEBUG oslo_concurrency.lockutils [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.391 221324 DEBUG nova.compute.manager [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.391 221324 WARNING nova.compute.manager [req-5f7604b3-7495-47f8-b6af-6a877f1e277e req-58546758-06cc-4e89-a3ec-96bc07040415 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-unplugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.557 221324 DEBUG nova.network.neutron [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Successfully updated port: 14f2f6f4-afdf-41af-8c0f-05467025eed7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.693 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.693 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquired lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.694 221324 DEBUG nova.network.neutron [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.845 221324 DEBUG nova.compute.manager [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-changed-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.845 221324 DEBUG nova.compute.manager [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Refreshing instance network info cache due to event network-changed-14f2f6f4-afdf-41af-8c0f-05467025eed7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:23:45 np0005603622 nova_compute[221301]: 2026-01-31 08:23:45.846 221324 DEBUG oslo_concurrency.lockutils [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:46 np0005603622 nova_compute[221301]: 2026-01-31 08:23:46.080 221324 DEBUG nova.network.neutron [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:23:46 np0005603622 nova_compute[221301]: 2026-01-31 08:23:46.657 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:46.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:46 np0005603622 nova_compute[221301]: 2026-01-31 08:23:46.944 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847811.9433496, 04eaec43-2346-4011-a231-cd69f33a0457 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:46 np0005603622 nova_compute[221301]: 2026-01-31 08:23:46.945 221324 INFO nova.compute.manager [-] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.002 221324 DEBUG nova.compute.manager [None req-93e58aef-33fc-4cc7-8932-ae0fd4ab6f2a - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.008 221324 DEBUG nova.compute.manager [None req-93e58aef-33fc-4cc7-8932-ae0fd4ab6f2a - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.068 221324 INFO nova.compute.manager [None req-93e58aef-33fc-4cc7-8932-ae0fd4ab6f2a - - - - - -] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.145 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:47.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.703 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.822 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.823 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.824 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.825 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.825 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.825 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.943 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.944 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.944 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.945 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.945 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.980 221324 DEBUG nova.compute.manager [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.981 221324 DEBUG oslo_concurrency.lockutils [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.981 221324 DEBUG oslo_concurrency.lockutils [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.982 221324 DEBUG oslo_concurrency.lockutils [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.983 221324 DEBUG nova.compute.manager [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:23:47 np0005603622 nova_compute[221301]: 2026-01-31 08:23:47.983 221324 WARNING nova.compute.manager [req-e310c75d-60f6-4276-af92-8288f9238ad8 req-3c77d365-f955-4457-9afc-27157b0a6f84 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.014 221324 DEBUG nova.compute.manager [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.015 221324 DEBUG nova.compute.manager [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing instance network info cache due to event network-changed-cc59ad05-3242-4d5f-8eec-a2480d285193. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.015 221324 DEBUG oslo_concurrency.lockutils [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.016 221324 DEBUG oslo_concurrency.lockutils [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.016 221324 DEBUG nova.network.neutron [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Refreshing network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:23:48 np0005603622 podman[265300]: 2026-01-31 08:23:48.171597276 +0000 UTC m=+0.077717250 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:23:48 np0005603622 podman[265290]: 2026-01-31 08:23:48.190631148 +0000 UTC m=+0.103681838 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.370 221324 DEBUG nova.network.neutron [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1411584311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.400 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.505 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Releasing lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.505 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance network_info: |[{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.506 221324 DEBUG oslo_concurrency.lockutils [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.507 221324 DEBUG nova.network.neutron [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Refreshing network info cache for port 14f2f6f4-afdf-41af-8c0f-05467025eed7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.625 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.626 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000068 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.631 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000074 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.632 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000074 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:23:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:48.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.858 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.859 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4510MB free_disk=20.851348876953125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.859 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.859 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.944 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration for instance d0c13002-57d9-4fad-8579-7343af29719d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.991 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating resource usage from migration b1852fe9-e291-46e2-a9d1-96408d08f71a#033[00m
Jan 31 03:23:48 np0005603622 nova_compute[221301]: 2026-01-31 08:23:48.992 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Starting to track outgoing migration b1852fe9-e291-46e2-a9d1-96408d08f71a with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.059 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 04eaec43-2346-4011-a231-cd69f33a0457 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.060 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration b1852fe9-e291-46e2-a9d1-96408d08f71a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.060 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4627608a-03b0-4a7e-bcb3-b709dade0334 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.061 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.061 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:23:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:49.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.285 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:23:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:23:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1661173494' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.817 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:23:49 np0005603622 nova_compute[221301]: 2026-01-31 08:23:49.825 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:23:50 np0005603622 nova_compute[221301]: 2026-01-31 08:23:50.220 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:23:50 np0005603622 nova_compute[221301]: 2026-01-31 08:23:50.351 221324 DEBUG nova.network.neutron [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updated VIF entry in instance network info cache for port cc59ad05-3242-4d5f-8eec-a2480d285193. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:23:50 np0005603622 nova_compute[221301]: 2026-01-31 08:23:50.352 221324 DEBUG nova.network.neutron [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:23:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:50.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:50 np0005603622 nova_compute[221301]: 2026-01-31 08:23:50.822 221324 DEBUG oslo_concurrency.lockutils [req-288bb1a3-fcea-47a1-ae63-36574ca1cad3 req-7845037e-e2db-4ed2-8254-ac4dc9c89a9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:50 np0005603622 nova_compute[221301]: 2026-01-31 08:23:50.875 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:23:50 np0005603622 nova_compute[221301]: 2026-01-31 08:23:50.876 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:23:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:51.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:51 np0005603622 nova_compute[221301]: 2026-01-31 08:23:51.300 221324 DEBUG nova.network.neutron [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updated VIF entry in instance network info cache for port 14f2f6f4-afdf-41af-8c0f-05467025eed7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:23:51 np0005603622 nova_compute[221301]: 2026-01-31 08:23:51.301 221324 DEBUG nova.network.neutron [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:23:51 np0005603622 nova_compute[221301]: 2026-01-31 08:23:51.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:51 np0005603622 nova_compute[221301]: 2026-01-31 08:23:51.870 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:23:52 np0005603622 nova_compute[221301]: 2026-01-31 08:23:52.198 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:23:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:23:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:52.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:53.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e271 e271: 3 total, 3 up, 3 in
Jan 31 03:23:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:23:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:54.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:23:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:55.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:55 np0005603622 nova_compute[221301]: 2026-01-31 08:23:55.881 221324 DEBUG oslo_concurrency.lockutils [req-4b0e5d98-9ba1-44b1-9bb2-5eac68c02bc4 req-7d39cf5b-3a98-4bca-a708-64787d69b1ef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:23:56 np0005603622 nova_compute[221301]: 2026-01-31 08:23:56.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:56.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:57 np0005603622 nova_compute[221301]: 2026-01-31 08:23:57.133 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847822.1322262, d0c13002-57d9-4fad-8579-7343af29719d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:23:57 np0005603622 nova_compute[221301]: 2026-01-31 08:23:57.134 221324 INFO nova.compute.manager [-] [instance: d0c13002-57d9-4fad-8579-7343af29719d] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:23:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:57.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:57 np0005603622 nova_compute[221301]: 2026-01-31 08:23:57.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:23:57 np0005603622 nova_compute[221301]: 2026-01-31 08:23:57.310 221324 DEBUG nova.compute.manager [None req-a4621e8e-07c6-4aac-b4a9-1397cdfe225d - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:23:57 np0005603622 nova_compute[221301]: 2026-01-31 08:23:57.314 221324 DEBUG nova.compute.manager [None req-a4621e8e-07c6-4aac-b4a9-1397cdfe225d - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:23:57 np0005603622 nova_compute[221301]: 2026-01-31 08:23:57.639 221324 INFO nova.compute.manager [None req-a4621e8e-07c6-4aac-b4a9-1397cdfe225d - - - - - -] [instance: d0c13002-57d9-4fad-8579-7343af29719d] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 03:23:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:23:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:23:58.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:23:58 np0005603622 nova_compute[221301]: 2026-01-31 08:23:58.877 221324 INFO nova.virt.libvirt.driver [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Deleting instance files /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457_del#033[00m
Jan 31 03:23:58 np0005603622 nova_compute[221301]: 2026-01-31 08:23:58.878 221324 INFO nova.virt.libvirt.driver [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Deletion of /var/lib/nova/instances/04eaec43-2346-4011-a231-cd69f33a0457_del complete#033[00m
Jan 31 03:23:59 np0005603622 nova_compute[221301]: 2026-01-31 08:23:59.146 221324 INFO nova.compute.manager [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Took 27.85 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:23:59 np0005603622 nova_compute[221301]: 2026-01-31 08:23:59.147 221324 DEBUG oslo.service.loopingcall [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:23:59 np0005603622 nova_compute[221301]: 2026-01-31 08:23:59.148 221324 DEBUG nova.compute.manager [-] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:23:59 np0005603622 nova_compute[221301]: 2026-01-31 08:23:59.148 221324 DEBUG nova.network.neutron [-] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:23:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:23:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:23:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:23:59.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:00.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:01.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:01 np0005603622 nova_compute[221301]: 2026-01-31 08:24:01.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:01 np0005603622 nova_compute[221301]: 2026-01-31 08:24:01.748 221324 DEBUG nova.network.neutron [-] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:01 np0005603622 nova_compute[221301]: 2026-01-31 08:24:01.803 221324 INFO nova.compute.manager [-] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Took 2.66 seconds to deallocate network for instance.#033[00m
Jan 31 03:24:01 np0005603622 nova_compute[221301]: 2026-01-31 08:24:01.884 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:01 np0005603622 nova_compute[221301]: 2026-01-31 08:24:01.885 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.038 221324 DEBUG oslo_concurrency.processutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.143 221324 DEBUG nova.compute.manager [req-be2364a6-a96e-4c97-9814-d94b16b45689 req-ba5d99ec-386e-4318-a308-5a54b272c3fd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 04eaec43-2346-4011-a231-cd69f33a0457] Received event network-vif-deleted-75e04f4b-0653-4f2c-8afb-7c8536d903d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.241 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:24:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4152756470' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.453 221324 DEBUG oslo_concurrency.processutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.459 221324 DEBUG nova.compute.provider_tree [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.608 221324 DEBUG nova.scheduler.client.report [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:24:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:02.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.908 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.979 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.980 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:02 np0005603622 nova_compute[221301]: 2026-01-31 08:24:02.981 221324 DEBUG nova.compute.manager [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Going to confirm migration 15 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 03:24:03 np0005603622 nova_compute[221301]: 2026-01-31 08:24:03.040 221324 INFO nova.scheduler.client.report [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Deleted allocations for instance 04eaec43-2346-4011-a231-cd69f33a0457#033[00m
Jan 31 03:24:03 np0005603622 nova_compute[221301]: 2026-01-31 08:24:03.191 221324 DEBUG oslo_concurrency.lockutils [None req-c330b128-ddc4-4819-8dac-1e078c40ff83 c086a82bd0384612a78981006889df41 96de645f38844180b404d1a7cf7dd460 - - default default] Lock "04eaec43-2346-4011-a231-cd69f33a0457" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 31.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:03.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:03 np0005603622 nova_compute[221301]: 2026-01-31 08:24:03.730 221324 DEBUG neutronclient.v2_0.client [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port cc59ad05-3242-4d5f-8eec-a2480d285193 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:24:03 np0005603622 nova_compute[221301]: 2026-01-31 08:24:03.731 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:24:03 np0005603622 nova_compute[221301]: 2026-01-31 08:24:03.731 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:24:03 np0005603622 nova_compute[221301]: 2026-01-31 08:24:03.731 221324 DEBUG nova.network.neutron [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:24:03 np0005603622 nova_compute[221301]: 2026-01-31 08:24:03.732 221324 DEBUG nova.objects.instance [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'info_cache' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:04.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:24:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:05.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:24:06 np0005603622 nova_compute[221301]: 2026-01-31 08:24:06.668 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:06.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:07.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:07 np0005603622 nova_compute[221301]: 2026-01-31 08:24:07.244 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.113 221324 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.114 221324 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.114 221324 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.115 221324 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.115 221324 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.116 221324 WARNING nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.116 221324 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.116 221324 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "d0c13002-57d9-4fad-8579-7343af29719d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.117 221324 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.117 221324 DEBUG oslo_concurrency.lockutils [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.118 221324 DEBUG nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] No waiting events found dispatching network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.118 221324 WARNING nova.compute.manager [req-aa9e5d68-8e24-43b6-a2af-4f355b0ed78d req-8f73334f-9be3-417f-a80f-0c2cd17762b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Received unexpected event network-vif-plugged-cc59ad05-3242-4d5f-8eec-a2480d285193 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:24:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:08.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:08 np0005603622 nova_compute[221301]: 2026-01-31 08:24:08.988 221324 DEBUG nova.network.neutron [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: d0c13002-57d9-4fad-8579-7343af29719d] Updating instance_info_cache with network_info: [{"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:09 np0005603622 nova_compute[221301]: 2026-01-31 08:24:09.092 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-d0c13002-57d9-4fad-8579-7343af29719d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:24:09 np0005603622 nova_compute[221301]: 2026-01-31 08:24:09.092 221324 DEBUG nova.objects.instance [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'migration_context' on Instance uuid d0c13002-57d9-4fad-8579-7343af29719d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:09.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:09 np0005603622 nova_compute[221301]: 2026-01-31 08:24:09.456 221324 DEBUG nova.storage.rbd_utils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] removing snapshot(nova-resize) on rbd image(d0c13002-57d9-4fad-8579-7343af29719d_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:24:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:10.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e272 e272: 3 total, 3 up, 3 in
Jan 31 03:24:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:11.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:11 np0005603622 nova_compute[221301]: 2026-01-31 08:24:11.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.169 221324 DEBUG nova.virt.libvirt.vif [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:19:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-922415262',display_name='tempest-ServerActionsTestJSON-server-922415262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-922415262',id=104,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:24:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-zpk5jwol',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:24:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=d0c13002-57d9-4fad-8579-7343af29719d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.170 221324 DEBUG nova.network.os_vif_util [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "cc59ad05-3242-4d5f-8eec-a2480d285193", "address": "fa:16:3e:b9:24:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc59ad05-32", "ovs_interfaceid": "cc59ad05-3242-4d5f-8eec-a2480d285193", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.171 221324 DEBUG nova.network.os_vif_util [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.171 221324 DEBUG os_vif [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.174 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc59ad05-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.174 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.177 221324 INFO os_vif [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:24:4f,bridge_name='br-int',has_traffic_filtering=True,id=cc59ad05-3242-4d5f-8eec-a2480d285193,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc59ad05-32')#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.177 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.178 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:24:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.316 221324 DEBUG oslo_concurrency.processutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:24:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4278262266' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.772 221324 DEBUG oslo_concurrency.processutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.781 221324 DEBUG nova.compute.provider_tree [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:24:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:12.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:12 np0005603622 nova_compute[221301]: 2026-01-31 08:24:12.901 221324 DEBUG nova.scheduler.client.report [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:24:13 np0005603622 nova_compute[221301]: 2026-01-31 08:24:13.159 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:13.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:13 np0005603622 nova_compute[221301]: 2026-01-31 08:24:13.361 221324 INFO nova.scheduler.client.report [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Deleted allocation for migration b1852fe9-e291-46e2-a9d1-96408d08f71a#033[00m
Jan 31 03:24:13 np0005603622 nova_compute[221301]: 2026-01-31 08:24:13.454 221324 DEBUG oslo_concurrency.lockutils [None req-a209c68e-6b32-43f2-9b14-3e4de7c1477b 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "d0c13002-57d9-4fad-8579-7343af29719d" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 10.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:13 np0005603622 nova_compute[221301]: 2026-01-31 08:24:13.558 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:13 np0005603622 nova_compute[221301]: 2026-01-31 08:24:13.639 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:14.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:15.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:16 np0005603622 nova_compute[221301]: 2026-01-31 08:24:16.672 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:16.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:17.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:17 np0005603622 nova_compute[221301]: 2026-01-31 08:24:17.249 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:18.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:19 np0005603622 podman[265509]: 2026-01-31 08:24:19.187984244 +0000 UTC m=+0.097307156 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:24:19 np0005603622 podman[265508]: 2026-01-31 08:24:19.220129338 +0000 UTC m=+0.128441063 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:24:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:19.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:20.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 e273: 3 total, 3 up, 3 in
Jan 31 03:24:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:21.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:21 np0005603622 nova_compute[221301]: 2026-01-31 08:24:21.675 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:22 np0005603622 nova_compute[221301]: 2026-01-31 08:24:22.311 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:24:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:22.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:24:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:23.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:24.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:25.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.456 221324 DEBUG os_brick.utils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.459 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.490 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.490 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[0da49a51-6d25-4f5a-b0a5-dc0d51eb2533]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.492 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.503 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.503 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[037e2a92-05aa-40af-a152-b59563f04eda]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.505 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.515 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.516 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[843ec57e-5a82-42c2-8bfe-a6088bf10fc1]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.517 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ee23a3-8852-4406-a8c6-9a11a170eba5]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.518 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.553 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "nvme version" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.558 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.558 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.559 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.560 221324 DEBUG os_brick.utils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] <== get_connector_properties: return (102ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.561 221324 DEBUG nova.virt.block_device [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating existing volume attachment record: 13da4cb4-931b-4f9d-98dc-452bf3204cfa _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:24:26 np0005603622 nova_compute[221301]: 2026-01-31 08:24:26.678 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:26.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:27.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:27 np0005603622 nova_compute[221301]: 2026-01-31 08:24:27.314 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:24:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1009447450' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:24:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:28.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:29.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:29 np0005603622 nova_compute[221301]: 2026-01-31 08:24:29.994 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:24:29 np0005603622 nova_compute[221301]: 2026-01-31 08:24:29.997 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:24:29 np0005603622 nova_compute[221301]: 2026-01-31 08:24:29.998 221324 INFO nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Creating image(s)#033[00m
Jan 31 03:24:29 np0005603622 nova_compute[221301]: 2026-01-31 08:24:29.999 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:29.999 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Ensure instance console log exists: /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.000 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.001 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.002 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.008 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Start _get_guest_xml network_info=[{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': 0, 'attachment_id': '13da4cb4-931b-4f9d-98dc-452bf3204cfa', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-b717b315-86df-4d2d-96bf-25aba2759093', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'b717b315-86df-4d2d-96bf-25aba2759093', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'attached_at': '', 'detached_at': '', 'volume_id': 'b717b315-86df-4d2d-96bf-25aba2759093', 'serial': 'b717b315-86df-4d2d-96bf-25aba2759093'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.019 221324 WARNING nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.029 221324 DEBUG nova.virt.libvirt.host [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.030 221324 DEBUG nova.virt.libvirt.host [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.040 221324 DEBUG nova.virt.libvirt.host [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.041 221324 DEBUG nova.virt.libvirt.host [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.042 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.042 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.043 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.043 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.043 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.044 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.044 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.044 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.044 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.044 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.045 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.045 221324 DEBUG nova.virt.hardware [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.088 221324 DEBUG nova.storage.rbd_utils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.094 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:24:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1996734654' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.550 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.600 221324 DEBUG nova.virt.libvirt.vif [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2039669481',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-2039669481',id=118,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cf024d54545b4af882a87c721105742a',ramdisk_id='',reservation_id='r-ohvqqfqv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-468517745',owner_user_name='tempest-ServerBootFromVolume
StableRescueTest-468517745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:39Z,user_data=None,user_id='f4d66dd0b7ff443cbcdb6e2c9f5c4c8c',uuid=4627608a-03b0-4a7e-bcb3-b709dade0334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.601 221324 DEBUG nova.network.os_vif_util [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converting VIF {"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.603 221324 DEBUG nova.network.os_vif_util [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.605 221324 DEBUG nova.objects.instance [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'pci_devices' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.640 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <uuid>4627608a-03b0-4a7e-bcb3-b709dade0334</uuid>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <name>instance-00000076</name>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-2039669481</nova:name>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:24:30</nova:creationTime>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:user uuid="f4d66dd0b7ff443cbcdb6e2c9f5c4c8c">tempest-ServerBootFromVolumeStableRescueTest-468517745-project-member</nova:user>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:project uuid="cf024d54545b4af882a87c721105742a">tempest-ServerBootFromVolumeStableRescueTest-468517745</nova:project>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <nova:port uuid="14f2f6f4-afdf-41af-8c0f-05467025eed7">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <entry name="serial">4627608a-03b0-4a7e-bcb3-b709dade0334</entry>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <entry name="uuid">4627608a-03b0-4a7e-bcb3-b709dade0334</entry>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-b717b315-86df-4d2d-96bf-25aba2759093">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <serial>b717b315-86df-4d2d-96bf-25aba2759093</serial>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:9b:35:86"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <target dev="tap14f2f6f4-af"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/console.log" append="off"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:24:30 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:24:30 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:24:30 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:24:30 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.642 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Preparing to wait for external event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.642 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.642 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.642 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.643 221324 DEBUG nova.virt.libvirt.vif [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:23:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2039669481',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-2039669481',id=118,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cf024d54545b4af882a87c721105742a',ramdisk_id='',reservation_id='r-ohvqqfqv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-468517745',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-468517745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:23:39Z,user_data=None,user_id='f4d66dd0b7ff443cbcdb6e2c9f5c4c8c',uuid=4627608a-03b0-4a7e-bcb3-b709dade0334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.643 221324 DEBUG nova.network.os_vif_util [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converting VIF {"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.644 221324 DEBUG nova.network.os_vif_util [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.644 221324 DEBUG os_vif [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.645 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.645 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.646 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.649 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.649 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14f2f6f4-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.650 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap14f2f6f4-af, col_values=(('external_ids', {'iface-id': '14f2f6f4-afdf-41af-8c0f-05467025eed7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:35:86', 'vm-uuid': '4627608a-03b0-4a7e-bcb3-b709dade0334'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:30 np0005603622 NetworkManager[49080]: <info>  [1769847870.6525] manager: (tap14f2f6f4-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.654 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.660 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.662 221324 INFO os_vif [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af')#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.822 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.823 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.823 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] No VIF found with MAC fa:16:3e:9b:35:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.824 221324 INFO nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Using config drive#033[00m
Jan 31 03:24:30 np0005603622 nova_compute[221301]: 2026-01-31 08:24:30.858 221324 DEBUG nova.storage.rbd_utils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:24:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:24:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:30.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:24:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:30.879 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:30.879 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:30.879 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:31.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:31 np0005603622 nova_compute[221301]: 2026-01-31 08:24:31.459 221324 INFO nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Creating config drive at /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config#033[00m
Jan 31 03:24:31 np0005603622 nova_compute[221301]: 2026-01-31 08:24:31.467 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxaraq_rp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:31.522 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:24:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:31.523 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:24:31 np0005603622 nova_compute[221301]: 2026-01-31 08:24:31.570 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:31 np0005603622 nova_compute[221301]: 2026-01-31 08:24:31.601 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxaraq_rp" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:31 np0005603622 nova_compute[221301]: 2026-01-31 08:24:31.645 221324 DEBUG nova.storage.rbd_utils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:24:31 np0005603622 nova_compute[221301]: 2026-01-31 08:24:31.650 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:31 np0005603622 nova_compute[221301]: 2026-01-31 08:24:31.679 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:32.526 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:32.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:33.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:34.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:24:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:35.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:24:35 np0005603622 nova_compute[221301]: 2026-01-31 08:24:35.621 221324 DEBUG oslo_concurrency.processutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.970s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:35 np0005603622 nova_compute[221301]: 2026-01-31 08:24:35.621 221324 INFO nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Deleting local config drive /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config because it was imported into RBD.#033[00m
Jan 31 03:24:35 np0005603622 nova_compute[221301]: 2026-01-31 08:24:35.651 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:35 np0005603622 kernel: tap14f2f6f4-af: entered promiscuous mode
Jan 31 03:24:35 np0005603622 NetworkManager[49080]: <info>  [1769847875.7307] manager: (tap14f2f6f4-af): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 31 03:24:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:35Z|00512|binding|INFO|Claiming lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 for this chassis.
Jan 31 03:24:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:35Z|00513|binding|INFO|14f2f6f4-afdf-41af-8c0f-05467025eed7: Claiming fa:16:3e:9b:35:86 10.100.0.13
Jan 31 03:24:35 np0005603622 nova_compute[221301]: 2026-01-31 08:24:35.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:35 np0005603622 systemd-udevd[265673]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:24:35 np0005603622 nova_compute[221301]: 2026-01-31 08:24:35.759 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:35Z|00514|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 ovn-installed in OVS
Jan 31 03:24:35 np0005603622 nova_compute[221301]: 2026-01-31 08:24:35.763 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:35 np0005603622 systemd-machined[190406]: New machine qemu-51-instance-00000076.
Jan 31 03:24:35 np0005603622 NetworkManager[49080]: <info>  [1769847875.7743] device (tap14f2f6f4-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:24:35 np0005603622 NetworkManager[49080]: <info>  [1769847875.7752] device (tap14f2f6f4-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:24:35 np0005603622 systemd[1]: Started Virtual Machine qemu-51-instance-00000076.
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.022 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:35:86 10.100.0.13'], port_security=['fa:16:3e:9b:35:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf024d54545b4af882a87c721105742a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '634aba40-50e3-4365-94d0-0773f42bafa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2028d9-0505-431d-85ea-94f27c9f5ff6, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=14f2f6f4-afdf-41af-8c0f-05467025eed7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.023 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 14f2f6f4-afdf-41af-8c0f-05467025eed7 in datapath 98be5db6-5633-4d23-b9a9-16382d8e99ab bound to our chassis#033[00m
Jan 31 03:24:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:36Z|00515|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 up in Southbound
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.025 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 98be5db6-5633-4d23-b9a9-16382d8e99ab#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.035 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7857b1-5fb8-42c7-b3a3-ffc4dba7bdce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.037 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap98be5db6-51 in ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.040 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap98be5db6-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.040 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8121a234-7ec0-40e0-a6e9-e02aa86a703d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.041 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[59f5f49d-e6a4-4530-a9bd-e5ea26003117]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.055 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[0f44ad46-5640-4f4d-b697-3951a47d8f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.070 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[89c9fb8c-0e1e-4bbc-bb44-377169c53d4a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.095 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[91c590df-9aee-4ef1-bdd4-55fdea4288f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.100 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5eba60d1-95a0-4f31-a04e-77f8f9a01aa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 NetworkManager[49080]: <info>  [1769847876.1018] manager: (tap98be5db6-50): new Veth device (/org/freedesktop/NetworkManager/Devices/239)
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.132 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0a34e6a0-666b-4746-8ec8-8fa297b1c802]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.135 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc79645-1fb0-4940-8311-7ec6dd24a406]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 NetworkManager[49080]: <info>  [1769847876.1513] device (tap98be5db6-50): carrier: link connected
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.153 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e4262314-ddab-44cf-9dcd-95e12f6fcf86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.166 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8f938384-e30c-4f31-8bb9-cad1855f23c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap98be5db6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 712748, 'reachable_time': 30277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265707, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.175 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0792689d-4397-4960-91ca-5145edea2f8f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:3a3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 712748, 'tstamp': 712748}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265708, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.190 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1490a20e-a46b-4557-bfbc-b77c689e1aaa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap98be5db6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 712748, 'reachable_time': 30277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265709, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.222 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3b186f8a-e122-4e07-9aa1-b133c8470f31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.282 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa9685d-6186-4d3d-91e8-1c0d22a3f21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.284 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98be5db6-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.284 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.285 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap98be5db6-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.287 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:36 np0005603622 kernel: tap98be5db6-50: entered promiscuous mode
Jan 31 03:24:36 np0005603622 NetworkManager[49080]: <info>  [1769847876.2890] manager: (tap98be5db6-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.292 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.293 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap98be5db6-50, col_values=(('external_ids', {'iface-id': 'dad27cfe-7e8a-4f55-a945-07f9cae848c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:36Z|00516|binding|INFO|Releasing lport dad27cfe-7e8a-4f55-a945-07f9cae848c1 from this chassis (sb_readonly=0)
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.296 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.296 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.297 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bcde44aa-47a1-46ae-aa6e-15b7b5ed4fdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.299 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-98be5db6-5633-4d23-b9a9-16382d8e99ab
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 98be5db6-5633-4d23-b9a9-16382d8e99ab
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:24:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:36.300 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'env', 'PROCESS_TAG=haproxy-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/98be5db6-5633-4d23-b9a9-16382d8e99ab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.301 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.685 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:36 np0005603622 podman[265759]: 2026-01-31 08:24:36.736155852 +0000 UTC m=+0.099707811 container create 7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:24:36 np0005603622 podman[265759]: 2026-01-31 08:24:36.677243818 +0000 UTC m=+0.040795817 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:24:36 np0005603622 systemd[1]: Started libpod-conmon-7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe.scope.
Jan 31 03:24:36 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:24:36 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fac9d5406ed867d86c780775ceeed54f8881fa5fc9f7cb0932beb1b16b79d09/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:24:36 np0005603622 podman[265759]: 2026-01-31 08:24:36.843191009 +0000 UTC m=+0.206742988 container init 7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:24:36 np0005603622 podman[265759]: 2026-01-31 08:24:36.847762322 +0000 UTC m=+0.211314271 container start 7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.849 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847876.8489347, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:36 np0005603622 nova_compute[221301]: 2026-01-31 08:24:36.850 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Started (Lifecycle Event)#033[00m
Jan 31 03:24:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:36.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:36 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[265797]: [NOTICE]   (265802) : New worker (265804) forked
Jan 31 03:24:36 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[265797]: [NOTICE]   (265802) : Loading success.
Jan 31 03:24:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:37.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.563 221324 DEBUG nova.compute.manager [req-181302d9-7436-4a71-b743-76b9f25027bf req-33f4ddf4-2031-4a89-88ce-b55b3a65c10f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.565 221324 DEBUG oslo_concurrency.lockutils [req-181302d9-7436-4a71-b743-76b9f25027bf req-33f4ddf4-2031-4a89-88ce-b55b3a65c10f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.566 221324 DEBUG oslo_concurrency.lockutils [req-181302d9-7436-4a71-b743-76b9f25027bf req-33f4ddf4-2031-4a89-88ce-b55b3a65c10f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.567 221324 DEBUG oslo_concurrency.lockutils [req-181302d9-7436-4a71-b743-76b9f25027bf req-33f4ddf4-2031-4a89-88ce-b55b3a65c10f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.567 221324 DEBUG nova.compute.manager [req-181302d9-7436-4a71-b743-76b9f25027bf req-33f4ddf4-2031-4a89-88ce-b55b3a65c10f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Processing event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.569 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.575 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.582 221324 INFO nova.virt.libvirt.driver [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance spawned successfully.#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.583 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.589 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.596 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:24:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:38.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.925 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.926 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.926 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.927 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.928 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.929 221324 DEBUG nova.virt.libvirt.driver [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.936 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.937 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847876.8527281, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:38 np0005603622 nova_compute[221301]: 2026-01-31 08:24:38.937 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.248 221324 INFO nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Took 9.25 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.252 221324 DEBUG nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.253 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.266 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847878.5737386, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.266 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:24:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:39.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.412 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.417 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.451 221324 INFO nova.compute.manager [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Took 61.14 seconds to build instance.#033[00m
Jan 31 03:24:39 np0005603622 nova_compute[221301]: 2026-01-31 08:24:39.525 221324 DEBUG oslo_concurrency.lockutils [None req-a95a5126-0e76-40a1-9393-01a6dc17f97b f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 61.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:40 np0005603622 nova_compute[221301]: 2026-01-31 08:24:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:40 np0005603622 nova_compute[221301]: 2026-01-31 08:24:40.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:40 np0005603622 nova_compute[221301]: 2026-01-31 08:24:40.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:24:40 np0005603622 nova_compute[221301]: 2026-01-31 08:24:40.654 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:40.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:41.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:41 np0005603622 nova_compute[221301]: 2026-01-31 08:24:41.647 221324 DEBUG nova.compute.manager [req-fb2d8413-21a3-4145-b409-ee57873ec243 req-5463d7df-bb10-4199-a4e9-dc7fd6d3a976 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:24:41 np0005603622 nova_compute[221301]: 2026-01-31 08:24:41.647 221324 DEBUG oslo_concurrency.lockutils [req-fb2d8413-21a3-4145-b409-ee57873ec243 req-5463d7df-bb10-4199-a4e9-dc7fd6d3a976 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:41 np0005603622 nova_compute[221301]: 2026-01-31 08:24:41.648 221324 DEBUG oslo_concurrency.lockutils [req-fb2d8413-21a3-4145-b409-ee57873ec243 req-5463d7df-bb10-4199-a4e9-dc7fd6d3a976 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:41 np0005603622 nova_compute[221301]: 2026-01-31 08:24:41.648 221324 DEBUG oslo_concurrency.lockutils [req-fb2d8413-21a3-4145-b409-ee57873ec243 req-5463d7df-bb10-4199-a4e9-dc7fd6d3a976 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:41 np0005603622 nova_compute[221301]: 2026-01-31 08:24:41.648 221324 DEBUG nova.compute.manager [req-fb2d8413-21a3-4145-b409-ee57873ec243 req-5463d7df-bb10-4199-a4e9-dc7fd6d3a976 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:24:41 np0005603622 nova_compute[221301]: 2026-01-31 08:24:41.648 221324 WARNING nova.compute.manager [req-fb2d8413-21a3-4145-b409-ee57873ec243 req-5463d7df-bb10-4199-a4e9-dc7fd6d3a976 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:24:41 np0005603622 nova_compute[221301]: 2026-01-31 08:24:41.730 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:42.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:43.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:43 np0005603622 nova_compute[221301]: 2026-01-31 08:24:43.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:44 np0005603622 nova_compute[221301]: 2026-01-31 08:24:44.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:44 np0005603622 nova_compute[221301]: 2026-01-31 08:24:44.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:24:44 np0005603622 nova_compute[221301]: 2026-01-31 08:24:44.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:24:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:44.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:45 np0005603622 nova_compute[221301]: 2026-01-31 08:24:45.127 221324 INFO nova.compute.manager [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Rescuing#033[00m
Jan 31 03:24:45 np0005603622 nova_compute[221301]: 2026-01-31 08:24:45.128 221324 DEBUG oslo_concurrency.lockutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:24:45 np0005603622 nova_compute[221301]: 2026-01-31 08:24:45.128 221324 DEBUG oslo_concurrency.lockutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquired lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:24:45 np0005603622 nova_compute[221301]: 2026-01-31 08:24:45.129 221324 DEBUG nova.network.neutron [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:24:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:24:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:45.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:24:45 np0005603622 nova_compute[221301]: 2026-01-31 08:24:45.625 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:24:45 np0005603622 nova_compute[221301]: 2026-01-31 08:24:45.655 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:46 np0005603622 nova_compute[221301]: 2026-01-31 08:24:46.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:46.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:47.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:48.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:49.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:50 np0005603622 podman[265814]: 2026-01-31 08:24:50.164100597 +0000 UTC m=+0.073072916 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:24:50 np0005603622 podman[265813]: 2026-01-31 08:24:50.249963135 +0000 UTC m=+0.158844631 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:24:50 np0005603622 nova_compute[221301]: 2026-01-31 08:24:50.656 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:50.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:51.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:51 np0005603622 nova_compute[221301]: 2026-01-31 08:24:51.737 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:52.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:53 np0005603622 nova_compute[221301]: 2026-01-31 08:24:53.072 221324 DEBUG nova.network.neutron [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:53.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:53 np0005603622 nova_compute[221301]: 2026-01-31 08:24:53.505 221324 DEBUG oslo_concurrency.lockutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Releasing lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:24:53 np0005603622 nova_compute[221301]: 2026-01-31 08:24:53.515 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:24:53 np0005603622 nova_compute[221301]: 2026-01-31 08:24:53.516 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:24:53 np0005603622 nova_compute[221301]: 2026-01-31 08:24:53.516 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:54Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:35:86 10.100.0.13
Jan 31 03:24:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:54Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:35:86 10.100.0.13
Jan 31 03:24:54 np0005603622 nova_compute[221301]: 2026-01-31 08:24:54.291 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:24:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:24:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:54.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:24:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:24:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:55.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:24:55 np0005603622 nova_compute[221301]: 2026-01-31 08:24:55.657 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:56 np0005603622 nova_compute[221301]: 2026-01-31 08:24:56.740 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:56.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:56 np0005603622 nova_compute[221301]: 2026-01-31 08:24:56.962 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:24:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:57.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:57 np0005603622 nova_compute[221301]: 2026-01-31 08:24:57.568 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:24:57 np0005603622 nova_compute[221301]: 2026-01-31 08:24:57.569 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:24:57 np0005603622 nova_compute[221301]: 2026-01-31 08:24:57.570 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:57 np0005603622 nova_compute[221301]: 2026-01-31 08:24:57.570 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:57 np0005603622 nova_compute[221301]: 2026-01-31 08:24:57.571 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:57 np0005603622 nova_compute[221301]: 2026-01-31 08:24:57.571 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.767 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.768 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.768 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.769 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.769 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:24:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:24:58 np0005603622 kernel: tap14f2f6f4-af (unregistering): left promiscuous mode
Jan 31 03:24:58 np0005603622 NetworkManager[49080]: <info>  [1769847898.8744] device (tap14f2f6f4-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:24:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:58Z|00517|binding|INFO|Releasing lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 from this chassis (sb_readonly=0)
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.881 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:58Z|00518|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 down in Southbound
Jan 31 03:24:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:24:58Z|00519|binding|INFO|Removing iface tap14f2f6f4-af ovn-installed in OVS
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:58 np0005603622 nova_compute[221301]: 2026-01-31 08:24:58.899 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:24:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:24:58.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:58 np0005603622 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 31 03:24:58 np0005603622 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000076.scope: Consumed 14.208s CPU time.
Jan 31 03:24:58 np0005603622 systemd-machined[190406]: Machine qemu-51-instance-00000076 terminated.
Jan 31 03:24:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:59.182 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:35:86 10.100.0.13'], port_security=['fa:16:3e:9b:35:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf024d54545b4af882a87c721105742a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '634aba40-50e3-4365-94d0-0773f42bafa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2028d9-0505-431d-85ea-94f27c9f5ff6, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=14f2f6f4-afdf-41af-8c0f-05467025eed7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:24:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:59.184 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 14f2f6f4-afdf-41af-8c0f-05467025eed7 in datapath 98be5db6-5633-4d23-b9a9-16382d8e99ab unbound from our chassis#033[00m
Jan 31 03:24:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:59.187 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 98be5db6-5633-4d23-b9a9-16382d8e99ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:24:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:59.189 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b423c6-95aa-4880-a579-27388dfbc073]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:24:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:24:59.191 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab namespace which is not needed anymore#033[00m
Jan 31 03:24:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:24:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3702654886' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:24:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:24:59 np0005603622 nova_compute[221301]: 2026-01-31 08:24:59.305 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:24:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:24:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:24:59.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:24:59 np0005603622 nova_compute[221301]: 2026-01-31 08:24:59.320 221324 INFO nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance shutdown successfully after 5 seconds.#033[00m
Jan 31 03:24:59 np0005603622 nova_compute[221301]: 2026-01-31 08:24:59.334 221324 INFO nova.virt.libvirt.driver [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance destroyed successfully.#033[00m
Jan 31 03:24:59 np0005603622 nova_compute[221301]: 2026-01-31 08:24:59.335 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'numa_topology' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:24:59 np0005603622 nova_compute[221301]: 2026-01-31 08:24:59.575 221324 INFO nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Attempting a stable device rescue#033[00m
Jan 31 03:24:59 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[265797]: [NOTICE]   (265802) : haproxy version is 2.8.14-c23fe91
Jan 31 03:24:59 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[265797]: [NOTICE]   (265802) : path to executable is /usr/sbin/haproxy
Jan 31 03:24:59 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[265797]: [WARNING]  (265802) : Exiting Master process...
Jan 31 03:24:59 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[265797]: [ALERT]    (265802) : Current worker (265804) exited with code 143 (Terminated)
Jan 31 03:24:59 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[265797]: [WARNING]  (265802) : All workers exited. Exiting... (0)
Jan 31 03:24:59 np0005603622 systemd[1]: libpod-7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe.scope: Deactivated successfully.
Jan 31 03:24:59 np0005603622 podman[265916]: 2026-01-31 08:24:59.769575833 +0000 UTC m=+0.473228641 container died 7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:24:59 np0005603622 nova_compute[221301]: 2026-01-31 08:24:59.830 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:24:59 np0005603622 nova_compute[221301]: 2026-01-31 08:24:59.831 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.035 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.041 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.042 221324 INFO nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Creating image(s)#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.082 221324 DEBUG nova.storage.rbd_utils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.086 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.103 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.104 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4521MB free_disk=20.851356506347656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.104 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.104 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.221 221324 DEBUG nova.storage.rbd_utils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.263 221324 DEBUG nova.storage.rbd_utils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.268 221324 DEBUG oslo_concurrency.lockutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "526d2b81c98d8357a7817e885aa26e4ec9e0775c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.269 221324 DEBUG oslo_concurrency.lockutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "526d2b81c98d8357a7817e885aa26e4ec9e0775c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.520 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4627608a-03b0-4a7e-bcb3-b709dade0334 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.521 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.523 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:25:00 np0005603622 nova_compute[221301]: 2026-01-31 08:25:00.597 221324 DEBUG nova.virt.libvirt.imagebackend [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/27d563e8-5fbe-4653-b526-f8048559e6cd/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/27d563e8-5fbe-4653-b526-f8048559e6cd/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:25:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:00.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe-userdata-shm.mount: Deactivated successfully.
Jan 31 03:25:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay-7fac9d5406ed867d86c780775ceeed54f8881fa5fc9f7cb0932beb1b16b79d09-merged.mount: Deactivated successfully.
Jan 31 03:25:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:01.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:02 np0005603622 podman[265916]: 2026-01-31 08:25:02.439099351 +0000 UTC m=+3.142752139 container cleanup 7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:25:02 np0005603622 nova_compute[221301]: 2026-01-31 08:25:02.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:02 np0005603622 systemd[1]: libpod-conmon-7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe.scope: Deactivated successfully.
Jan 31 03:25:02 np0005603622 nova_compute[221301]: 2026-01-31 08:25:02.449 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:02 np0005603622 nova_compute[221301]: 2026-01-31 08:25:02.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:25:02 np0005603622 nova_compute[221301]: 2026-01-31 08:25:02.494 221324 DEBUG nova.virt.libvirt.imagebackend [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/27d563e8-5fbe-4653-b526-f8048559e6cd/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:25:02 np0005603622 nova_compute[221301]: 2026-01-31 08:25:02.495 221324 DEBUG nova.storage.rbd_utils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] cloning images/27d563e8-5fbe-4653-b526-f8048559e6cd@snap to None/4627608a-03b0-4a7e-bcb3-b709dade0334_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:25:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:02.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:03 np0005603622 podman[266036]: 2026-01-31 08:25:03.078517758 +0000 UTC m=+0.614206571 container remove 7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.083 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[73b60243-4ce5-4b53-91ea-ca896494b3f0]: (4, ('Sat Jan 31 08:24:59 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab (7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe)\n7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe\nSat Jan 31 08:25:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab (7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe)\n7dacbc77e26799132dcf3474da48d4c02f39e313ea1d260baaa7bef4201a99fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.086 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b627f0e2-5831-4f1d-95d3-0e07d51a595f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.096 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98be5db6-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:03 np0005603622 kernel: tap98be5db6-50: left promiscuous mode
Jan 31 03:25:03 np0005603622 nova_compute[221301]: 2026-01-31 08:25:03.098 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:03 np0005603622 nova_compute[221301]: 2026-01-31 08:25:03.110 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.114 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a1b37d-4630-49ee-bdf4-049f0f12dc6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.137 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ed554d4c-b46e-4465-990c-c69e999634c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.142 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4c050434-c57f-4326-82ff-17c9e03f5716]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.155 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b65ed7a6-3a65-4777-b5b3-758e15dfa47b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 712743, 'reachable_time': 36278, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266111, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:25:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4010518336' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.159 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:25:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:03.159 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9febe6-91dd-4204-8dce-8b54cdb56cee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:03 np0005603622 systemd[1]: run-netns-ovnmeta\x2d98be5db6\x2d5633\x2d4d23\x2db9a9\x2d16382d8e99ab.mount: Deactivated successfully.
Jan 31 03:25:03 np0005603622 nova_compute[221301]: 2026-01-31 08:25:03.173 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.724s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:03 np0005603622 nova_compute[221301]: 2026-01-31 08:25:03.178 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:25:03 np0005603622 nova_compute[221301]: 2026-01-31 08:25:03.206 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:25:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:03.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:03 np0005603622 nova_compute[221301]: 2026-01-31 08:25:03.582 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:25:03 np0005603622 nova_compute[221301]: 2026-01-31 08:25:03.582 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.478s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:04 np0005603622 nova_compute[221301]: 2026-01-31 08:25:04.570 221324 DEBUG nova.compute.manager [req-2784945f-0274-4c9b-b189-9654b09c8abd req-a3652ddb-15d1-4101-8d74-bc710bcdfc00 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:04 np0005603622 nova_compute[221301]: 2026-01-31 08:25:04.570 221324 DEBUG oslo_concurrency.lockutils [req-2784945f-0274-4c9b-b189-9654b09c8abd req-a3652ddb-15d1-4101-8d74-bc710bcdfc00 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:04 np0005603622 nova_compute[221301]: 2026-01-31 08:25:04.571 221324 DEBUG oslo_concurrency.lockutils [req-2784945f-0274-4c9b-b189-9654b09c8abd req-a3652ddb-15d1-4101-8d74-bc710bcdfc00 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:04 np0005603622 nova_compute[221301]: 2026-01-31 08:25:04.571 221324 DEBUG oslo_concurrency.lockutils [req-2784945f-0274-4c9b-b189-9654b09c8abd req-a3652ddb-15d1-4101-8d74-bc710bcdfc00 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:04 np0005603622 nova_compute[221301]: 2026-01-31 08:25:04.571 221324 DEBUG nova.compute.manager [req-2784945f-0274-4c9b-b189-9654b09c8abd req-a3652ddb-15d1-4101-8d74-bc710bcdfc00 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:04 np0005603622 nova_compute[221301]: 2026-01-31 08:25:04.571 221324 WARNING nova.compute.manager [req-2784945f-0274-4c9b-b189-9654b09c8abd req-a3652ddb-15d1-4101-8d74-bc710bcdfc00 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:25:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e274 e274: 3 total, 3 up, 3 in
Jan 31 03:25:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:04.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:05.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.482 221324 DEBUG oslo_concurrency.lockutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "526d2b81c98d8357a7817e885aa26e4ec9e0775c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 5.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.554 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'migration_context' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.641 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.647 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Start _get_guest_xml network_info=[{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "vif_mac": "fa:16:3e:9b:35:86"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '27d563e8-5fbe-4653-b526-f8048559e6cd', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': 0, 'attachment_id': '13da4cb4-931b-4f9d-98dc-452bf3204cfa', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-b717b315-86df-4d2d-96bf-25aba2759093', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'b717b315-86df-4d2d-96bf-25aba2759093', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'attached_at': '', 'detached_at': '', 'volume_id': 'b717b315-86df-4d2d-96bf-25aba2759093', 'serial': 'b717b315-86df-4d2d-96bf-25aba2759093'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.648 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'resources' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.689 221324 WARNING nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.695 221324 DEBUG nova.virt.libvirt.host [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.697 221324 DEBUG nova.virt.libvirt.host [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.701 221324 DEBUG nova.virt.libvirt.host [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.701 221324 DEBUG nova.virt.libvirt.host [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.703 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.705 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.706 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.706 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.706 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.707 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.707 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.708 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.708 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.708 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.709 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.709 221324 DEBUG nova.virt.hardware [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.710 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:05 np0005603622 nova_compute[221301]: 2026-01-31 08:25:05.933 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:25:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/392920374' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.533 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.600s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.745 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.834 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.866 221324 DEBUG nova.compute.manager [req-4b68ecd9-75be-4517-8d72-f01de6a74d18 req-914e8609-6b74-4794-ab3e-b99884e80f2f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.867 221324 DEBUG oslo_concurrency.lockutils [req-4b68ecd9-75be-4517-8d72-f01de6a74d18 req-914e8609-6b74-4794-ab3e-b99884e80f2f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.868 221324 DEBUG oslo_concurrency.lockutils [req-4b68ecd9-75be-4517-8d72-f01de6a74d18 req-914e8609-6b74-4794-ab3e-b99884e80f2f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.868 221324 DEBUG oslo_concurrency.lockutils [req-4b68ecd9-75be-4517-8d72-f01de6a74d18 req-914e8609-6b74-4794-ab3e-b99884e80f2f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.869 221324 DEBUG nova.compute.manager [req-4b68ecd9-75be-4517-8d72-f01de6a74d18 req-914e8609-6b74-4794-ab3e-b99884e80f2f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:06 np0005603622 nova_compute[221301]: 2026-01-31 08:25:06.869 221324 WARNING nova.compute.manager [req-4b68ecd9-75be-4517-8d72-f01de6a74d18 req-914e8609-6b74-4794-ab3e-b99884e80f2f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:25:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:06.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:07.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:07 np0005603622 nova_compute[221301]: 2026-01-31 08:25:07.489 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:25:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2841132570' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:25:08 np0005603622 nova_compute[221301]: 2026-01-31 08:25:08.402 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.567s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:08 np0005603622 nova_compute[221301]: 2026-01-31 08:25:08.405 221324 DEBUG nova.virt.libvirt.vif [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:23:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2039669481',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-2039669481',id=118,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:24:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cf024d54545b4af882a87c721105742a',ramdisk_id='',reservation_id='r-ohvqqfqv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-468517745',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-468517745-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:24:39Z,user_data=None,user_id='f4d66dd0b7ff443cbcdb6e2c9f5c4c8c',uuid=4627608a-03b0-4a7e-bcb3-b709dade0334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "vif_mac": "fa:16:3e:9b:35:86"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:25:08 np0005603622 nova_compute[221301]: 2026-01-31 08:25:08.406 221324 DEBUG nova.network.os_vif_util [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converting VIF {"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "vif_mac": "fa:16:3e:9b:35:86"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:25:08 np0005603622 nova_compute[221301]: 2026-01-31 08:25:08.408 221324 DEBUG nova.network.os_vif_util [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:25:08 np0005603622 nova_compute[221301]: 2026-01-31 08:25:08.411 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'pci_devices' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:08 np0005603622 nova_compute[221301]: 2026-01-31 08:25:08.947 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <uuid>4627608a-03b0-4a7e-bcb3-b709dade0334</uuid>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <name>instance-00000076</name>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-2039669481</nova:name>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:25:05</nova:creationTime>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:user uuid="f4d66dd0b7ff443cbcdb6e2c9f5c4c8c">tempest-ServerBootFromVolumeStableRescueTest-468517745-project-member</nova:user>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:project uuid="cf024d54545b4af882a87c721105742a">tempest-ServerBootFromVolumeStableRescueTest-468517745</nova:project>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <nova:port uuid="14f2f6f4-afdf-41af-8c0f-05467025eed7">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <entry name="serial">4627608a-03b0-4a7e-bcb3-b709dade0334</entry>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <entry name="uuid">4627608a-03b0-4a7e-bcb3-b709dade0334</entry>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-b717b315-86df-4d2d-96bf-25aba2759093">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <serial>b717b315-86df-4d2d-96bf-25aba2759093</serial>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4627608a-03b0-4a7e-bcb3-b709dade0334_disk.rescue">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <boot order="1"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:9b:35:86"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <target dev="tap14f2f6f4-af"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/console.log" append="off"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:25:08 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:25:08 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:25:08 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:25:08 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:25:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:08.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:08 np0005603622 nova_compute[221301]: 2026-01-31 08:25:08.973 221324 INFO nova.virt.libvirt.driver [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance destroyed successfully.#033[00m
Jan 31 03:25:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:09.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:09 np0005603622 nova_compute[221301]: 2026-01-31 08:25:09.577 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:25:09 np0005603622 nova_compute[221301]: 2026-01-31 08:25:09.579 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:25:09 np0005603622 nova_compute[221301]: 2026-01-31 08:25:09.579 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:25:09 np0005603622 nova_compute[221301]: 2026-01-31 08:25:09.580 221324 DEBUG nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] No VIF found with MAC fa:16:3e:9b:35:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:25:09 np0005603622 nova_compute[221301]: 2026-01-31 08:25:09.580 221324 INFO nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Using config drive#033[00m
Jan 31 03:25:09 np0005603622 nova_compute[221301]: 2026-01-31 08:25:09.605 221324 DEBUG nova.storage.rbd_utils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:09 np0005603622 nova_compute[221301]: 2026-01-31 08:25:09.853 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:10 np0005603622 nova_compute[221301]: 2026-01-31 08:25:10.205 221324 DEBUG nova.objects.instance [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'keypairs' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:10.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:11 np0005603622 nova_compute[221301]: 2026-01-31 08:25:11.321 221324 INFO nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Creating config drive at /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config.rescue#033[00m
Jan 31 03:25:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:11.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:11 np0005603622 nova_compute[221301]: 2026-01-31 08:25:11.328 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjc29ru76 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:11 np0005603622 nova_compute[221301]: 2026-01-31 08:25:11.463 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjc29ru76" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:11 np0005603622 nova_compute[221301]: 2026-01-31 08:25:11.505 221324 DEBUG nova.storage.rbd_utils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] rbd image 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:25:11 np0005603622 nova_compute[221301]: 2026-01-31 08:25:11.510 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config.rescue 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:11 np0005603622 nova_compute[221301]: 2026-01-31 08:25:11.579 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:11 np0005603622 nova_compute[221301]: 2026-01-31 08:25:11.746 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:12 np0005603622 nova_compute[221301]: 2026-01-31 08:25:12.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:12.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:13.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:14 np0005603622 nova_compute[221301]: 2026-01-31 08:25:14.122 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847899.1202514, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:14 np0005603622 nova_compute[221301]: 2026-01-31 08:25:14.123 221324 INFO nova.compute.manager [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:25:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:14.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:15.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:16 np0005603622 nova_compute[221301]: 2026-01-31 08:25:16.748 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:16 np0005603622 nova_compute[221301]: 2026-01-31 08:25:16.822 221324 DEBUG nova.compute.manager [None req-00085bf1-34f7-45eb-a935-4e5fe5904f2b - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:16 np0005603622 nova_compute[221301]: 2026-01-31 08:25:16.829 221324 DEBUG nova.compute.manager [None req-00085bf1-34f7-45eb-a935-4e5fe5904f2b - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:25:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:16.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:17 np0005603622 nova_compute[221301]: 2026-01-31 08:25:17.022 221324 INFO nova.compute.manager [None req-00085bf1-34f7-45eb-a935-4e5fe5904f2b - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 31 03:25:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:17.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:25:17 np0005603622 nova_compute[221301]: 2026-01-31 08:25:17.496 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.111 221324 DEBUG oslo_concurrency.processutils [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config.rescue 4627608a-03b0-4a7e-bcb3-b709dade0334_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.601s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.113 221324 INFO nova.virt.libvirt.driver [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Deleting local config drive /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334/disk.config.rescue because it was imported into RBD.#033[00m
Jan 31 03:25:18 np0005603622 kernel: tap14f2f6f4-af: entered promiscuous mode
Jan 31 03:25:18 np0005603622 NetworkManager[49080]: <info>  [1769847918.1751] manager: (tap14f2f6f4-af): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Jan 31 03:25:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:18Z|00520|binding|INFO|Claiming lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 for this chassis.
Jan 31 03:25:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:18Z|00521|binding|INFO|14f2f6f4-afdf-41af-8c0f-05467025eed7: Claiming fa:16:3e:9b:35:86 10.100.0.13
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.177 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:18Z|00522|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 ovn-installed in OVS
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.183 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 systemd-udevd[266519]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:25:18 np0005603622 systemd-machined[190406]: New machine qemu-52-instance-00000076.
Jan 31 03:25:18 np0005603622 NetworkManager[49080]: <info>  [1769847918.2124] device (tap14f2f6f4-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:25:18 np0005603622 NetworkManager[49080]: <info>  [1769847918.2131] device (tap14f2f6f4-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:25:18 np0005603622 systemd[1]: Started Virtual Machine qemu-52-instance-00000076.
Jan 31 03:25:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:18Z|00523|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 up in Southbound
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.306 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:35:86 10.100.0.13'], port_security=['fa:16:3e:9b:35:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf024d54545b4af882a87c721105742a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '634aba40-50e3-4365-94d0-0773f42bafa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2028d9-0505-431d-85ea-94f27c9f5ff6, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=14f2f6f4-afdf-41af-8c0f-05467025eed7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.309 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 14f2f6f4-afdf-41af-8c0f-05467025eed7 in datapath 98be5db6-5633-4d23-b9a9-16382d8e99ab bound to our chassis#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.311 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 98be5db6-5633-4d23-b9a9-16382d8e99ab#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.353 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e66c460c-e964-43fd-b332-8a86944f66ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.355 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap98be5db6-51 in ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.357 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap98be5db6-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.357 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[35a327e9-335c-499b-81fa-3ef4a7ffe6a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.359 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[768a5c90-e8ac-4360-a21c-59767b964cd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.372 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[39331229-40c7-43f2-95b0-d1cd0b4201b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.402 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[26c94cdb-df59-4104-afc1-970ed6dae01e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.436 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7f75b218-0c67-4d4e-a72a-d7147febaffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 NetworkManager[49080]: <info>  [1769847918.4518] manager: (tap98be5db6-50): new Veth device (/org/freedesktop/NetworkManager/Devices/242)
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.454 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0b963784-6b72-4398-8c33-9186c91a64ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.497 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[13a0e174-9a0f-494b-97de-89033b6fca70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.501 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[80596279-82f4-43a8-9712-60f00eda5226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 NetworkManager[49080]: <info>  [1769847918.5229] device (tap98be5db6-50): carrier: link connected
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.531 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d58ac3-0248-410f-87d9-ff7dc410e41c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.551 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a289bbe5-bf20-4422-8d9d-dd2f2938849e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap98be5db6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 716986, 'reachable_time': 29232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266554, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.572 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[37d8d67d-52e8-426b-9e4a-af8716284a01]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:3a3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 716986, 'tstamp': 716986}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266555, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.595 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[24d12487-53ae-4fab-add3-f54f56100427]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap98be5db6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 716986, 'reachable_time': 29232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266556, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.626 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f89c58c8-fb5b-416b-976b-0b50a1aa68a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:25:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:25:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:25:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.664 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cf7f168a-7554-43f4-8b6b-98d45c9c8ede]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.666 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98be5db6-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.666 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.666 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap98be5db6-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.721 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 NetworkManager[49080]: <info>  [1769847918.7215] manager: (tap98be5db6-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Jan 31 03:25:18 np0005603622 kernel: tap98be5db6-50: entered promiscuous mode
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.724 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap98be5db6-50, col_values=(('external_ids', {'iface-id': 'dad27cfe-7e8a-4f55-a945-07f9cae848c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.725 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:18Z|00524|binding|INFO|Releasing lport dad27cfe-7e8a-4f55-a945-07f9cae848c1 from this chassis (sb_readonly=0)
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.727 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.728 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.729 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3edd7764-d09a-4879-b7dd-e3eca3e35930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.731 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-98be5db6-5633-4d23-b9a9-16382d8e99ab
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:25:18 np0005603622 nova_compute[221301]: 2026-01-31 08:25:18.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 98be5db6-5633-4d23-b9a9-16382d8e99ab
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:25:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:18.732 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'env', 'PROCESS_TAG=haproxy-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/98be5db6-5633-4d23-b9a9-16382d8e99ab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:25:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:18.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:19 np0005603622 podman[266613]: 2026-01-31 08:25:19.100681267 +0000 UTC m=+0.031123887 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:25:19 np0005603622 podman[266613]: 2026-01-31 08:25:19.209006329 +0000 UTC m=+0.139448869 container create 6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:25:19 np0005603622 systemd[1]: Started libpod-conmon-6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7.scope.
Jan 31 03:25:19 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:25:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67bb517756484d8091943fc4e3cc2b86bb2de60af080f455acefa7117d952f6a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:25:19 np0005603622 podman[266613]: 2026-01-31 08:25:19.326510367 +0000 UTC m=+0.256952917 container init 6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:25:19 np0005603622 podman[266613]: 2026-01-31 08:25:19.336136326 +0000 UTC m=+0.266578846 container start 6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:25:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:19.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:19 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [NOTICE]   (266661) : New worker (266663) forked
Jan 31 03:25:19 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [NOTICE]   (266661) : Loading success.
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.366 221324 DEBUG nova.compute.manager [req-7da6d03c-a3b2-44ea-b304-e70c29dd4352 req-b91a22e9-07c1-4c3a-b9c5-47cc06b6dd8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.367 221324 DEBUG oslo_concurrency.lockutils [req-7da6d03c-a3b2-44ea-b304-e70c29dd4352 req-b91a22e9-07c1-4c3a-b9c5-47cc06b6dd8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.367 221324 DEBUG oslo_concurrency.lockutils [req-7da6d03c-a3b2-44ea-b304-e70c29dd4352 req-b91a22e9-07c1-4c3a-b9c5-47cc06b6dd8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.368 221324 DEBUG oslo_concurrency.lockutils [req-7da6d03c-a3b2-44ea-b304-e70c29dd4352 req-b91a22e9-07c1-4c3a-b9c5-47cc06b6dd8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.368 221324 DEBUG nova.compute.manager [req-7da6d03c-a3b2-44ea-b304-e70c29dd4352 req-b91a22e9-07c1-4c3a-b9c5-47cc06b6dd8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.368 221324 WARNING nova.compute.manager [req-7da6d03c-a3b2-44ea-b304-e70c29dd4352 req-b91a22e9-07c1-4c3a-b9c5-47cc06b6dd8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.501 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847919.5001988, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.501 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.504 221324 DEBUG nova.compute.manager [None req-3dc5e94b-1ca5-4044-8e77-9123af526df9 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.741 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.747 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.840 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.841 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847919.5014606, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:19 np0005603622 nova_compute[221301]: 2026-01-31 08:25:19.841 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Started (Lifecycle Event)#033[00m
Jan 31 03:25:20 np0005603622 nova_compute[221301]: 2026-01-31 08:25:20.118 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:20 np0005603622 nova_compute[221301]: 2026-01-31 08:25:20.121 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:25:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e275 e275: 3 total, 3 up, 3 in
Jan 31 03:25:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:20.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:21 np0005603622 podman[266679]: 2026-01-31 08:25:21.174011508 +0000 UTC m=+0.074360880 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:25:21 np0005603622 podman[266678]: 2026-01-31 08:25:21.216998424 +0000 UTC m=+0.117317845 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:25:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:21.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:21 np0005603622 nova_compute[221301]: 2026-01-31 08:25:21.749 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:21 np0005603622 nova_compute[221301]: 2026-01-31 08:25:21.978 221324 DEBUG nova.compute.manager [req-c06ab48f-3d64-4864-a8fc-4c8048be4751 req-1d8854ab-83db-4f3a-8c8b-5aa033064532 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:21 np0005603622 nova_compute[221301]: 2026-01-31 08:25:21.978 221324 DEBUG oslo_concurrency.lockutils [req-c06ab48f-3d64-4864-a8fc-4c8048be4751 req-1d8854ab-83db-4f3a-8c8b-5aa033064532 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:21 np0005603622 nova_compute[221301]: 2026-01-31 08:25:21.978 221324 DEBUG oslo_concurrency.lockutils [req-c06ab48f-3d64-4864-a8fc-4c8048be4751 req-1d8854ab-83db-4f3a-8c8b-5aa033064532 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:21 np0005603622 nova_compute[221301]: 2026-01-31 08:25:21.978 221324 DEBUG oslo_concurrency.lockutils [req-c06ab48f-3d64-4864-a8fc-4c8048be4751 req-1d8854ab-83db-4f3a-8c8b-5aa033064532 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:21 np0005603622 nova_compute[221301]: 2026-01-31 08:25:21.979 221324 DEBUG nova.compute.manager [req-c06ab48f-3d64-4864-a8fc-4c8048be4751 req-1d8854ab-83db-4f3a-8c8b-5aa033064532 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:21 np0005603622 nova_compute[221301]: 2026-01-31 08:25:21.979 221324 WARNING nova.compute.manager [req-c06ab48f-3d64-4864-a8fc-4c8048be4751 req-1d8854ab-83db-4f3a-8c8b-5aa033064532 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state rescued and task_state None.#033[00m
Jan 31 03:25:22 np0005603622 nova_compute[221301]: 2026-01-31 08:25:22.499 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:22 np0005603622 nova_compute[221301]: 2026-01-31 08:25:22.919 221324 INFO nova.compute.manager [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Unrescuing#033[00m
Jan 31 03:25:22 np0005603622 nova_compute[221301]: 2026-01-31 08:25:22.921 221324 DEBUG oslo_concurrency.lockutils [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:22 np0005603622 nova_compute[221301]: 2026-01-31 08:25:22.922 221324 DEBUG oslo_concurrency.lockutils [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquired lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:22 np0005603622 nova_compute[221301]: 2026-01-31 08:25:22.922 221324 DEBUG nova.network.neutron [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:25:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:22.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:23.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:24 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Jan 31 03:25:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:24.950299) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:25:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Jan 31 03:25:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847924950436, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 2483, "num_deletes": 253, "total_data_size": 6145127, "memory_usage": 6218400, "flush_reason": "Manual Compaction"}
Jan 31 03:25:24 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Jan 31 03:25:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:24.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847925259813, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 3969027, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 50073, "largest_seqno": 52551, "table_properties": {"data_size": 3958817, "index_size": 6512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21416, "raw_average_key_size": 20, "raw_value_size": 3938382, "raw_average_value_size": 3812, "num_data_blocks": 282, "num_entries": 1033, "num_filter_entries": 1033, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847719, "oldest_key_time": 1769847719, "file_creation_time": 1769847924, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 309516 microseconds, and 12651 cpu microseconds.
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.259874) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 3969027 bytes OK
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.259902) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.298484) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.298509) EVENT_LOG_v1 {"time_micros": 1769847925298502, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.298536) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 6134117, prev total WAL file size 6157255, number of live WAL files 2.
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.300407) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(3876KB)], [99(9289KB)]
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847925300468, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 13481262, "oldest_snapshot_seqno": -1}
Jan 31 03:25:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7693 keys, 11541333 bytes, temperature: kUnknown
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847925484452, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 11541333, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11490342, "index_size": 30655, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19269, "raw_key_size": 198563, "raw_average_key_size": 25, "raw_value_size": 11353913, "raw_average_value_size": 1475, "num_data_blocks": 1207, "num_entries": 7693, "num_filter_entries": 7693, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769847925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:25:25 np0005603622 nova_compute[221301]: 2026-01-31 08:25:25.538 221324 DEBUG nova.network.neutron [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:25 np0005603622 nova_compute[221301]: 2026-01-31 08:25:25.572 221324 DEBUG oslo_concurrency.lockutils [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Releasing lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:25 np0005603622 nova_compute[221301]: 2026-01-31 08:25:25.573 221324 DEBUG nova.objects.instance [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'flavor' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.484868) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 11541333 bytes
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.603494) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 73.2 rd, 62.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 9.1 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.3) write-amplify(2.9) OK, records in: 8219, records dropped: 526 output_compression: NoCompression
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.603553) EVENT_LOG_v1 {"time_micros": 1769847925603531, "job": 62, "event": "compaction_finished", "compaction_time_micros": 184184, "compaction_time_cpu_micros": 47520, "output_level": 6, "num_output_files": 1, "total_output_size": 11541333, "num_input_records": 8219, "num_output_records": 7693, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847925604610, "job": 62, "event": "table_file_deletion", "file_number": 101}
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769847925607006, "job": 62, "event": "table_file_deletion", "file_number": 99}
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.300266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.607094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.607103) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.607106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.607109) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:25 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:25:25.607111) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:25:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e276 e276: 3 total, 3 up, 3 in
Jan 31 03:25:26 np0005603622 nova_compute[221301]: 2026-01-31 08:25:26.752 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:26.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:27.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:27 np0005603622 nova_compute[221301]: 2026-01-31 08:25:27.503 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 kernel: tap14f2f6f4-af (unregistering): left promiscuous mode
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.408 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 NetworkManager[49080]: <info>  [1769847928.4095] device (tap14f2f6f4-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:28Z|00525|binding|INFO|Releasing lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 from this chassis (sb_readonly=0)
Jan 31 03:25:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:28Z|00526|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 down in Southbound
Jan 31 03:25:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:28Z|00527|binding|INFO|Removing iface tap14f2f6f4-af ovn-installed in OVS
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.424 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 31 03:25:28 np0005603622 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000076.scope: Consumed 7.197s CPU time.
Jan 31 03:25:28 np0005603622 systemd-machined[190406]: Machine qemu-52-instance-00000076 terminated.
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.670 221324 INFO nova.virt.libvirt.driver [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance destroyed successfully.#033[00m
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.671 221324 DEBUG nova.objects.instance [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'numa_topology' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:28.806 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:35:86 10.100.0.13'], port_security=['fa:16:3e:9b:35:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf024d54545b4af882a87c721105742a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '634aba40-50e3-4365-94d0-0773f42bafa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2028d9-0505-431d-85ea-94f27c9f5ff6, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=14f2f6f4-afdf-41af-8c0f-05467025eed7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:25:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:28.809 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 14f2f6f4-afdf-41af-8c0f-05467025eed7 in datapath 98be5db6-5633-4d23-b9a9-16382d8e99ab unbound from our chassis#033[00m
Jan 31 03:25:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:28.811 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 98be5db6-5633-4d23-b9a9-16382d8e99ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:25:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:28.813 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5f4c72d8-939f-44eb-be0b-b8bf8b7edcd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:28.814 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab namespace which is not needed anymore#033[00m
Jan 31 03:25:28 np0005603622 kernel: tap14f2f6f4-af: entered promiscuous mode
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.905 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 NetworkManager[49080]: <info>  [1769847928.9067] manager: (tap14f2f6f4-af): new Tun device (/org/freedesktop/NetworkManager/Devices/244)
Jan 31 03:25:28 np0005603622 systemd-udevd[266726]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:25:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:28Z|00528|binding|INFO|Claiming lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 for this chassis.
Jan 31 03:25:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:28Z|00529|binding|INFO|14f2f6f4-afdf-41af-8c0f-05467025eed7: Claiming fa:16:3e:9b:35:86 10.100.0.13
Jan 31 03:25:28 np0005603622 NetworkManager[49080]: <info>  [1769847928.9191] device (tap14f2f6f4-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:25:28 np0005603622 NetworkManager[49080]: <info>  [1769847928.9199] device (tap14f2f6f4-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.923 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:28Z|00530|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 ovn-installed in OVS
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 nova_compute[221301]: 2026-01-31 08:25:28.929 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:28 np0005603622 systemd-machined[190406]: New machine qemu-53-instance-00000076.
Jan 31 03:25:28 np0005603622 systemd[1]: Started Virtual Machine qemu-53-instance-00000076.
Jan 31 03:25:28 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [NOTICE]   (266661) : haproxy version is 2.8.14-c23fe91
Jan 31 03:25:28 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [NOTICE]   (266661) : path to executable is /usr/sbin/haproxy
Jan 31 03:25:28 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [WARNING]  (266661) : Exiting Master process...
Jan 31 03:25:28 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [WARNING]  (266661) : Exiting Master process...
Jan 31 03:25:28 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [ALERT]    (266661) : Current worker (266663) exited with code 143 (Terminated)
Jan 31 03:25:28 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266657]: [WARNING]  (266661) : All workers exited. Exiting... (0)
Jan 31 03:25:28 np0005603622 systemd[1]: libpod-6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7.scope: Deactivated successfully.
Jan 31 03:25:28 np0005603622 podman[266763]: 2026-01-31 08:25:28.96276 +0000 UTC m=+0.059777578 container died 6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:25:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:28.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:28Z|00531|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 up in Southbound
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.000 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:35:86 10.100.0.13'], port_security=['fa:16:3e:9b:35:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf024d54545b4af882a87c721105742a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '634aba40-50e3-4365-94d0-0773f42bafa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2028d9-0505-431d-85ea-94f27c9f5ff6, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=14f2f6f4-afdf-41af-8c0f-05467025eed7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:25:29 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7-userdata-shm.mount: Deactivated successfully.
Jan 31 03:25:29 np0005603622 systemd[1]: var-lib-containers-storage-overlay-67bb517756484d8091943fc4e3cc2b86bb2de60af080f455acefa7117d952f6a-merged.mount: Deactivated successfully.
Jan 31 03:25:29 np0005603622 podman[266763]: 2026-01-31 08:25:29.064843394 +0000 UTC m=+0.161860982 container cleanup 6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:25:29 np0005603622 systemd[1]: libpod-conmon-6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7.scope: Deactivated successfully.
Jan 31 03:25:29 np0005603622 podman[266809]: 2026-01-31 08:25:29.149605072 +0000 UTC m=+0.061991627 container remove 6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.154 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[78b3abb2-1165-4188-9733-62f4930a58cb]: (4, ('Sat Jan 31 08:25:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab (6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7)\n6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7\nSat Jan 31 08:25:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab (6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7)\n6546f8e70f21f7ca5991794d658d3398722b6c1e171fd34f12ed88a4cf65dff7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.156 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[392d86a2-603d-4acd-b0df-745e915edd84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.157 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98be5db6-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.159 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:29 np0005603622 kernel: tap98be5db6-50: left promiscuous mode
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.163 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0cca772b-0e31-434a-a235-a370811cae0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.182 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[48fadebb-1cbc-48c2-86ac-4c17f8a529b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.185 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[32dd7b98-782c-4081-9c39-1501461612e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.202 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6b55eb68-9d83-4b9b-a26b-a8594bde6ee1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 716977, 'reachable_time': 17229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266824, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 systemd[1]: run-netns-ovnmeta\x2d98be5db6\x2d5633\x2d4d23\x2db9a9\x2d16382d8e99ab.mount: Deactivated successfully.
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.207 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.207 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d85bb689-29ac-49cd-9b7b-6feb4e961113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.212 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 14f2f6f4-afdf-41af-8c0f-05467025eed7 in datapath 98be5db6-5633-4d23-b9a9-16382d8e99ab unbound from our chassis#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.214 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 98be5db6-5633-4d23-b9a9-16382d8e99ab#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.222 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6cc778-afa7-4f60-b4e8-86455c28203f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.223 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap98be5db6-51 in ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.226 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap98be5db6-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.226 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[57ca1d06-8a59-488c-93cb-82d03a33f9f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.227 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[02f26efd-b97a-4f3f-80a4-e22fbaeb712a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.240 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5fc1d1-1ca2-40c4-ae3a-9be040e063a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.252 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5090a5-9707-4181-83f8-4ff200227277]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.278 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ac674f56-2b82-48de-b2d2-d11862fd8c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.284 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c40bdb86-f6a4-4d85-948f-eb692dee52a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 NetworkManager[49080]: <info>  [1769847929.2858] manager: (tap98be5db6-50): new Veth device (/org/freedesktop/NetworkManager/Devices/245)
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.313 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fabb5c3b-be86-472f-8be5-c1a5e3da077a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.319 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[837d6d7c-096e-4489-b18b-857e761f00cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 NetworkManager[49080]: <info>  [1769847929.3464] device (tap98be5db6-50): carrier: link connected
Jan 31 03:25:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.353 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[429cbafa-5f72-49b1-85d4-75050f67df6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:29.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.374 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[73118664-1ed1-40af-b736-1ec78f7fe2ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap98be5db6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718068, 'reachable_time': 42955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266848, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.393 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[90ab7586-937c-4615-ba30-bdc518703ab7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:3a3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 718068, 'tstamp': 718068}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266856, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.422 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[68c0c8e6-010f-4b1e-bb5f-258da42a03c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap98be5db6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718068, 'reachable_time': 42955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266866, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.459 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d345e875-0b2d-4525-80c3-42b5dc697fec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.530 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1a08c1-c32a-49b7-bbee-61bc20fbd600]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.532 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98be5db6-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.533 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.534 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap98be5db6-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.537 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:29 np0005603622 kernel: tap98be5db6-50: entered promiscuous mode
Jan 31 03:25:29 np0005603622 NetworkManager[49080]: <info>  [1769847929.5380] manager: (tap98be5db6-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.540 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap98be5db6-50, col_values=(('external_ids', {'iface-id': 'dad27cfe-7e8a-4f55-a945-07f9cae848c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.541 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:29 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:29Z|00532|binding|INFO|Releasing lport dad27cfe-7e8a-4f55-a945-07f9cae848c1 from this chassis (sb_readonly=0)
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.552 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.553 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.555 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d44a0b4a-617d-487a-9f80-96b385c7a624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.556 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-98be5db6-5633-4d23-b9a9-16382d8e99ab
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/98be5db6-5633-4d23-b9a9-16382d8e99ab.pid.haproxy
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 98be5db6-5633-4d23-b9a9-16382d8e99ab
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:25:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:29.557 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'env', 'PROCESS_TAG=haproxy-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/98be5db6-5633-4d23-b9a9-16382d8e99ab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.995 221324 DEBUG nova.compute.manager [req-564d9f77-f046-4e4f-a08f-496050193c88 req-0c08951c-fafd-461b-aeb9-3d3521fbfed7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.996 221324 DEBUG oslo_concurrency.lockutils [req-564d9f77-f046-4e4f-a08f-496050193c88 req-0c08951c-fafd-461b-aeb9-3d3521fbfed7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.997 221324 DEBUG oslo_concurrency.lockutils [req-564d9f77-f046-4e4f-a08f-496050193c88 req-0c08951c-fafd-461b-aeb9-3d3521fbfed7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.997 221324 DEBUG oslo_concurrency.lockutils [req-564d9f77-f046-4e4f-a08f-496050193c88 req-0c08951c-fafd-461b-aeb9-3d3521fbfed7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.997 221324 DEBUG nova.compute.manager [req-564d9f77-f046-4e4f-a08f-496050193c88 req-0c08951c-fafd-461b-aeb9-3d3521fbfed7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:29 np0005603622 nova_compute[221301]: 2026-01-31 08:25:29.998 221324 WARNING nova.compute.manager [req-564d9f77-f046-4e4f-a08f-496050193c88 req-0c08951c-fafd-461b-aeb9-3d3521fbfed7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:25:30 np0005603622 podman[266907]: 2026-01-31 08:25:29.912417097 +0000 UTC m=+0.037383536 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.101 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 4627608a-03b0-4a7e-bcb3-b709dade0334 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.103 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847930.101319, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.104 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.224 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.229 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.298 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.299 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847930.1031706, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.299 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Started (Lifecycle Event)#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.339 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.343 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:25:30 np0005603622 nova_compute[221301]: 2026-01-31 08:25:30.396 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:25:30 np0005603622 podman[266907]: 2026-01-31 08:25:30.416578729 +0000 UTC m=+0.541545148 container create 82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:25:30 np0005603622 systemd[1]: Started libpod-conmon-82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a.scope.
Jan 31 03:25:30 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:25:30 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9386ce9847ebcd6d024079eabbad81f0840c04568fd14dbd2112bacaafaad31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:25:30 np0005603622 podman[266907]: 2026-01-31 08:25:30.873629755 +0000 UTC m=+0.998596214 container init 82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:25:30 np0005603622 podman[266907]: 2026-01-31 08:25:30.879085681 +0000 UTC m=+1.004052110 container start 82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:25:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:30.879 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:30.880 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:30.881 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 e277: 3 total, 3 up, 3 in
Jan 31 03:25:30 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266957]: [NOTICE]   (266961) : New worker (266963) forked
Jan 31 03:25:30 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266957]: [NOTICE]   (266961) : Loading success.
Jan 31 03:25:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:30.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 03:25:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:31.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 03:25:31 np0005603622 nova_compute[221301]: 2026-01-31 08:25:31.754 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.378 221324 DEBUG nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.379 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.380 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.380 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.381 221324 DEBUG nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.381 221324 WARNING nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.382 221324 DEBUG nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.382 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.382 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.383 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.383 221324 DEBUG nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.384 221324 WARNING nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.384 221324 DEBUG nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.385 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.385 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.385 221324 DEBUG oslo_concurrency.lockutils [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.386 221324 DEBUG nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.386 221324 WARNING nova.compute.manager [req-dbdd6b06-f518-4ccd-b922-379926a9eef2 req-57e0f48c-77ee-47e5-a63c-867f85e4d148 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:32 np0005603622 nova_compute[221301]: 2026-01-31 08:25:32.536 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:32.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:25:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:33.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:25:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:34.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:35.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:36 np0005603622 nova_compute[221301]: 2026-01-31 08:25:36.760 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:36.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:37.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:37 np0005603622 nova_compute[221301]: 2026-01-31 08:25:37.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:39.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:39.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:40 np0005603622 nova_compute[221301]: 2026-01-31 08:25:40.445 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:40 np0005603622 nova_compute[221301]: 2026-01-31 08:25:40.447 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:41.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:41.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:41 np0005603622 nova_compute[221301]: 2026-01-31 08:25:41.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:41 np0005603622 nova_compute[221301]: 2026-01-31 08:25:41.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:25:41 np0005603622 nova_compute[221301]: 2026-01-31 08:25:41.780 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:42 np0005603622 nova_compute[221301]: 2026-01-31 08:25:42.542 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:25:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:43.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:43.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:45.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:45Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:35:86 10.100.0.13
Jan 31 03:25:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:45Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:35:86 10.100.0.13
Jan 31 03:25:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:45.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.394 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.424 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.424 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.425 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.425 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:45 np0005603622 nova_compute[221301]: 2026-01-31 08:25:45.672 221324 DEBUG nova.compute.manager [None req-adf32648-6e51-47f7-a10f-6ce4f3c955bc f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:25:46 np0005603622 nova_compute[221301]: 2026-01-31 08:25:46.782 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:47.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.131 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [{"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.185 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-4627608a-03b0-4a7e-bcb3-b709dade0334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.186 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.186 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.187 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.187 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.228 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.229 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.229 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.230 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.230 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.368 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.368 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.369 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.369 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.369 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.371 221324 INFO nova.compute.manager [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Terminating instance#033[00m
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.373 221324 DEBUG nova.compute.manager [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:25:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:47.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.544 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:25:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3507328003' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:25:47 np0005603622 nova_compute[221301]: 2026-01-31 08:25:47.753 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:48 np0005603622 kernel: tap14f2f6f4-af (unregistering): left promiscuous mode
Jan 31 03:25:48 np0005603622 NetworkManager[49080]: <info>  [1769847948.2485] device (tap14f2f6f4-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:25:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:48Z|00533|binding|INFO|Releasing lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 from this chassis (sb_readonly=0)
Jan 31 03:25:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:48Z|00534|binding|INFO|Setting lport 14f2f6f4-afdf-41af-8c0f-05467025eed7 down in Southbound
Jan 31 03:25:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:48Z|00535|binding|INFO|Removing iface tap14f2f6f4-af ovn-installed in OVS
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.258 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.260 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.275 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603622 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 31 03:25:48 np0005603622 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000076.scope: Consumed 15.004s CPU time.
Jan 31 03:25:48 np0005603622 systemd-machined[190406]: Machine qemu-53-instance-00000076 terminated.
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.415 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:35:86 10.100.0.13'], port_security=['fa:16:3e:9b:35:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4627608a-03b0-4a7e-bcb3-b709dade0334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf024d54545b4af882a87c721105742a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '634aba40-50e3-4365-94d0-0773f42bafa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2028d9-0505-431d-85ea-94f27c9f5ff6, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=14f2f6f4-afdf-41af-8c0f-05467025eed7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.417 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 14f2f6f4-afdf-41af-8c0f-05467025eed7 in datapath 98be5db6-5633-4d23-b9a9-16382d8e99ab unbound from our chassis#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.419 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 98be5db6-5633-4d23-b9a9-16382d8e99ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.420 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a3279bb8-c1b2-471a-8e41-8688c3b3f3f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.421 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab namespace which is not needed anymore#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.421 221324 INFO nova.virt.libvirt.driver [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Instance destroyed successfully.#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.422 221324 DEBUG nova.objects.instance [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lazy-loading 'resources' on Instance uuid 4627608a-03b0-4a7e-bcb3-b709dade0334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.448 221324 DEBUG nova.virt.libvirt.vif [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:23:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2039669481',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-2039669481',id=118,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:25:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cf024d54545b4af882a87c721105742a',ramdisk_id='',reservation_id='r-ohvqqfqv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-468517745',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-468517745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:25:45Z,user_data=None,user_id='f4d66dd0b7ff443cbcdb6e2c9f5c4c8c',uuid=4627608a-03b0-4a7e-bcb3-b709dade0334,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.448 221324 DEBUG nova.network.os_vif_util [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converting VIF {"id": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "address": "fa:16:3e:9b:35:86", "network": {"id": "98be5db6-5633-4d23-b9a9-16382d8e99ab", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-2138383352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cf024d54545b4af882a87c721105742a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f2f6f4-af", "ovs_interfaceid": "14f2f6f4-afdf-41af-8c0f-05467025eed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.449 221324 DEBUG nova.network.os_vif_util [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.450 221324 DEBUG os_vif [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.453 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.453 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14f2f6f4-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.456 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.463 221324 INFO os_vif [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:35:86,bridge_name='br-int',has_traffic_filtering=True,id=14f2f6f4-afdf-41af-8c0f-05467025eed7,network=Network(98be5db6-5633-4d23-b9a9-16382d8e99ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14f2f6f4-af')#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.523 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.523 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:25:48 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266957]: [NOTICE]   (266961) : haproxy version is 2.8.14-c23fe91
Jan 31 03:25:48 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266957]: [NOTICE]   (266961) : path to executable is /usr/sbin/haproxy
Jan 31 03:25:48 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266957]: [WARNING]  (266961) : Exiting Master process...
Jan 31 03:25:48 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266957]: [ALERT]    (266961) : Current worker (266963) exited with code 143 (Terminated)
Jan 31 03:25:48 np0005603622 neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab[266957]: [WARNING]  (266961) : All workers exited. Exiting... (0)
Jan 31 03:25:48 np0005603622 systemd[1]: libpod-82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a.scope: Deactivated successfully.
Jan 31 03:25:48 np0005603622 podman[267100]: 2026-01-31 08:25:48.607960937 +0000 UTC m=+0.080726921 container died 82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:25:48 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a-userdata-shm.mount: Deactivated successfully.
Jan 31 03:25:48 np0005603622 systemd[1]: var-lib-containers-storage-overlay-a9386ce9847ebcd6d024079eabbad81f0840c04568fd14dbd2112bacaafaad31-merged.mount: Deactivated successfully.
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.686 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.688 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4513MB free_disk=20.785682678222656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.688 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.688 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:48 np0005603622 podman[267100]: 2026-01-31 08:25:48.727763167 +0000 UTC m=+0.200529181 container cleanup 82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:25:48 np0005603622 systemd[1]: libpod-conmon-82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a.scope: Deactivated successfully.
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.796 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4627608a-03b0-4a7e-bcb3-b709dade0334 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.796 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.797 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.838 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:25:48 np0005603622 podman[267130]: 2026-01-31 08:25:48.841228247 +0000 UTC m=+0.097999915 container remove 82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.847 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[da51a463-0cb6-44f9-9306-e8b19d097aa4]: (4, ('Sat Jan 31 08:25:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab (82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a)\n82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a\nSat Jan 31 08:25:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab (82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a)\n82198c7824b7073d5a853181195196fa006de2b3e0025dbdfeea2baa1166709a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.849 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd08c70-05e5-44ae-b629-ccd530755728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.850 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98be5db6-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:25:48 np0005603622 kernel: tap98be5db6-50: left promiscuous mode
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.861 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.866 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1512e7c2-5339-46a6-ab85-0909cfd9efad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.887 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6a53fcef-89ec-443a-b7b7-ec0dda546121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.889 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc11d93-08d4-455b-b904-84d648bd523a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.906 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[549a6b33-9a13-4c18-9ca5-f9ce5693bb9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718061, 'reachable_time': 20703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267146, 'error': None, 'target': 'ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.910 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-98be5db6-5633-4d23-b9a9-16382d8e99ab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:25:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:25:48.910 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[60659f5b-97a2-4079-b246-70e890229f78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:25:48 np0005603622 systemd[1]: run-netns-ovnmeta\x2d98be5db6\x2d5633\x2d4d23\x2db9a9\x2d16382d8e99ab.mount: Deactivated successfully.
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.968 221324 DEBUG nova.compute.manager [req-e1e24b81-2024-41c5-b35c-38e94a14ff66 req-c26ebe85-ea0a-442c-b4d0-a48e4ab045ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.969 221324 DEBUG oslo_concurrency.lockutils [req-e1e24b81-2024-41c5-b35c-38e94a14ff66 req-c26ebe85-ea0a-442c-b4d0-a48e4ab045ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.969 221324 DEBUG oslo_concurrency.lockutils [req-e1e24b81-2024-41c5-b35c-38e94a14ff66 req-c26ebe85-ea0a-442c-b4d0-a48e4ab045ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.970 221324 DEBUG oslo_concurrency.lockutils [req-e1e24b81-2024-41c5-b35c-38e94a14ff66 req-c26ebe85-ea0a-442c-b4d0-a48e4ab045ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.970 221324 DEBUG nova.compute.manager [req-e1e24b81-2024-41c5-b35c-38e94a14ff66 req-c26ebe85-ea0a-442c-b4d0-a48e4ab045ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:48 np0005603622 nova_compute[221301]: 2026-01-31 08:25:48.970 221324 DEBUG nova.compute.manager [req-e1e24b81-2024-41c5-b35c-38e94a14ff66 req-c26ebe85-ea0a-442c-b4d0-a48e4ab045ae fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-unplugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:25:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:49.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:25:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/974417979' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.312 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.318 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.343 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.367 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.368 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:49.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:25:49 np0005603622 nova_compute[221301]: 2026-01-31 08:25:49.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:25:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:51.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:51 np0005603622 nova_compute[221301]: 2026-01-31 08:25:51.102 221324 DEBUG nova.compute.manager [req-373d7b7c-9618-4055-b68e-b1f2efbdbabb req-1c0db551-6610-43e9-895d-f97d1ea641e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:25:51 np0005603622 nova_compute[221301]: 2026-01-31 08:25:51.103 221324 DEBUG oslo_concurrency.lockutils [req-373d7b7c-9618-4055-b68e-b1f2efbdbabb req-1c0db551-6610-43e9-895d-f97d1ea641e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:25:51 np0005603622 nova_compute[221301]: 2026-01-31 08:25:51.104 221324 DEBUG oslo_concurrency.lockutils [req-373d7b7c-9618-4055-b68e-b1f2efbdbabb req-1c0db551-6610-43e9-895d-f97d1ea641e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:25:51 np0005603622 nova_compute[221301]: 2026-01-31 08:25:51.104 221324 DEBUG oslo_concurrency.lockutils [req-373d7b7c-9618-4055-b68e-b1f2efbdbabb req-1c0db551-6610-43e9-895d-f97d1ea641e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:25:51 np0005603622 nova_compute[221301]: 2026-01-31 08:25:51.105 221324 DEBUG nova.compute.manager [req-373d7b7c-9618-4055-b68e-b1f2efbdbabb req-1c0db551-6610-43e9-895d-f97d1ea641e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] No waiting events found dispatching network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:25:51 np0005603622 nova_compute[221301]: 2026-01-31 08:25:51.105 221324 WARNING nova.compute.manager [req-373d7b7c-9618-4055-b68e-b1f2efbdbabb req-1c0db551-6610-43e9-895d-f97d1ea641e8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received unexpected event network-vif-plugged-14f2f6f4-afdf-41af-8c0f-05467025eed7 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:25:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:51.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:51 np0005603622 nova_compute[221301]: 2026-01-31 08:25:51.784 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:52 np0005603622 podman[267169]: 2026-01-31 08:25:52.154829618 +0000 UTC m=+0.078005138 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:25:52 np0005603622 podman[267170]: 2026-01-31 08:25:52.168030093 +0000 UTC m=+0.081607775 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 31 03:25:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:53.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:25:53Z|00536|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 03:25:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:53.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:53 np0005603622 nova_compute[221301]: 2026-01-31 08:25:53.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:25:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:55.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:55.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:56 np0005603622 nova_compute[221301]: 2026-01-31 08:25:56.786 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:57.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:25:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:57.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:25:58 np0005603622 nova_compute[221301]: 2026-01-31 08:25:58.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:25:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:25:59.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:25:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:25:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:25:59.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:25:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:00 np0005603622 nova_compute[221301]: 2026-01-31 08:26:00.136 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:00 np0005603622 nova_compute[221301]: 2026-01-31 08:26:00.138 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:00 np0005603622 nova_compute[221301]: 2026-01-31 08:26:00.138 221324 INFO nova.compute.manager [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Unshelving#033[00m
Jan 31 03:26:00 np0005603622 nova_compute[221301]: 2026-01-31 08:26:00.854 221324 INFO nova.virt.block_device [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Booting with volume 40098f2d-bf66-4831-bd6c-f4f5c78b699e at /dev/vdc#033[00m
Jan 31 03:26:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:01.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.135 221324 DEBUG os_brick.utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.139 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.153 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.153 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[062db365-de4d-4693-9672-43e2d171185f]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.155 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.164 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.165 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[1c59af25-c8e7-4928-a93a-3178fa1b79cb]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.166 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.176 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.176 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[34cdea66-c16a-48ca-954f-61bac5af8364]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.178 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[18eff586-5056-4c40-8eec-8167ee0faf10]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.178 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.199 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "nvme version" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.201 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.202 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.202 221324 DEBUG os_brick.initiator.connectors.lightos [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.202 221324 DEBUG os_brick.utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] <== get_connector_properties: return (66ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.203 221324 DEBUG nova.virt.block_device [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating existing volume attachment record: 395572e4-5b80-4c49-b44a-e3d5caf7062e _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:26:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:01.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:01 np0005603622 nova_compute[221301]: 2026-01-31 08:26:01.789 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.059 221324 DEBUG nova.compute.manager [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.301 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.301 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.510 221324 DEBUG nova.objects.instance [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_requests' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.698 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.698 221324 INFO nova.compute.claims [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.699 221324 DEBUG nova.objects.instance [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:02 np0005603622 nova_compute[221301]: 2026-01-31 08:26:02.728 221324 DEBUG nova.objects.instance [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:03.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.247 221324 INFO nova.compute.resource_tracker [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating resource usage from migration 0ecfaaa6-30a5-421d-affe-cfddf44ff0c2#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.248 221324 DEBUG nova.compute.resource_tracker [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Starting to track incoming migration 0ecfaaa6-30a5-421d-affe-cfddf44ff0c2 with flavor f75c4aee-d826-4343-a7e3-f06a4b21de52 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 31 03:26:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:03.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.416 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769847948.4148173, 4627608a-03b0-4a7e-bcb3-b709dade0334 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.417 221324 INFO nova.compute.manager [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.421 221324 DEBUG nova.scheduler.client.report [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.452 221324 DEBUG nova.scheduler.client.report [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.453 221324 DEBUG nova.compute.provider_tree [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.482 221324 DEBUG nova.scheduler.client.report [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.510 221324 DEBUG nova.scheduler.client.report [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.609 221324 DEBUG nova.compute.manager [None req-5213865e-5706-4fd0-ab51-74ee6d1d69b8 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.613 221324 DEBUG nova.compute.manager [None req-5213865e-5706-4fd0-ab51-74ee6d1d69b8 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.852 221324 INFO nova.virt.libvirt.driver [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Deleting instance files /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334_del#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.853 221324 INFO nova.virt.libvirt.driver [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Deletion of /var/lib/nova/instances/4627608a-03b0-4a7e-bcb3-b709dade0334_del complete#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.870 221324 INFO nova.compute.manager [None req-5213865e-5706-4fd0-ab51-74ee6d1d69b8 - - - - - -] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.899 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.940 221324 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.997 221324 INFO nova.compute.manager [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Took 16.62 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:26:03 np0005603622 nova_compute[221301]: 2026-01-31 08:26:03.999 221324 DEBUG oslo.service.loopingcall [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.000 221324 DEBUG nova.compute.manager [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.001 221324 DEBUG nova.network.neutron [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:26:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/503990443' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.378 221324 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.384 221324 DEBUG nova.compute.provider_tree [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.463 221324 DEBUG nova.scheduler.client.report [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:26:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.630 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 2.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.630 221324 INFO nova.compute.manager [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Migrating#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.643 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.648 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'pci_requests' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.778 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'numa_topology' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.886 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:26:04 np0005603622 nova_compute[221301]: 2026-01-31 08:26:04.887 221324 INFO nova.compute.claims [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:26:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:05.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:05 np0005603622 nova_compute[221301]: 2026-01-31 08:26:05.330 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:05.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3387222669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:05 np0005603622 nova_compute[221301]: 2026-01-31 08:26:05.822 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:05 np0005603622 nova_compute[221301]: 2026-01-31 08:26:05.833 221324 DEBUG nova.compute.provider_tree [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:26:05 np0005603622 nova_compute[221301]: 2026-01-31 08:26:05.867 221324 DEBUG nova.scheduler.client.report [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:26:05 np0005603622 nova_compute[221301]: 2026-01-31 08:26:05.924 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:05.935 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:05.937 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:26:05 np0005603622 nova_compute[221301]: 2026-01-31 08:26:05.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:05 np0005603622 nova_compute[221301]: 2026-01-31 08:26:05.973 221324 DEBUG nova.network.neutron [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.006 221324 DEBUG nova.compute.manager [req-4f65a281-a702-4c05-ae15-0c697a6768a4 req-b44f35a7-e475-4e11-8d9e-13cadaa76728 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Received event network-vif-deleted-14f2f6f4-afdf-41af-8c0f-05467025eed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.007 221324 INFO nova.compute.manager [req-4f65a281-a702-4c05-ae15-0c697a6768a4 req-b44f35a7-e475-4e11-8d9e-13cadaa76728 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Neutron deleted interface 14f2f6f4-afdf-41af-8c0f-05467025eed7; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.007 221324 DEBUG nova.network.neutron [req-4f65a281-a702-4c05-ae15-0c697a6768a4 req-b44f35a7-e475-4e11-8d9e-13cadaa76728 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.039 221324 INFO nova.compute.manager [-] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Took 2.04 seconds to deallocate network for instance.#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.056 221324 DEBUG nova.compute.manager [req-4f65a281-a702-4c05-ae15-0c697a6768a4 req-b44f35a7-e475-4e11-8d9e-13cadaa76728 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Detach interface failed, port_id=14f2f6f4-afdf-41af-8c0f-05467025eed7, reason: Instance 4627608a-03b0-4a7e-bcb3-b709dade0334 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.095 221324 INFO nova.network.neutron [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating port 19a4f194-5514-4b8e-b635-e6fe0255dc1a with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.337 221324 INFO nova.compute.manager [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] [instance: 4627608a-03b0-4a7e-bcb3-b709dade0334] Took 0.30 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.404 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.405 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.532 221324 DEBUG oslo_concurrency.processutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:06 np0005603622 nova_compute[221301]: 2026-01-31 08:26:06.791 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2792836587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.021 221324 DEBUG oslo_concurrency.processutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.029 221324 DEBUG nova.compute.provider_tree [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:26:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:07.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.057 221324 DEBUG nova.scheduler.client.report [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.065 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.066 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.067 221324 DEBUG nova.network.neutron [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.129 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.180 221324 INFO nova.scheduler.client.report [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Deleted allocations for instance 4627608a-03b0-4a7e-bcb3-b709dade0334#033[00m
Jan 31 03:26:07 np0005603622 nova_compute[221301]: 2026-01-31 08:26:07.302 221324 DEBUG oslo_concurrency.lockutils [None req-1640b93a-55f6-4266-94e9-62b221e52608 f4d66dd0b7ff443cbcdb6e2c9f5c4c8c cf024d54545b4af882a87c721105742a - - default default] Lock "4627608a-03b0-4a7e-bcb3-b709dade0334" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 19.934s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:07.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:08 np0005603622 nova_compute[221301]: 2026-01-31 08:26:08.123 221324 DEBUG nova.compute.manager [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:08 np0005603622 nova_compute[221301]: 2026-01-31 08:26:08.124 221324 DEBUG nova.compute.manager [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing instance network info cache due to event network-changed-19a4f194-5514-4b8e-b635-e6fe0255dc1a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:26:08 np0005603622 nova_compute[221301]: 2026-01-31 08:26:08.125 221324 DEBUG oslo_concurrency.lockutils [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:08 np0005603622 systemd-logind[817]: New session 57 of user nova.
Jan 31 03:26:08 np0005603622 systemd[1]: Created slice User Slice of UID 42436.
Jan 31 03:26:08 np0005603622 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 31 03:26:08 np0005603622 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 31 03:26:08 np0005603622 systemd[1]: Starting User Manager for UID 42436...
Jan 31 03:26:08 np0005603622 systemd[267290]: Queued start job for default target Main User Target.
Jan 31 03:26:08 np0005603622 systemd[267290]: Created slice User Application Slice.
Jan 31 03:26:08 np0005603622 systemd[267290]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:26:08 np0005603622 systemd[267290]: Started Daily Cleanup of User's Temporary Directories.
Jan 31 03:26:08 np0005603622 systemd[267290]: Reached target Paths.
Jan 31 03:26:08 np0005603622 systemd[267290]: Reached target Timers.
Jan 31 03:26:08 np0005603622 systemd[267290]: Starting D-Bus User Message Bus Socket...
Jan 31 03:26:08 np0005603622 systemd[267290]: Starting Create User's Volatile Files and Directories...
Jan 31 03:26:08 np0005603622 systemd[267290]: Listening on D-Bus User Message Bus Socket.
Jan 31 03:26:08 np0005603622 systemd[267290]: Finished Create User's Volatile Files and Directories.
Jan 31 03:26:08 np0005603622 systemd[267290]: Reached target Sockets.
Jan 31 03:26:08 np0005603622 systemd[267290]: Reached target Basic System.
Jan 31 03:26:08 np0005603622 systemd[267290]: Reached target Main User Target.
Jan 31 03:26:08 np0005603622 systemd[267290]: Startup finished in 148ms.
Jan 31 03:26:08 np0005603622 systemd[1]: Started User Manager for UID 42436.
Jan 31 03:26:08 np0005603622 systemd[1]: Started Session 57 of User nova.
Jan 31 03:26:08 np0005603622 systemd-logind[817]: Session 57 logged out. Waiting for processes to exit.
Jan 31 03:26:08 np0005603622 systemd[1]: session-57.scope: Deactivated successfully.
Jan 31 03:26:08 np0005603622 systemd-logind[817]: Removed session 57.
Jan 31 03:26:08 np0005603622 nova_compute[221301]: 2026-01-31 08:26:08.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:08 np0005603622 systemd-logind[817]: New session 59 of user nova.
Jan 31 03:26:08 np0005603622 systemd[1]: Started Session 59 of User nova.
Jan 31 03:26:08 np0005603622 systemd[1]: session-59.scope: Deactivated successfully.
Jan 31 03:26:08 np0005603622 systemd-logind[817]: Session 59 logged out. Waiting for processes to exit.
Jan 31 03:26:08 np0005603622 systemd-logind[817]: Removed session 59.
Jan 31 03:26:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:09.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.200 221324 DEBUG nova.network.neutron [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.320 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.323 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.323 221324 INFO nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Creating image(s)#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.354 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.360 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.362 221324 DEBUG oslo_concurrency.lockutils [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.363 221324 DEBUG nova.network.neutron [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Refreshing network info cache for port 19a4f194-5514-4b8e-b635-e6fe0255dc1a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.411 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:26:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:09.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.460 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.465 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "249a20ddbb4b15785bc96dd7839ec40568c6c347" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.466 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "249a20ddbb4b15785bc96dd7839ec40568c6c347" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.839 221324 DEBUG nova.virt.libvirt.imagebackend [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/81322d27-2584-47ad-bfcd-7642da2c770e/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/81322d27-2584-47ad-bfcd-7642da2c770e/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.903 221324 DEBUG nova.virt.libvirt.imagebackend [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/81322d27-2584-47ad-bfcd-7642da2c770e/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:26:09 np0005603622 nova_compute[221301]: 2026-01-31 08:26:09.904 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] cloning images/81322d27-2584-47ad-bfcd-7642da2c770e@snap to None/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:26:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:26:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:11.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:26:11 np0005603622 nova_compute[221301]: 2026-01-31 08:26:11.255 221324 DEBUG nova.network.neutron [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updated VIF entry in instance network info cache for port 19a4f194-5514-4b8e-b635-e6fe0255dc1a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:26:11 np0005603622 nova_compute[221301]: 2026-01-31 08:26:11.256 221324 DEBUG nova.network.neutron [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:11 np0005603622 nova_compute[221301]: 2026-01-31 08:26:11.340 221324 DEBUG oslo_concurrency.lockutils [req-b8eded9f-2b1e-48c4-8605-70149d0a53d0 req-e6953d6e-0b19-42fd-9d27-ee7f11db2803 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:11 np0005603622 nova_compute[221301]: 2026-01-31 08:26:11.416 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:11 np0005603622 nova_compute[221301]: 2026-01-31 08:26:11.417 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:26:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:11.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:11 np0005603622 nova_compute[221301]: 2026-01-31 08:26:11.439 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:26:11 np0005603622 nova_compute[221301]: 2026-01-31 08:26:11.793 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:13.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:13.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:13 np0005603622 nova_compute[221301]: 2026-01-31 08:26:13.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:13 np0005603622 nova_compute[221301]: 2026-01-31 08:26:13.593 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "249a20ddbb4b15785bc96dd7839ec40568c6c347" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:14 np0005603622 nova_compute[221301]: 2026-01-31 08:26:14.107 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'migration_context' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:14 np0005603622 nova_compute[221301]: 2026-01-31 08:26:14.192 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] flattening vms/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:26:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e278 e278: 3 total, 3 up, 3 in
Jan 31 03:26:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:14.940 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:15.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:15.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:16 np0005603622 nova_compute[221301]: 2026-01-31 08:26:16.795 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:17.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:26:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:17.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:26:17 np0005603622 nova_compute[221301]: 2026-01-31 08:26:17.921 221324 DEBUG nova.compute.manager [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:17 np0005603622 nova_compute[221301]: 2026-01-31 08:26:17.922 221324 DEBUG oslo_concurrency.lockutils [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:17 np0005603622 nova_compute[221301]: 2026-01-31 08:26:17.923 221324 DEBUG oslo_concurrency.lockutils [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:17 np0005603622 nova_compute[221301]: 2026-01-31 08:26:17.923 221324 DEBUG oslo_concurrency.lockutils [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:17 np0005603622 nova_compute[221301]: 2026-01-31 08:26:17.924 221324 DEBUG nova.compute.manager [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:17 np0005603622 nova_compute[221301]: 2026-01-31 08:26:17.924 221324 WARNING nova.compute.manager [req-87157d71-267d-4b7b-aa7d-6309ee1ef914 req-9b98a9c4-8304-44ba-b583-5f306c8138c8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:26:18 np0005603622 nova_compute[221301]: 2026-01-31 08:26:18.102 221324 INFO nova.network.neutron [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating port d2ecd824-47f7-4503-a326-2007f56a02e7 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:26:18 np0005603622 nova_compute[221301]: 2026-01-31 08:26:18.468 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:18 np0005603622 systemd[1]: Stopping User Manager for UID 42436...
Jan 31 03:26:18 np0005603622 systemd[267290]: Activating special unit Exit the Session...
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped target Main User Target.
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped target Basic System.
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped target Paths.
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped target Sockets.
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped target Timers.
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 31 03:26:18 np0005603622 systemd[267290]: Closed D-Bus User Message Bus Socket.
Jan 31 03:26:18 np0005603622 systemd[267290]: Stopped Create User's Volatile Files and Directories.
Jan 31 03:26:18 np0005603622 systemd[267290]: Removed slice User Application Slice.
Jan 31 03:26:18 np0005603622 systemd[267290]: Reached target Shutdown.
Jan 31 03:26:18 np0005603622 systemd[267290]: Finished Exit the Session.
Jan 31 03:26:18 np0005603622 systemd[267290]: Reached target Exit the Session.
Jan 31 03:26:18 np0005603622 systemd[1]: user@42436.service: Deactivated successfully.
Jan 31 03:26:18 np0005603622 systemd[1]: Stopped User Manager for UID 42436.
Jan 31 03:26:18 np0005603622 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 31 03:26:18 np0005603622 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 31 03:26:18 np0005603622 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 31 03:26:18 np0005603622 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 31 03:26:18 np0005603622 systemd[1]: Removed slice User Slice of UID 42436.
Jan 31 03:26:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:19.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:19 np0005603622 nova_compute[221301]: 2026-01-31 08:26:19.352 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:19 np0005603622 nova_compute[221301]: 2026-01-31 08:26:19.353 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:19 np0005603622 nova_compute[221301]: 2026-01-31 08:26:19.353 221324 DEBUG nova.network.neutron [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:26:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:19.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.086 221324 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.086 221324 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.086 221324 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.087 221324 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.087 221324 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.087 221324 WARNING nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.088 221324 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.088 221324 DEBUG nova.compute.manager [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing instance network info cache due to event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.088 221324 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.465 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Image rbd:vms/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.467 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.468 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Ensure instance console log exists: /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.468 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.469 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.469 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.475 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Start _get_guest_xml network_info=[{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:24:54Z,direct_url=<?>,disk_format='raw',id=81322d27-2584-47ad-bfcd-7642da2c770e,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-657799937-shelved',owner='491937de020742d7b4e847dc3bf57950',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:25:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': None, 'attachment_id': '395572e4-5b80-4c49-b44a-e3d5caf7062e', 'mount_device': '/dev/vdc', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-40098f2d-bf66-4831-bd6c-f4f5c78b699e', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '40098f2d-bf66-4831-bd6c-f4f5c78b699e', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attached', 'instance': 'f3b36b5b-968c-4775-ac4f-93efc36f40ac', 'attached_at': '', 'detached_at': '', 'volume_id': '40098f2d-bf66-4831-bd6c-f4f5c78b699e', 'serial': '40098f2d-bf66-4831-bd6c-f4f5c78b699e'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.481 221324 WARNING nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.489 221324 DEBUG nova.virt.libvirt.host [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.490 221324 DEBUG nova.virt.libvirt.host [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.495 221324 DEBUG nova.virt.libvirt.host [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.496 221324 DEBUG nova.virt.libvirt.host [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.498 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.498 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:24:54Z,direct_url=<?>,disk_format='raw',id=81322d27-2584-47ad-bfcd-7642da2c770e,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-657799937-shelved',owner='491937de020742d7b4e847dc3bf57950',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:25:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.499 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.500 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.500 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.500 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.501 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.501 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.502 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.502 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.503 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.503 221324 DEBUG nova.virt.hardware [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.504 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.529 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:26:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2660746638' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:26:20 np0005603622 nova_compute[221301]: 2026-01-31 08:26:20.995 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.039 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.045 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:26:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:21.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.221 221324 DEBUG nova.network.neutron [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.347 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.354 221324 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.355 221324 DEBUG nova.network.neutron [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:26:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:21.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:26:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3503784552' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.547 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.550 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.551 221324 INFO nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Creating image(s)#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.606 221324 DEBUG nova.storage.rbd_utils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] creating snapshot(nova-resize) on rbd image(08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:26:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e279 e279: 3 total, 3 up, 3 in
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.798 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.838 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.793s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.924 221324 DEBUG nova.virt.libvirt.vif [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-657799937',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-657799937',id=119,image_ref='81322d27-2584-47ad-bfcd-7642da2c770e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1582363486',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:24:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-mev9f22v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member',shelved_at='2026-01-31T08:25:35.963133',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='81322d27-2584-47ad-bfcd-7642da2c770e'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:26:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=f3b36b5b-968c-4775-ac4f-93efc36f40ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.925 221324 DEBUG nova.network.os_vif_util [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.926 221324 DEBUG nova.network.os_vif_util [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.928 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'pci_devices' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.954 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <uuid>f3b36b5b-968c-4775-ac4f-93efc36f40ac</uuid>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <name>instance-00000077</name>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-657799937</nova:name>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:26:20</nova:creationTime>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:user uuid="432ac8867d8240408db455fc25bb5901">tempest-AttachVolumeShelveTestJSON-60119558-project-member</nova:user>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:project uuid="491937de020742d7b4e847dc3bf57950">tempest-AttachVolumeShelveTestJSON-60119558</nova:project>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="81322d27-2584-47ad-bfcd-7642da2c770e"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <nova:port uuid="19a4f194-5514-4b8e-b635-e6fe0255dc1a">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <entry name="serial">f3b36b5b-968c-4775-ac4f-93efc36f40ac</entry>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <entry name="uuid">f3b36b5b-968c-4775-ac4f-93efc36f40ac</entry>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-40098f2d-bf66-4831-bd6c-f4f5c78b699e">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <target dev="vdc" bus="virtio"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <serial>40098f2d-bf66-4831-bd6c-f4f5c78b699e</serial>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:59:9f:8d"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <target dev="tap19a4f194-55"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/console.log" append="off"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:26:21 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:26:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:26:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:26:21 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.955 221324 DEBUG nova.compute.manager [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Preparing to wait for external event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.956 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.957 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.957 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.958 221324 DEBUG nova.virt.libvirt.vif [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-657799937',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-657799937',id=119,image_ref='81322d27-2584-47ad-bfcd-7642da2c770e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1582363486',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:24:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-mev9f22v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',i
mage_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member',shelved_at='2026-01-31T08:25:35.963133',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='81322d27-2584-47ad-bfcd-7642da2c770e'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:26:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=f3b36b5b-968c-4775-ac4f-93efc36f40ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.958 221324 DEBUG nova.network.os_vif_util [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.959 221324 DEBUG nova.network.os_vif_util [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.959 221324 DEBUG os_vif [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.960 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.961 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.961 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.965 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.965 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19a4f194-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.966 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap19a4f194-55, col_values=(('external_ids', {'iface-id': '19a4f194-5514-4b8e-b635-e6fe0255dc1a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:9f:8d', 'vm-uuid': 'f3b36b5b-968c-4775-ac4f-93efc36f40ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:21 np0005603622 NetworkManager[49080]: <info>  [1769847981.9688] manager: (tap19a4f194-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.971 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.976 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:21 np0005603622 nova_compute[221301]: 2026-01-31 08:26:21.978 221324 INFO os_vif [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55')#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.033 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.034 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.034 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.034 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No VIF found with MAC fa:16:3e:59:9f:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.035 221324 INFO nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Using config drive#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.758 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.819 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:22 np0005603622 nova_compute[221301]: 2026-01-31 08:26:22.874 221324 DEBUG nova.objects.instance [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'keypairs' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:23.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:23 np0005603622 podman[267645]: 2026-01-31 08:26:23.185130968 +0000 UTC m=+0.086564788 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Jan 31 03:26:23 np0005603622 podman[267644]: 2026-01-31 08:26:23.214951689 +0000 UTC m=+0.117003656 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.231 221324 INFO nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Creating config drive at /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config#033[00m
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.237 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_1agbcyb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.362 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_1agbcyb" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.408 221324 DEBUG nova.storage.rbd_utils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.412 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.439 221324 DEBUG nova.network.neutron [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updated VIF entry in instance network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.441 221324 DEBUG nova.network.neutron [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:23.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:23 np0005603622 nova_compute[221301]: 2026-01-31 08:26:23.465 221324 DEBUG oslo_concurrency.lockutils [req-438da7ef-6099-42d2-b71f-33360b737d0d req-e5a33176-461b-42ef-8bfe-a15a68b73db8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e280 e280: 3 total, 3 up, 3 in
Jan 31 03:26:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:25.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:25.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e281 e281: 3 total, 3 up, 3 in
Jan 31 03:26:26 np0005603622 nova_compute[221301]: 2026-01-31 08:26:26.802 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:26 np0005603622 nova_compute[221301]: 2026-01-31 08:26:26.968 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:27.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:27.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:29.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:29.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:30.880 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:30.880 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:30.881 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:31.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:31.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:31 np0005603622 nova_compute[221301]: 2026-01-31 08:26:31.804 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:31 np0005603622 nova_compute[221301]: 2026-01-31 08:26:31.970 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:33.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:33.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.252 221324 DEBUG nova.objects.instance [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'trusted_certs' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.380 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.381 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Ensure instance console log exists: /var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.382 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.383 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.384 221324 DEBUG oslo_concurrency.lockutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.388 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Start _get_guest_xml network_info=[{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:f9:0c:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.395 221324 WARNING nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.406 221324 DEBUG nova.virt.libvirt.host [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.406 221324 DEBUG nova.virt.libvirt.host [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.409 221324 DEBUG nova.virt.libvirt.host [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.409 221324 DEBUG nova.virt.libvirt.host [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.410 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.410 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f75c4aee-d826-4343-a7e3-f06a4b21de52',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.411 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.411 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.411 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.411 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.411 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.411 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.412 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.412 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.412 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.412 221324 DEBUG nova.virt.hardware [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.412 221324 DEBUG nova.objects.instance [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:34 np0005603622 nova_compute[221301]: 2026-01-31 08:26:34.433 221324 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:26:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/226911233' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:26:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:35.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.192 221324 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.759s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.240 221324 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:35.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:26:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/250211266' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.673 221324 DEBUG oslo_concurrency.processutils [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.675 221324 DEBUG nova.virt.libvirt.vif [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:25:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:26:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:f9:0c:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.675 221324 DEBUG nova.network.os_vif_util [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:f9:0c:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.676 221324 DEBUG nova.network.os_vif_util [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.679 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <uuid>08dc30a4-60ae-4f4e-8b5e-e12610df2120</uuid>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <name>instance-00000078</name>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <memory>196608</memory>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestJSON-server-1941034472</nova:name>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:26:34</nova:creationTime>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.micro">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:memory>192</nova:memory>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <nova:port uuid="d2ecd824-47f7-4503-a326-2007f56a02e7">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <entry name="serial">08dc30a4-60ae-4f4e-8b5e-e12610df2120</entry>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <entry name="uuid">08dc30a4-60ae-4f4e-8b5e-e12610df2120</entry>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/08dc30a4-60ae-4f4e-8b5e-e12610df2120_disk.config">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:f9:0c:a9"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <target dev="tapd2ecd824-47"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/08dc30a4-60ae-4f4e-8b5e-e12610df2120/console.log" append="off"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:26:35 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:26:35 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:26:35 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:26:35 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.680 221324 DEBUG nova.virt.libvirt.vif [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:25:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:26:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:f9:0c:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.681 221324 DEBUG nova.network.os_vif_util [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-924085117-network", "vif_mac": "fa:16:3e:f9:0c:a9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.681 221324 DEBUG nova.network.os_vif_util [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.681 221324 DEBUG os_vif [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.682 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.683 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.683 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.687 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.688 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2ecd824-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.689 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd2ecd824-47, col_values=(('external_ids', {'iface-id': 'd2ecd824-47f7-4503-a326-2007f56a02e7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:0c:a9', 'vm-uuid': '08dc30a4-60ae-4f4e-8b5e-e12610df2120'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.692 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:35 np0005603622 NetworkManager[49080]: <info>  [1769847995.6934] manager: (tapd2ecd824-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.705 221324 INFO os_vif [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47')#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.801 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.802 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.802 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No VIF found with MAC fa:16:3e:f9:0c:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.803 221324 INFO nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Using config drive#033[00m
Jan 31 03:26:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:35Z|00537|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:26:35 np0005603622 kernel: tapd2ecd824-47: entered promiscuous mode
Jan 31 03:26:35 np0005603622 NetworkManager[49080]: <info>  [1769847995.8963] manager: (tapd2ecd824-47): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.899 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:35Z|00538|binding|INFO|Claiming lport d2ecd824-47f7-4503-a326-2007f56a02e7 for this chassis.
Jan 31 03:26:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:35Z|00539|binding|INFO|d2ecd824-47f7-4503-a326-2007f56a02e7: Claiming fa:16:3e:f9:0c:a9 10.100.0.5
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.917 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:35 np0005603622 systemd-udevd[267859]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.929 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:35 np0005603622 NetworkManager[49080]: <info>  [1769847995.9315] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/250)
Jan 31 03:26:35 np0005603622 NetworkManager[49080]: <info>  [1769847995.9324] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Jan 31 03:26:35 np0005603622 systemd-machined[190406]: New machine qemu-54-instance-00000078.
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.940 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.943 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.945 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:26:35 np0005603622 NetworkManager[49080]: <info>  [1769847995.9490] device (tapd2ecd824-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:26:35 np0005603622 NetworkManager[49080]: <info>  [1769847995.9498] device (tapd2ecd824-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.955 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a2e0f33a-1453-40a8-8b8f-c1df3e5ce5b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.956 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.959 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.959 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6fcbbe7b-8a7c-41c5-973c-386d46f896a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.960 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d337e517-5b00-4cbf-b727-73b02a5b1123]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.975 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[b681dd5e-ef97-4caa-9309-4a360c2b0bbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:35 np0005603622 systemd[1]: Started Virtual Machine qemu-54-instance-00000078.
Jan 31 03:26:35 np0005603622 nova_compute[221301]: 2026-01-31 08:26:35.996 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:35.999 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ffd8b1-99ac-4e1c-91c4-7f5ba4c87e48]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.016 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:36Z|00540|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 ovn-installed in OVS
Jan 31 03:26:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:36Z|00541|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 up in Southbound
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.019 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.029 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1d0850-59ad-4223-8fda-0cb97191ccba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 systemd-udevd[267864]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:26:36 np0005603622 NetworkManager[49080]: <info>  [1769847996.0366] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/252)
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.035 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[279f27ec-1404-49a2-9d69-1ee7a4bcc5a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.074 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1efa5377-2021-4baf-8106-ed505920370d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.077 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[72c4a8d1-4fcd-4193-b91c-b8f4eaef9f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 NetworkManager[49080]: <info>  [1769847996.0988] device (tap1186b71b-00): carrier: link connected
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.103 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8640ad7b-99df-4701-97a1-864d94517b30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.117 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[448b2a97-ba70-4de8-964c-42e58c7e013b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724743, 'reachable_time': 17160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267899, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.133 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a9950f84-4a9a-49bc-9d8c-5663b9022be9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 724743, 'tstamp': 724743}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267900, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.143 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[02890ef2-a905-4ec7-806a-d656cd9d8702]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724743, 'reachable_time': 17160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267901, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.166 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d145cfcb-6370-440b-84f4-c63dbeabad0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.208 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[32ee4e5a-16ec-4fb7-8cdc-be1601ffe3cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.210 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.210 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.211 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:36 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.212 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 NetworkManager[49080]: <info>  [1769847996.2139] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.231 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:36Z|00542|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.240 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.240 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.242 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.243 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f09d78d8-18e2-4dac-98d2-878f12280bfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.244 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:26:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:36.245 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.356 221324 DEBUG nova.compute.manager [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.357 221324 DEBUG oslo_concurrency.lockutils [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.357 221324 DEBUG oslo_concurrency.lockutils [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.358 221324 DEBUG oslo_concurrency.lockutils [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.358 221324 DEBUG nova.compute.manager [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.359 221324 WARNING nova.compute.manager [req-b38176f4-b40f-405d-a6fe-06db7920afff req-0cfc6673-be2b-42be-b9d3-1e33803659b9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.516 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847996.5160475, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.517 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.520 221324 DEBUG nova.compute.manager [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.525 221324 INFO nova.virt.libvirt.driver [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance running successfully.#033[00m
Jan 31 03:26:36 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.529 221324 DEBUG nova.virt.libvirt.guest [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.529 221324 DEBUG nova.virt.libvirt.driver [None req-c39494f4-d515-4743-8089-b3ad97705959 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.625 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.629 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:26:36 np0005603622 podman[267980]: 2026-01-31 08:26:36.649780098 +0000 UTC m=+0.082137779 container create 2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.686 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.686 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769847996.5163867, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.686 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Started (Lifecycle Event)#033[00m
Jan 31 03:26:36 np0005603622 podman[267980]: 2026-01-31 08:26:36.598556671 +0000 UTC m=+0.030914332 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:26:36 np0005603622 systemd[1]: Started libpod-conmon-2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e.scope.
Jan 31 03:26:36 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:26:36 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebda3e61a95c9cc530c003e9231dd3a57e270e720664df3fbb01ecffb5b62a88/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:26:36 np0005603622 podman[267980]: 2026-01-31 08:26:36.756742574 +0000 UTC m=+0.189100225 container init 2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:26:36 np0005603622 podman[267980]: 2026-01-31 08:26:36.760350211 +0000 UTC m=+0.192707852 container start 2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.770 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.772 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:26:36 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[267995]: [NOTICE]   (267999) : New worker (268001) forked
Jan 31 03:26:36 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[267995]: [NOTICE]   (267999) : Loading success.
Jan 31 03:26:36 np0005603622 nova_compute[221301]: 2026-01-31 08:26:36.804 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e282 e282: 3 total, 3 up, 3 in
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.012 221324 DEBUG oslo_concurrency.processutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config f3b36b5b-968c-4775-ac4f-93efc36f40ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 13.600s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.013 221324 INFO nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Deleting local config drive /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac/disk.config because it was imported into RBD.#033[00m
Jan 31 03:26:37 np0005603622 kernel: tap19a4f194-55: entered promiscuous mode
Jan 31 03:26:37 np0005603622 NetworkManager[49080]: <info>  [1769847997.0829] manager: (tap19a4f194-55): new Tun device (/org/freedesktop/NetworkManager/Devices/254)
Jan 31 03:26:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:37Z|00543|binding|INFO|Claiming lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a for this chassis.
Jan 31 03:26:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:37Z|00544|binding|INFO|19a4f194-5514-4b8e-b635-e6fe0255dc1a: Claiming fa:16:3e:59:9f:8d 10.100.0.6
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.085 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:37 np0005603622 systemd-udevd[267880]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:26:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:37.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:37Z|00545|binding|INFO|Setting lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a ovn-installed in OVS
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.101 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:37 np0005603622 NetworkManager[49080]: <info>  [1769847997.1093] device (tap19a4f194-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.108 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:37 np0005603622 NetworkManager[49080]: <info>  [1769847997.1098] device (tap19a4f194-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:26:37 np0005603622 systemd-machined[190406]: New machine qemu-55-instance-00000077.
Jan 31 03:26:37 np0005603622 systemd[1]: Started Virtual Machine qemu-55-instance-00000077.
Jan 31 03:26:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:37Z|00546|binding|INFO|Setting lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a up in Southbound
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.340 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:9f:8d 10.100.0.6'], port_security=['fa:16:3e:59:9f:8d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f3b36b5b-968c-4775-ac4f-93efc36f40ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'fbd8fca4-628f-4f27-9bbb-a2cbce3d02c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.250'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=19a4f194-5514-4b8e-b635-e6fe0255dc1a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.341 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 19a4f194-5514-4b8e-b635-e6fe0255dc1a in datapath 6525247d-48b2-4359-a813-d7276403ba32 bound to our chassis
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.343 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.350 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[960fc283-6442-4f3c-af8d-c6dd0462b559]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.351 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6525247d-41 in ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.353 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6525247d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.353 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[25536060-1318-4aee-be84-08aa9700b17e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.353 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bee111b4-398a-45fc-a2ea-0e8ebd1d3034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.363 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[0d682be5-f7aa-4086-84f7-82708e8173f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.374 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef35283-5f50-4368-a407-1133967aa5d0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.398 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a97b9d35-34a0-4421-938e-192d82cea017]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.404 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a7a8e3-077b-4b25-86eb-8a0243e940f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 NetworkManager[49080]: <info>  [1769847997.4051] manager: (tap6525247d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/255)
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.429 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[04b1aa0a-4653-42e3-82a5-78da4bbd9de1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.433 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[225f9923-ef22-4c04-86e1-8e7c779bee8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 NetworkManager[49080]: <info>  [1769847997.4491] device (tap6525247d-40): carrier: link connected
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.455 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2282c09d-4ea4-4358-8647-172c8129d6a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:37.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.468 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2b8f982e-7387-4b7f-a831-1b67d9110f48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724878, 'reachable_time': 29380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268040, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.482 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[13a092ee-0314-4224-abdc-f79a89d5b7d1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:c843'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 724878, 'tstamp': 724878}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 268041, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.497 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[978fc084-f152-486a-969b-2bb44f0b98c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724878, 'reachable_time': 29380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 268042, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.526 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7137ef6d-8f99-47ce-9b8b-7614d7405d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.581 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74b18085-87bb-4cfd-a4ab-ba5a070a7006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.583 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.584 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.584 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6525247d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.587 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:26:37 np0005603622 NetworkManager[49080]: <info>  [1769847997.5880] manager: (tap6525247d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Jan 31 03:26:37 np0005603622 kernel: tap6525247d-40: entered promiscuous mode
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.590 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6525247d-40, col_values=(('external_ids', {'iface-id': '044f1919-2550-4bba-9baa-5d3f39f69ec6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.592 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:26:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:37Z|00547|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:26:37 np0005603622 nova_compute[221301]: 2026-01-31 08:26:37.602 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.605 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.607 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b42a4255-7a6a-4d5c-bc09-39e597c23366]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.608 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 03:26:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:37.612 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'env', 'PROCESS_TAG=haproxy-6525247d-48b2-4359-a813-d7276403ba32', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6525247d-48b2-4359-a813-d7276403ba32.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 03:26:37 np0005603622 podman[268074]: 2026-01-31 08:26:37.969681638 +0000 UTC m=+0.041161288 container create ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:26:38 np0005603622 systemd[1]: Started libpod-conmon-ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609.scope.
Jan 31 03:26:38 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:26:38 np0005603622 podman[268074]: 2026-01-31 08:26:37.948977821 +0000 UTC m=+0.020457501 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:26:38 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43240a931b3c0737736b9b873e6843037ba240aa0cd7a2a213c256af516fe9cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:26:38 np0005603622 podman[268074]: 2026-01-31 08:26:38.075867622 +0000 UTC m=+0.147347292 container init ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:26:38 np0005603622 podman[268074]: 2026-01-31 08:26:38.081804902 +0000 UTC m=+0.153284582 container start ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Jan 31 03:26:38 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[268089]: [NOTICE]   (268093) : New worker (268095) forked
Jan 31 03:26:38 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[268089]: [NOTICE]   (268093) : Loading success.
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.501 221324 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.502 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.502 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.503 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.503 221324 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.504 221324 WARNING nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state resized and task_state None.
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.504 221324 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.505 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.505 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.505 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.506 221324 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Processing event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.506 221324 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.507 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.507 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.508 221324 DEBUG oslo_concurrency.lockutils [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.508 221324 DEBUG nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] No waiting events found dispatching network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:26:38 np0005603622 nova_compute[221301]: 2026-01-31 08:26:38.508 221324 WARNING nova.compute.manager [req-c873bec3-44c6-4020-9ba7-726836f5d0f9 req-efcc2870-1523-4d3b-8ceb-607dde0da18e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received unexpected event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a for instance with vm_state shelved_offloaded and task_state spawning.
Jan 31 03:26:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:39.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e283 e283: 3 total, 3 up, 3 in
Jan 31 03:26:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:39.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:40 np0005603622 nova_compute[221301]: 2026-01-31 08:26:40.694 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:40 np0005603622 nova_compute[221301]: 2026-01-31 08:26:40.786 221324 DEBUG nova.compute.manager [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:26:40 np0005603622 nova_compute[221301]: 2026-01-31 08:26:40.787 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848000.786078, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:40 np0005603622 nova_compute[221301]: 2026-01-31 08:26:40.788 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Started (Lifecycle Event)#033[00m
Jan 31 03:26:40 np0005603622 nova_compute[221301]: 2026-01-31 08:26:40.791 221324 DEBUG nova.virt.libvirt.driver [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:26:40 np0005603622 nova_compute[221301]: 2026-01-31 08:26:40.795 221324 INFO nova.virt.libvirt.driver [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance spawned successfully.#033[00m
Jan 31 03:26:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:41.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.376 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.380 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.418 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.419 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.455 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.456 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848000.7863843, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.457 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:26:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:41.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.556 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.563 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848000.790075, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.564 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.808 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.832 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.837 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:26:41 np0005603622 nova_compute[221301]: 2026-01-31 08:26:41.897 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:26:42 np0005603622 nova_compute[221301]: 2026-01-31 08:26:42.377 221324 DEBUG nova.network.neutron [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Port d2ecd824-47f7-4503-a326-2007f56a02e7 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 31 03:26:42 np0005603622 nova_compute[221301]: 2026-01-31 08:26:42.379 221324 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:42 np0005603622 nova_compute[221301]: 2026-01-31 08:26:42.379 221324 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:42 np0005603622 nova_compute[221301]: 2026-01-31 08:26:42.380 221324 DEBUG nova.network.neutron [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:26:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:43.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:43 np0005603622 nova_compute[221301]: 2026-01-31 08:26:43.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:43 np0005603622 nova_compute[221301]: 2026-01-31 08:26:43.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:26:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:43.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:43 np0005603622 nova_compute[221301]: 2026-01-31 08:26:43.761 221324 DEBUG nova.network.neutron [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:43 np0005603622 nova_compute[221301]: 2026-01-31 08:26:43.843 221324 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:43 np0005603622 kernel: tapd2ecd824-47 (unregistering): left promiscuous mode
Jan 31 03:26:43 np0005603622 NetworkManager[49080]: <info>  [1769848003.9578] device (tapd2ecd824-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:26:43 np0005603622 nova_compute[221301]: 2026-01-31 08:26:43.965 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:43Z|00548|binding|INFO|Releasing lport d2ecd824-47f7-4503-a326-2007f56a02e7 from this chassis (sb_readonly=0)
Jan 31 03:26:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:43Z|00549|binding|INFO|Setting lport d2ecd824-47f7-4503-a326-2007f56a02e7 down in Southbound
Jan 31 03:26:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:43Z|00550|binding|INFO|Removing iface tapd2ecd824-47 ovn-installed in OVS
Jan 31 03:26:43 np0005603622 nova_compute[221301]: 2026-01-31 08:26:43.970 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:43 np0005603622 nova_compute[221301]: 2026-01-31 08:26:43.975 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:43.992 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:0c:a9 10.100.0.5'], port_security=['fa:16:3e:f9:0c:a9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08dc30a4-60ae-4f4e-8b5e-e12610df2120', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=d2ecd824-47f7-4503-a326-2007f56a02e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:43.994 139793 INFO neutron.agent.ovn.metadata.agent [-] Port d2ecd824-47f7-4503-a326-2007f56a02e7 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:26:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:43.995 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:43.998 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb94826-8e40-4243-bd28-b47b1969a809]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.000 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:26:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:26:44 np0005603622 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 31 03:26:44 np0005603622 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000078.scope: Consumed 8.019s CPU time.
Jan 31 03:26:44 np0005603622 systemd-machined[190406]: Machine qemu-54-instance-00000078 terminated.
Jan 31 03:26:44 np0005603622 NetworkManager[49080]: <info>  [1769848004.1217] manager: (tapd2ecd824-47): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.126 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.132 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[267995]: [NOTICE]   (267999) : haproxy version is 2.8.14-c23fe91
Jan 31 03:26:44 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[267995]: [NOTICE]   (267999) : path to executable is /usr/sbin/haproxy
Jan 31 03:26:44 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[267995]: [WARNING]  (267999) : Exiting Master process...
Jan 31 03:26:44 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[267995]: [ALERT]    (267999) : Current worker (268001) exited with code 143 (Terminated)
Jan 31 03:26:44 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[267995]: [WARNING]  (267999) : All workers exited. Exiting... (0)
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.147 221324 INFO nova.virt.libvirt.driver [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Instance destroyed successfully.#033[00m
Jan 31 03:26:44 np0005603622 systemd[1]: libpod-2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e.scope: Deactivated successfully.
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.148 221324 DEBUG nova.objects.instance [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:44 np0005603622 podman[268320]: 2026-01-31 08:26:44.156078979 +0000 UTC m=+0.069570701 container died 2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.212 221324 DEBUG nova.virt.libvirt.vif [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1941034472',display_name='tempest-ServerActionsTestJSON-server-1941034472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1941034472',id=120,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:26:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-ra3xi6gb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:26:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=08dc30a4-60ae-4f4e-8b5e-e12610df2120,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.214 221324 DEBUG nova.network.os_vif_util [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.215 221324 DEBUG nova.network.os_vif_util [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.215 221324 DEBUG os_vif [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.219 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.219 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2ecd824-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.221 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.223 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.227 221324 INFO os_vif [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0c:a9,bridge_name='br-int',has_traffic_filtering=True,id=d2ecd824-47f7-4503-a326-2007f56a02e7,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd2ecd824-47')#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.234 221324 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.234 221324 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:44 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e-userdata-shm.mount: Deactivated successfully.
Jan 31 03:26:44 np0005603622 systemd[1]: var-lib-containers-storage-overlay-ebda3e61a95c9cc530c003e9231dd3a57e270e720664df3fbb01ecffb5b62a88-merged.mount: Deactivated successfully.
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.327 221324 DEBUG nova.objects.instance [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'migration_context' on Instance uuid 08dc30a4-60ae-4f4e-8b5e-e12610df2120 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:44 np0005603622 podman[268320]: 2026-01-31 08:26:44.354819971 +0000 UTC m=+0.268311693 container cleanup 2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:26:44 np0005603622 systemd[1]: libpod-conmon-2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e.scope: Deactivated successfully.
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.424 221324 DEBUG nova.compute.manager [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.424 221324 DEBUG oslo_concurrency.lockutils [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.425 221324 DEBUG oslo_concurrency.lockutils [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.425 221324 DEBUG oslo_concurrency.lockutils [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.425 221324 DEBUG nova.compute.manager [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.425 221324 WARNING nova.compute.manager [req-f2bc58e9-e42b-4fff-badf-69d2ca1fb38b req-9fb23e5a-400c-429f-b758-3d2e33b65386 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-unplugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:26:44 np0005603622 podman[268353]: 2026-01-31 08:26:44.442298443 +0000 UTC m=+0.062574293 container remove 2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.447 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[66224790-5931-44be-a9fa-affc7b2159b9]: (4, ('Sat Jan 31 08:26:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e)\n2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e\nSat Jan 31 08:26:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e)\n2e427b96cae4d8bff98e045c7b23d379fb4831caa1d65369a7214090145d636e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.448 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3cbcc500-fd00-4d5a-837a-b521bd956724]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.449 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.451 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.456 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.463 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c28fe7-cabf-449f-bc8b-6337da4ebc2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.477 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1760afed-a99d-4b2b-82c6-7682616eb16a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.479 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5be6e9ed-57fc-4f66-b22f-1770b1d23e78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.491 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3f5e0326-69a9-47bc-a743-2837722b0282]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724736, 'reachable_time': 32670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268368, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.495 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:26:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:44.495 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[47917516-38eb-4677-b85e-6233851c0434]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:26:44 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:26:44 np0005603622 nova_compute[221301]: 2026-01-31 08:26:44.555 221324 DEBUG oslo_concurrency.processutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2111460743' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:45 np0005603622 nova_compute[221301]: 2026-01-31 08:26:45.070 221324 DEBUG oslo_concurrency.processutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:45 np0005603622 nova_compute[221301]: 2026-01-31 08:26:45.076 221324 DEBUG nova.compute.provider_tree [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:26:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:45.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:45 np0005603622 nova_compute[221301]: 2026-01-31 08:26:45.180 221324 DEBUG nova.scheduler.client.report [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:26:45 np0005603622 nova_compute[221301]: 2026-01-31 08:26:45.271 221324 DEBUG oslo_concurrency.lockutils [None req-abf3ad4d-f325-44cc-9576-46801de7e720 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 1.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:45 np0005603622 nova_compute[221301]: 2026-01-31 08:26:45.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:45.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:26:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.811 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.836 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.837 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.837 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.837 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:26:46 np0005603622 nova_compute[221301]: 2026-01-31 08:26:46.989 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:46.989 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:26:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:46.991 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:26:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:47.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:47 np0005603622 nova_compute[221301]: 2026-01-31 08:26:47.176 221324 DEBUG nova.compute.manager [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:47 np0005603622 nova_compute[221301]: 2026-01-31 08:26:47.176 221324 DEBUG oslo_concurrency.lockutils [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:47 np0005603622 nova_compute[221301]: 2026-01-31 08:26:47.177 221324 DEBUG oslo_concurrency.lockutils [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:47 np0005603622 nova_compute[221301]: 2026-01-31 08:26:47.177 221324 DEBUG oslo_concurrency.lockutils [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:47 np0005603622 nova_compute[221301]: 2026-01-31 08:26:47.177 221324 DEBUG nova.compute.manager [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:26:47 np0005603622 nova_compute[221301]: 2026-01-31 08:26:47.177 221324 WARNING nova.compute.manager [req-184301e4-8407-45fc-bd5e-2f149c9a5d06 req-6ffb6de0-b7be-4bf9-a111-64b62b3326cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:26:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:47.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e284 e284: 3 total, 3 up, 3 in
Jan 31 03:26:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:49.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:49 np0005603622 nova_compute[221301]: 2026-01-31 08:26:49.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:49 np0005603622 nova_compute[221301]: 2026-01-31 08:26:49.290 221324 DEBUG nova.compute.manager [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:26:49 np0005603622 nova_compute[221301]: 2026-01-31 08:26:49.292 221324 DEBUG nova.compute.manager [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing instance network info cache due to event network-changed-d2ecd824-47f7-4503-a326-2007f56a02e7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:26:49 np0005603622 nova_compute[221301]: 2026-01-31 08:26:49.292 221324 DEBUG oslo_concurrency.lockutils [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:26:49 np0005603622 nova_compute[221301]: 2026-01-31 08:26:49.292 221324 DEBUG oslo_concurrency.lockutils [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:26:49 np0005603622 nova_compute[221301]: 2026-01-31 08:26:49.292 221324 DEBUG nova.network.neutron [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Refreshing network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:26:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:49.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e285 e285: 3 total, 3 up, 3 in
Jan 31 03:26:50 np0005603622 nova_compute[221301]: 2026-01-31 08:26:50.837 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [{"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:51.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.296 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-f3b36b5b-968c-4775-ac4f-93efc36f40ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.297 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.298 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.299 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.299 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.299 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.479 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.479 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.480 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.480 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.481 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:51.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.812 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2629107068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:51 np0005603622 nova_compute[221301]: 2026-01-31 08:26:51.897 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.049 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.050 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.050 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.234 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.236 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4296MB free_disk=20.760215759277344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.236 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.237 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.321 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance f3b36b5b-968c-4775-ac4f-93efc36f40ac actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.321 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.322 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.360 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.425 221324 DEBUG nova.network.neutron [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updated VIF entry in instance network info cache for port d2ecd824-47f7-4503-a326-2007f56a02e7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.426 221324 DEBUG nova.network.neutron [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Updating instance_info_cache with network_info: [{"id": "d2ecd824-47f7-4503-a326-2007f56a02e7", "address": "fa:16:3e:f9:0c:a9", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd2ecd824-47", "ovs_interfaceid": "d2ecd824-47f7-4503-a326-2007f56a02e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.518 221324 DEBUG oslo_concurrency.lockutils [req-a55c3795-2c90-4bd6-a7e4-a07936b1ae94 req-44354c2e-2b3b-4f93-bd5e-1a2a2b4f6298 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-08dc30a4-60ae-4f4e-8b5e-e12610df2120" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:26:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e286 e286: 3 total, 3 up, 3 in
Jan 31 03:26:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:26:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4002589368' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.853 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.859 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:26:52 np0005603622 nova_compute[221301]: 2026-01-31 08:26:52.891 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:26:53 np0005603622 nova_compute[221301]: 2026-01-31 08:26:53.088 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:26:53 np0005603622 nova_compute[221301]: 2026-01-31 08:26:53.089 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:26:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:53.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:53.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:26:53.994 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:26:54 np0005603622 podman[268437]: 2026-01-31 08:26:54.142793843 +0000 UTC m=+0.060982950 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:26:54 np0005603622 podman[268436]: 2026-01-31 08:26:54.179594413 +0000 UTC m=+0.096126825 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:26:54 np0005603622 nova_compute[221301]: 2026-01-31 08:26:54.264 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:26:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:55.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:55.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:56 np0005603622 nova_compute[221301]: 2026-01-31 08:26:56.858 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:26:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:57.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:26:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:57.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:26:58Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:9f:8d 10.100.0.6
Jan 31 03:26:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:26:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:26:59.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:26:59 np0005603622 nova_compute[221301]: 2026-01-31 08:26:59.142 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848004.1410217, 08dc30a4-60ae-4f4e-8b5e-e12610df2120 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:26:59 np0005603622 nova_compute[221301]: 2026-01-31 08:26:59.143 221324 INFO nova.compute.manager [-] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:26:59 np0005603622 nova_compute[221301]: 2026-01-31 08:26:59.267 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:26:59 np0005603622 nova_compute[221301]: 2026-01-31 08:26:59.308 221324 DEBUG nova.compute.manager [None req-3560c31a-3ea4-41e4-8b62-f50690a30a3e - - - - - -] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:26:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:26:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:26:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:26:59.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:26:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:00 np0005603622 nova_compute[221301]: 2026-01-31 08:27:00.086 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e287 e287: 3 total, 3 up, 3 in
Jan 31 03:27:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:01.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:01.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:01 np0005603622 nova_compute[221301]: 2026-01-31 08:27:01.883 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:03.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:03.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:04 np0005603622 nova_compute[221301]: 2026-01-31 08:27:04.270 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e288 e288: 3 total, 3 up, 3 in
Jan 31 03:27:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e288 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:05.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:05.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:05 np0005603622 nova_compute[221301]: 2026-01-31 08:27:05.892 221324 DEBUG nova.compute.manager [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:06 np0005603622 nova_compute[221301]: 2026-01-31 08:27:06.146 221324 DEBUG oslo_concurrency.lockutils [None req-f99ff621-02eb-4763-87ff-e9fa11586eaf 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 66.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:06 np0005603622 nova_compute[221301]: 2026-01-31 08:27:06.900 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:07.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:07.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:27:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:27:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:27:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:09.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.272 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.321 221324 DEBUG oslo_concurrency.lockutils [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.322 221324 DEBUG oslo_concurrency.lockutils [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.346 221324 INFO nova.compute.manager [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Detaching volume 40098f2d-bf66-4831-bd6c-f4f5c78b699e#033[00m
Jan 31 03:27:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:09.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.609 221324 INFO nova.virt.block_device [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Attempting to driver detach volume 40098f2d-bf66-4831-bd6c-f4f5c78b699e from mountpoint /dev/vdc#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.622 221324 DEBUG nova.virt.libvirt.driver [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Attempting to detach device vdc from instance f3b36b5b-968c-4775-ac4f-93efc36f40ac from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.622 221324 DEBUG nova.virt.libvirt.guest [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-40098f2d-bf66-4831-bd6c-f4f5c78b699e">
Jan 31 03:27:09 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <serial>40098f2d-bf66-4831-bd6c-f4f5c78b699e</serial>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:27:09 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.748 221324 INFO nova.virt.libvirt.driver [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully detached device vdc from instance f3b36b5b-968c-4775-ac4f-93efc36f40ac from the persistent domain config.#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.749 221324 DEBUG nova.virt.libvirt.driver [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance f3b36b5b-968c-4775-ac4f-93efc36f40ac from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.750 221324 DEBUG nova.virt.libvirt.guest [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-40098f2d-bf66-4831-bd6c-f4f5c78b699e">
Jan 31 03:27:09 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <serial>40098f2d-bf66-4831-bd6c-f4f5c78b699e</serial>
Jan 31 03:27:09 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:27:09 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:27:09 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.841 221324 DEBUG nova.compute.manager [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.842 221324 DEBUG oslo_concurrency.lockutils [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.843 221324 DEBUG oslo_concurrency.lockutils [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.843 221324 DEBUG oslo_concurrency.lockutils [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "08dc30a4-60ae-4f4e-8b5e-e12610df2120-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.843 221324 DEBUG nova.compute.manager [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] No waiting events found dispatching network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:27:09 np0005603622 nova_compute[221301]: 2026-01-31 08:27:09.843 221324 WARNING nova.compute.manager [req-ab0c0d25-e783-41da-ae77-389e49f2b5a3 req-e849a7a0-e479-4ea5-a8fa-60784d0a660b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 08dc30a4-60ae-4f4e-8b5e-e12610df2120] Received unexpected event network-vif-plugged-d2ecd824-47f7-4503-a326-2007f56a02e7 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:27:10 np0005603622 nova_compute[221301]: 2026-01-31 08:27:10.022 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769848030.0216718, f3b36b5b-968c-4775-ac4f-93efc36f40ac => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:27:10 np0005603622 nova_compute[221301]: 2026-01-31 08:27:10.025 221324 DEBUG nova.virt.libvirt.driver [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance f3b36b5b-968c-4775-ac4f-93efc36f40ac _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:27:10 np0005603622 nova_compute[221301]: 2026-01-31 08:27:10.028 221324 INFO nova.virt.libvirt.driver [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully detached device vdc from instance f3b36b5b-968c-4775-ac4f-93efc36f40ac from the live domain config.#033[00m
Jan 31 03:27:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e288 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:10 np0005603622 nova_compute[221301]: 2026-01-31 08:27:10.760 221324 DEBUG nova.objects.instance [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'flavor' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:27:10 np0005603622 nova_compute[221301]: 2026-01-31 08:27:10.854 221324 DEBUG oslo_concurrency.lockutils [None req-9a42ede7-f86f-45be-839f-4d3a7683b68b 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:11.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:11.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e289 e289: 3 total, 3 up, 3 in
Jan 31 03:27:11 np0005603622 nova_compute[221301]: 2026-01-31 08:27:11.950 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:12 np0005603622 nova_compute[221301]: 2026-01-31 08:27:12.971 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:12 np0005603622 nova_compute[221301]: 2026-01-31 08:27:12.972 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:12 np0005603622 nova_compute[221301]: 2026-01-31 08:27:12.972 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:12 np0005603622 nova_compute[221301]: 2026-01-31 08:27:12.973 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:12 np0005603622 nova_compute[221301]: 2026-01-31 08:27:12.973 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:12 np0005603622 nova_compute[221301]: 2026-01-31 08:27:12.975 221324 INFO nova.compute.manager [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Terminating instance#033[00m
Jan 31 03:27:12 np0005603622 nova_compute[221301]: 2026-01-31 08:27:12.977 221324 DEBUG nova.compute.manager [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:27:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:13.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:13.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:13 np0005603622 kernel: tap19a4f194-55 (unregistering): left promiscuous mode
Jan 31 03:27:13 np0005603622 NetworkManager[49080]: <info>  [1769848033.6495] device (tap19a4f194-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.649 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:13Z|00551|binding|INFO|Releasing lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a from this chassis (sb_readonly=0)
Jan 31 03:27:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:13Z|00552|binding|INFO|Setting lport 19a4f194-5514-4b8e-b635-e6fe0255dc1a down in Southbound
Jan 31 03:27:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:13Z|00553|binding|INFO|Removing iface tap19a4f194-55 ovn-installed in OVS
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.659 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.666 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:13 np0005603622 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000077.scope: Deactivated successfully.
Jan 31 03:27:13 np0005603622 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000077.scope: Consumed 15.722s CPU time.
Jan 31 03:27:13 np0005603622 systemd-machined[190406]: Machine qemu-55-instance-00000077 terminated.
Jan 31 03:27:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:13.766 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:9f:8d 10.100.0.6'], port_security=['fa:16:3e:59:9f:8d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f3b36b5b-968c-4775-ac4f-93efc36f40ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'fbd8fca4-628f-4f27-9bbb-a2cbce3d02c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.250', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=19a4f194-5514-4b8e-b635-e6fe0255dc1a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:27:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:13.768 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 19a4f194-5514-4b8e-b635-e6fe0255dc1a in datapath 6525247d-48b2-4359-a813-d7276403ba32 unbound from our chassis#033[00m
Jan 31 03:27:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:13.771 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6525247d-48b2-4359-a813-d7276403ba32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:27:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:13.773 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a70eaa40-0681-44f5-8f85-1277d669529b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:13.774 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace which is not needed anymore#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.829 221324 INFO nova.virt.libvirt.driver [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Instance destroyed successfully.#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.830 221324 DEBUG nova.objects.instance [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'resources' on Instance uuid f3b36b5b-968c-4775-ac4f-93efc36f40ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:27:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e290 e290: 3 total, 3 up, 3 in
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.904 221324 DEBUG nova.virt.libvirt.vif [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-657799937',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-657799937',id=119,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuPmLyvW8nH0zjMVVycFJqThAJ40QxmOiqbjQtqD9yxLSlZxgvi2M6cEnwp9NZOC4D7auSCKwZopexRDoMXTIOi6B9+vGiJqB0/tIgguNHeMJkz6XGWm9K9JFV8LaAZqQ==',key_name='tempest-keypair-1582363486',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:27:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-mev9f22v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:27:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=f3b36b5b-968c-4775-ac4f-93efc36f40ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.906 221324 DEBUG nova.network.os_vif_util [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "address": "fa:16:3e:59:9f:8d", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a4f194-55", "ovs_interfaceid": "19a4f194-5514-4b8e-b635-e6fe0255dc1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.908 221324 DEBUG nova.network.os_vif_util [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.909 221324 DEBUG os_vif [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.911 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.912 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19a4f194-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.915 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.920 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:27:13 np0005603622 nova_compute[221301]: 2026-01-31 08:27:13.924 221324 INFO os_vif [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:9f:8d,bridge_name='br-int',has_traffic_filtering=True,id=19a4f194-5514-4b8e-b635-e6fe0255dc1a,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a4f194-55')#033[00m
Jan 31 03:27:14 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[268089]: [NOTICE]   (268093) : haproxy version is 2.8.14-c23fe91
Jan 31 03:27:14 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[268089]: [NOTICE]   (268093) : path to executable is /usr/sbin/haproxy
Jan 31 03:27:14 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[268089]: [WARNING]  (268093) : Exiting Master process...
Jan 31 03:27:14 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[268089]: [ALERT]    (268093) : Current worker (268095) exited with code 143 (Terminated)
Jan 31 03:27:14 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[268089]: [WARNING]  (268093) : All workers exited. Exiting... (0)
Jan 31 03:27:14 np0005603622 systemd[1]: libpod-ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609.scope: Deactivated successfully.
Jan 31 03:27:14 np0005603622 podman[268567]: 2026-01-31 08:27:14.020134348 +0000 UTC m=+0.136084259 container died ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:27:14 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609-userdata-shm.mount: Deactivated successfully.
Jan 31 03:27:14 np0005603622 systemd[1]: var-lib-containers-storage-overlay-43240a931b3c0737736b9b873e6843037ba240aa0cd7a2a213c256af516fe9cc-merged.mount: Deactivated successfully.
Jan 31 03:27:14 np0005603622 podman[268567]: 2026-01-31 08:27:14.705635155 +0000 UTC m=+0.821585046 container cleanup ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:27:14 np0005603622 systemd[1]: libpod-conmon-ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609.scope: Deactivated successfully.
Jan 31 03:27:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:15.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:15 np0005603622 podman[268615]: 2026-01-31 08:27:15.395997021 +0000 UTC m=+0.674468690 container remove ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.401 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e9356159-f7db-40b0-8e92-f7a67be75a31]: (4, ('Sat Jan 31 08:27:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609)\nebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609\nSat Jan 31 08:27:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (ebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609)\nebf0b89e25015ce6c697d4845e064b11d5feef3587b193cf4b0acc0ca1c8c609\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.402 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5d17cc2e-177e-4c7c-b149-6a5d5b1d659f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.403 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:15 np0005603622 nova_compute[221301]: 2026-01-31 08:27:15.438 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:15 np0005603622 kernel: tap6525247d-40: left promiscuous mode
Jan 31 03:27:15 np0005603622 nova_compute[221301]: 2026-01-31 08:27:15.448 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.451 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f551a033-44aa-4f7d-9981-7d9d75189e3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.465 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[22d35a73-de03-417a-96be-7d07ffdd1c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.466 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[474a03ac-6fee-49ba-be7d-cae64c9d29c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.485 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbc0373-1278-4280-9ab4-d103651c557e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724873, 'reachable_time': 44299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268631, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.489 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:27:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:15.489 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[5f80767f-c785-4cb0-947b-a119ac7fb191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:15 np0005603622 systemd[1]: run-netns-ovnmeta\x2d6525247d\x2d48b2\x2d4359\x2da813\x2dd7276403ba32.mount: Deactivated successfully.
Jan 31 03:27:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:15.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.305 221324 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-unplugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.306 221324 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.306 221324 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.307 221324 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.307 221324 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] No waiting events found dispatching network-vif-unplugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.308 221324 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-unplugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.308 221324 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.309 221324 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.309 221324 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.310 221324 DEBUG oslo_concurrency.lockutils [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.310 221324 DEBUG nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] No waiting events found dispatching network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.311 221324 WARNING nova.compute.manager [req-e4575046-03ad-4446-878a-a4cb8ec75fec req-8d61a297-772f-4c73-96e1-53ad9a30deb0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received unexpected event network-vif-plugged-19a4f194-5514-4b8e-b635-e6fe0255dc1a for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:27:16 np0005603622 nova_compute[221301]: 2026-01-31 08:27:16.953 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:17.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:17.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e291 e291: 3 total, 3 up, 3 in
Jan 31 03:27:19 np0005603622 nova_compute[221301]: 2026-01-31 08:27:19.049 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:19.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:19.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:21.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:21.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:21 np0005603622 nova_compute[221301]: 2026-01-31 08:27:21.956 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:23.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:23.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e292 e292: 3 total, 3 up, 3 in
Jan 31 03:27:24 np0005603622 nova_compute[221301]: 2026-01-31 08:27:24.097 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:25 np0005603622 podman[268634]: 2026-01-31 08:27:25.179000702 +0000 UTC m=+0.082772906 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:27:25 np0005603622 nova_compute[221301]: 2026-01-31 08:27:25.199 221324 INFO nova.virt.libvirt.driver [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Deleting instance files /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac_del#033[00m
Jan 31 03:27:25 np0005603622 nova_compute[221301]: 2026-01-31 08:27:25.202 221324 INFO nova.virt.libvirt.driver [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Deletion of /var/lib/nova/instances/f3b36b5b-968c-4775-ac4f-93efc36f40ac_del complete#033[00m
Jan 31 03:27:25 np0005603622 podman[268633]: 2026-01-31 08:27:25.276567485 +0000 UTC m=+0.180370589 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:27:25 np0005603622 nova_compute[221301]: 2026-01-31 08:27:25.279 221324 INFO nova.compute.manager [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Took 12.30 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:27:25 np0005603622 nova_compute[221301]: 2026-01-31 08:27:25.280 221324 DEBUG oslo.service.loopingcall [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:27:25 np0005603622 nova_compute[221301]: 2026-01-31 08:27:25.281 221324 DEBUG nova.compute.manager [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:27:25 np0005603622 nova_compute[221301]: 2026-01-31 08:27:25.282 221324 DEBUG nova.network.neutron [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:27:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e293 e293: 3 total, 3 up, 3 in
Jan 31 03:27:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:25.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:26 np0005603622 nova_compute[221301]: 2026-01-31 08:27:26.840 221324 DEBUG nova.network.neutron [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:27:26 np0005603622 nova_compute[221301]: 2026-01-31 08:27:26.953 221324 INFO nova.compute.manager [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Took 1.67 seconds to deallocate network for instance.#033[00m
Jan 31 03:27:26 np0005603622 nova_compute[221301]: 2026-01-31 08:27:26.961 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:26 np0005603622 nova_compute[221301]: 2026-01-31 08:27:26.987 221324 DEBUG nova.compute.manager [req-db08481f-678c-4ace-b1be-5c2e6aa52918 req-9f36d6bb-b259-40c4-9ba7-cf20ab311a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Received event network-vif-deleted-19a4f194-5514-4b8e-b635-e6fe0255dc1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:27:26 np0005603622 nova_compute[221301]: 2026-01-31 08:27:26.988 221324 INFO nova.compute.manager [req-db08481f-678c-4ace-b1be-5c2e6aa52918 req-9f36d6bb-b259-40c4-9ba7-cf20ab311a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Neutron deleted interface 19a4f194-5514-4b8e-b635-e6fe0255dc1a; detaching it from the instance and deleting it from the info cache
Jan 31 03:27:26 np0005603622 nova_compute[221301]: 2026-01-31 08:27:26.988 221324 DEBUG nova.network.neutron [req-db08481f-678c-4ace-b1be-5c2e6aa52918 req-9f36d6bb-b259-40c4-9ba7-cf20ab311a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.136 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.138 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.146 221324 DEBUG nova.compute.manager [req-db08481f-678c-4ace-b1be-5c2e6aa52918 req-9f36d6bb-b259-40c4-9ba7-cf20ab311a5f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Detach interface failed, port_id=19a4f194-5514-4b8e-b635-e6fe0255dc1a, reason: Instance f3b36b5b-968c-4775-ac4f-93efc36f40ac could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 31 03:27:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:27.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.200 221324 DEBUG oslo_concurrency.processutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:27:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:27.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:27:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3155269517' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.670 221324 DEBUG oslo_concurrency.processutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.683 221324 DEBUG nova.compute.provider_tree [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.711 221324 DEBUG nova.scheduler.client.report [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.747 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.791 221324 INFO nova.scheduler.client.report [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Deleted allocations for instance f3b36b5b-968c-4775-ac4f-93efc36f40ac
Jan 31 03:27:27 np0005603622 nova_compute[221301]: 2026-01-31 08:27:27.862 221324 DEBUG oslo_concurrency.lockutils [None req-f5b64885-8b83-44b1-8379-887edc940c88 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "f3b36b5b-968c-4775-ac4f-93efc36f40ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 14.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:28 np0005603622 nova_compute[221301]: 2026-01-31 08:27:28.828 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848033.8265266, f3b36b5b-968c-4775-ac4f-93efc36f40ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:27:28 np0005603622 nova_compute[221301]: 2026-01-31 08:27:28.829 221324 INFO nova.compute.manager [-] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] VM Stopped (Lifecycle Event)
Jan 31 03:27:28 np0005603622 nova_compute[221301]: 2026-01-31 08:27:28.945 221324 DEBUG nova.compute.manager [None req-0309d9f1-6573-4f2b-88a3-2d590c2b0e82 - - - - - -] [instance: f3b36b5b-968c-4775-ac4f-93efc36f40ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:27:29 np0005603622 nova_compute[221301]: 2026-01-31 08:27:29.127 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:27:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:29.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:27:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:29.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:30.881 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:30.882 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:30.882 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:31.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:31.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e294 e294: 3 total, 3 up, 3 in
Jan 31 03:27:31 np0005603622 nova_compute[221301]: 2026-01-31 08:27:31.965 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.520 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.521 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.549 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.627 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.628 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.636 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.636 221324 INFO nova.compute.claims [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:27:32 np0005603622 nova_compute[221301]: 2026-01-31 08:27:32.764 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:27:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:33.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:27:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2469405617' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.265 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.270 221324 DEBUG nova.compute.provider_tree [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.354 221324 DEBUG nova.scheduler.client.report [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.381 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.382 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:27:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e295 e295: 3 total, 3 up, 3 in
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.442 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.443 221324 DEBUG nova.network.neutron [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.494 221324 INFO nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.547 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:27:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:33.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.686 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.688 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.689 221324 INFO nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Creating image(s)
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.816 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:27:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e296 e296: 3 total, 3 up, 3 in
Jan 31 03:27:33 np0005603622 nova_compute[221301]: 2026-01-31 08:27:33.970 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.009 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.014 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.042 221324 DEBUG nova.policy [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1d03198d8ab846bda092e089b2d5a6c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.088 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.089 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.089 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.090 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.130 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.135 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:27:34 np0005603622 nova_compute[221301]: 2026-01-31 08:27:34.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:35.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:35.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #49. Immutable memtables: 6.
Jan 31 03:27:36 np0005603622 nova_compute[221301]: 2026-01-31 08:27:36.970 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:37 np0005603622 nova_compute[221301]: 2026-01-31 08:27:37.094 221324 DEBUG nova.network.neutron [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Successfully created port: f854c0da-0188-4d67-8753-5ebedf3523f4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:27:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:37.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:37.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:38 np0005603622 nova_compute[221301]: 2026-01-31 08:27:38.751 221324 DEBUG nova.network.neutron [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Successfully updated port: f854c0da-0188-4d67-8753-5ebedf3523f4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:27:38 np0005603622 nova_compute[221301]: 2026-01-31 08:27:38.772 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:27:38 np0005603622 nova_compute[221301]: 2026-01-31 08:27:38.772 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:27:38 np0005603622 nova_compute[221301]: 2026-01-31 08:27:38.772 221324 DEBUG nova.network.neutron [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:27:38 np0005603622 nova_compute[221301]: 2026-01-31 08:27:38.962 221324 DEBUG nova.network.neutron [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:27:39 np0005603622 nova_compute[221301]: 2026-01-31 08:27:39.157 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:27:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:39.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:39.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:39 np0005603622 nova_compute[221301]: 2026-01-31 08:27:39.771 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.637s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:27:39 np0005603622 nova_compute[221301]: 2026-01-31 08:27:39.809 221324 DEBUG nova.network.neutron [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:27:39 np0005603622 nova_compute[221301]: 2026-01-31 08:27:39.854 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:27:39 np0005603622 nova_compute[221301]: 2026-01-31 08:27:39.855 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance network_info: |[{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:27:39 np0005603622 nova_compute[221301]: 2026-01-31 08:27:39.861 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] resizing rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:27:40 np0005603622 nova_compute[221301]: 2026-01-31 08:27:40.246 221324 DEBUG nova.compute.manager [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-changed-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:40 np0005603622 nova_compute[221301]: 2026-01-31 08:27:40.246 221324 DEBUG nova.compute.manager [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Refreshing instance network info cache due to event network-changed-f854c0da-0188-4d67-8753-5ebedf3523f4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:27:40 np0005603622 nova_compute[221301]: 2026-01-31 08:27:40.247 221324 DEBUG oslo_concurrency.lockutils [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:27:40 np0005603622 nova_compute[221301]: 2026-01-31 08:27:40.247 221324 DEBUG oslo_concurrency.lockutils [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:27:40 np0005603622 nova_compute[221301]: 2026-01-31 08:27:40.248 221324 DEBUG nova.network.neutron [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Refreshing network info cache for port f854c0da-0188-4d67-8753-5ebedf3523f4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:27:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e297 e297: 3 total, 3 up, 3 in
Jan 31 03:27:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:41.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:41 np0005603622 nova_compute[221301]: 2026-01-31 08:27:41.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:41.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:41 np0005603622 nova_compute[221301]: 2026-01-31 08:27:41.971 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:42 np0005603622 nova_compute[221301]: 2026-01-31 08:27:42.314 221324 DEBUG nova.network.neutron [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updated VIF entry in instance network info cache for port f854c0da-0188-4d67-8753-5ebedf3523f4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:27:42 np0005603622 nova_compute[221301]: 2026-01-31 08:27:42.316 221324 DEBUG nova.network.neutron [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:27:42 np0005603622 nova_compute[221301]: 2026-01-31 08:27:42.334 221324 DEBUG oslo_concurrency.lockutils [req-1fe272e9-33e0-456c-807f-ab707b99e6e9 req-8112c584-a0e5-4226-a88c-5abb5a92a28b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.201 221324 DEBUG nova.objects.instance [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'migration_context' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:27:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:43.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.228 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.229 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Ensure instance console log exists: /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.229 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.230 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.231 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.235 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Start _get_guest_xml network_info=[{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.242 221324 WARNING nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.250 221324 DEBUG nova.virt.libvirt.host [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.252 221324 DEBUG nova.virt.libvirt.host [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.256 221324 DEBUG nova.virt.libvirt.host [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.257 221324 DEBUG nova.virt.libvirt.host [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.259 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.259 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.260 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.260 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.261 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.261 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.262 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.262 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.263 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.263 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.263 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.264 221324 DEBUG nova.virt.hardware [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.267 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:27:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:43.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:27:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2060085175' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.746 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.789 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:43 np0005603622 nova_compute[221301]: 2026-01-31 08:27:43.796 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:44 np0005603622 nova_compute[221301]: 2026-01-31 08:27:44.160 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:27:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3164324525' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:27:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e298 e298: 3 total, 3 up, 3 in
Jan 31 03:27:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:45.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.275 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.277 221324 DEBUG nova.virt.libvirt.vif [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:27:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2088246896',display_name='tempest-ServerActionsTestJSON-server-2088246896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2088246896',id=122,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-tt67a630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:27:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=5b114bfd-d1bc-4c02-bb60-5ba6b4395e28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.277 221324 DEBUG nova.network.os_vif_util [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.278 221324 DEBUG nova.network.os_vif_util [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.279 221324 DEBUG nova.objects.instance [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.354 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <uuid>5b114bfd-d1bc-4c02-bb60-5ba6b4395e28</uuid>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <name>instance-0000007a</name>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestJSON-server-2088246896</nova:name>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:27:43</nova:creationTime>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <nova:port uuid="f854c0da-0188-4d67-8753-5ebedf3523f4">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <entry name="serial">5b114bfd-d1bc-4c02-bb60-5ba6b4395e28</entry>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <entry name="uuid">5b114bfd-d1bc-4c02-bb60-5ba6b4395e28</entry>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk.config">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:24:86:4f"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <target dev="tapf854c0da-01"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/console.log" append="off"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:27:45 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:27:45 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:27:45 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:27:45 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.355 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Preparing to wait for external event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.356 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.356 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.356 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.357 221324 DEBUG nova.virt.libvirt.vif [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:27:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2088246896',display_name='tempest-ServerActionsTestJSON-server-2088246896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2088246896',id=122,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-tt67a630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:27:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=5b114bfd-d1bc-4c02-bb60-5ba6b4395e28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.357 221324 DEBUG nova.network.os_vif_util [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.358 221324 DEBUG nova.network.os_vif_util [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.358 221324 DEBUG os_vif [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.358 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.359 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.360 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.363 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.363 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf854c0da-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.364 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf854c0da-01, col_values=(('external_ids', {'iface-id': 'f854c0da-0188-4d67-8753-5ebedf3523f4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:86:4f', 'vm-uuid': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:45 np0005603622 NetworkManager[49080]: <info>  [1769848065.3688] manager: (tapf854c0da-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.369 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.372 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.373 221324 INFO os_vif [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01')#033[00m
Jan 31 03:27:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:45.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.732 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.732 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.733 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] No VIF found with MAC fa:16:3e:24:86:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:27:45 np0005603622 nova_compute[221301]: 2026-01-31 08:27:45.733 221324 INFO nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Using config drive#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.365 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.707 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.708 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.949 221324 INFO nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Creating config drive at /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/disk.config#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.959 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjjsfor1g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:46 np0005603622 nova_compute[221301]: 2026-01-31 08:27:46.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.096 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjjsfor1g" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.136 221324 DEBUG nova.storage.rbd_utils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] rbd image 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.141 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/disk.config 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:47.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:47.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.702 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.702 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.703 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.703 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:27:47 np0005603622 nova_compute[221301]: 2026-01-31 08:27:47.703 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:27:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4164567760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.103 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.537 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.537 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.745 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.746 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4489MB free_disk=20.906829833984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.747 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.747 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.895 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.896 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.897 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:27:48 np0005603622 nova_compute[221301]: 2026-01-31 08:27:48.949 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:27:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:49.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:27:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1633564784' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:27:49 np0005603622 nova_compute[221301]: 2026-01-31 08:27:49.403 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:49 np0005603622 nova_compute[221301]: 2026-01-31 08:27:49.411 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:27:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:49.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:49 np0005603622 nova_compute[221301]: 2026-01-31 08:27:49.643 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:27:49 np0005603622 nova_compute[221301]: 2026-01-31 08:27:49.686 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:27:49 np0005603622 nova_compute[221301]: 2026-01-31 08:27:49.687 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.368 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.427 221324 DEBUG oslo_concurrency.processutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/disk.config 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.428 221324 INFO nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Deleting local config drive /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/disk.config because it was imported into RBD.#033[00m
Jan 31 03:27:50 np0005603622 kernel: tapf854c0da-01: entered promiscuous mode
Jan 31 03:27:50 np0005603622 NetworkManager[49080]: <info>  [1769848070.4857] manager: (tapf854c0da-01): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Jan 31 03:27:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:50Z|00554|binding|INFO|Claiming lport f854c0da-0188-4d67-8753-5ebedf3523f4 for this chassis.
Jan 31 03:27:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:50Z|00555|binding|INFO|f854c0da-0188-4d67-8753-5ebedf3523f4: Claiming fa:16:3e:24:86:4f 10.100.0.10
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.524 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:50Z|00556|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 ovn-installed in OVS
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:50Z|00557|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 up in Southbound
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.544 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:86:4f 10.100.0.10'], port_security=['fa:16:3e:24:86:4f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f854c0da-0188-4d67-8753-5ebedf3523f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.545 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f854c0da-0188-4d67-8753-5ebedf3523f4 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.546 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:27:50 np0005603622 systemd-udevd[269065]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:27:50 np0005603622 NetworkManager[49080]: <info>  [1769848070.5567] device (tapf854c0da-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:27:50 np0005603622 NetworkManager[49080]: <info>  [1769848070.5573] device (tapf854c0da-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:27:50 np0005603622 systemd-machined[190406]: New machine qemu-56-instance-0000007a.
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.561 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5eebbc27-3be7-4612-b116-2a0b22dd7c7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.562 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.564 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.565 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[82537401-db45-4e25-8152-f1fc78d2cc50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.565 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[96abf360-7b5c-42c2-ad5c-91cd0c554ac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 systemd[1]: Started Virtual Machine qemu-56-instance-0000007a.
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.575 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[5e257e74-a560-4d26-9279-5017d419dd97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.586 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c11c39d9-c587-4892-b2d7-6565c3a81474]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.610 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1df5d733-0ac3-49f7-a550-ddd3506863ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 NetworkManager[49080]: <info>  [1769848070.6172] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.616 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[71416956-ac56-414d-9c7e-cbef253a26d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.651 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5bcc359b-ba35-4da6-a981-bb34d3803cf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.653 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[185d3a21-ee96-483b-9459-58fdc0f15c48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 NetworkManager[49080]: <info>  [1769848070.6672] device (tap1186b71b-00): carrier: link connected
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.671 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1374d3cd-3aff-4beb-8a80-1dc36dbd0ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.684 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e3e66e-d992-456b-818d-7e5c87148e46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732200, 'reachable_time': 25017, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269099, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.687 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.688 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.688 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.704 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[110b0009-47a9-46c1-9f26-b31cfa6f134b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 732200, 'tstamp': 732200}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269100, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.718 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c4971b-a547-4bb3-97b7-3c598b931df8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732200, 'reachable_time': 25017, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 269101, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.761 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[46f0f957-2f12-4140-899d-dd8ea3c2d398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.828 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd2de70-4a17-48e1-99ac-06cd0156812c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.830 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.830 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.831 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.833 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:50 np0005603622 NetworkManager[49080]: <info>  [1769848070.8346] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Jan 31 03:27:50 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.836 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.839 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.841 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:27:50Z|00558|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.842 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.843 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.845 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4cc8b9-49af-44a3-aba2-379a81321946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.846 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:27:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:50.848 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:27:50 np0005603622 nova_compute[221301]: 2026-01-31 08:27:50.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.023 221324 DEBUG nova.compute.manager [req-13ae80c0-33b4-4c1a-9fcf-d192c235955a req-1d497e1d-74e8-4946-b845-ba82df4d0566 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.024 221324 DEBUG oslo_concurrency.lockutils [req-13ae80c0-33b4-4c1a-9fcf-d192c235955a req-1d497e1d-74e8-4946-b845-ba82df4d0566 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.025 221324 DEBUG oslo_concurrency.lockutils [req-13ae80c0-33b4-4c1a-9fcf-d192c235955a req-1d497e1d-74e8-4946-b845-ba82df4d0566 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.025 221324 DEBUG oslo_concurrency.lockutils [req-13ae80c0-33b4-4c1a-9fcf-d192c235955a req-1d497e1d-74e8-4946-b845-ba82df4d0566 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.026 221324 DEBUG nova.compute.manager [req-13ae80c0-33b4-4c1a-9fcf-d192c235955a req-1d497e1d-74e8-4946-b845-ba82df4d0566 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Processing event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:27:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:51.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:51 np0005603622 podman[269170]: 2026-01-31 08:27:51.24943421 +0000 UTC m=+0.023849252 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.418 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848071.4175212, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.419 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Started (Lifecycle Event)#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.422 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.428 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.434 221324 INFO nova.virt.libvirt.driver [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance spawned successfully.#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.435 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.440 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.445 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.460 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.461 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.462 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.462 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.463 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.464 221324 DEBUG nova.virt.libvirt.driver [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.470 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.471 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848071.4190211, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.471 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.498 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.503 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848071.4270935, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.503 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.526 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.532 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.538 221324 INFO nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Took 17.85 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.538 221324 DEBUG nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.550 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:27:51 np0005603622 podman[269170]: 2026-01-31 08:27:51.585938455 +0000 UTC m=+0.360353507 container create 9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:27:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:51.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.613 221324 INFO nova.compute.manager [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Took 19.01 seconds to build instance.#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.647 221324 DEBUG oslo_concurrency.lockutils [None req-5c36142b-c000-4c36-b574-c2fa74f78079 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:51 np0005603622 systemd[1]: Started libpod-conmon-9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524.scope.
Jan 31 03:27:51 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:27:51 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aca987e8b201779335e633ba0e8922bea4308ec12488137372e1bfcd22e0596a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:27:51 np0005603622 podman[269170]: 2026-01-31 08:27:51.781905753 +0000 UTC m=+0.556320865 container init 9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:27:51 np0005603622 podman[269170]: 2026-01-31 08:27:51.787191275 +0000 UTC m=+0.561606297 container start 9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:27:51 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[269191]: [NOTICE]   (269195) : New worker (269197) forked
Jan 31 03:27:51 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[269191]: [NOTICE]   (269195) : Loading success.
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.974 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:51.993 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:27:51 np0005603622 nova_compute[221301]: 2026-01-31 08:27:51.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:51.994 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:27:53 np0005603622 nova_compute[221301]: 2026-01-31 08:27:53.105 221324 DEBUG nova.compute.manager [req-a9aaa652-e5e0-43b9-9b5b-1467ed8026d2 req-120904c0-2091-4dca-9b40-8451cad2bcac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:27:53 np0005603622 nova_compute[221301]: 2026-01-31 08:27:53.106 221324 DEBUG oslo_concurrency.lockutils [req-a9aaa652-e5e0-43b9-9b5b-1467ed8026d2 req-120904c0-2091-4dca-9b40-8451cad2bcac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:27:53 np0005603622 nova_compute[221301]: 2026-01-31 08:27:53.107 221324 DEBUG oslo_concurrency.lockutils [req-a9aaa652-e5e0-43b9-9b5b-1467ed8026d2 req-120904c0-2091-4dca-9b40-8451cad2bcac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:27:53 np0005603622 nova_compute[221301]: 2026-01-31 08:27:53.107 221324 DEBUG oslo_concurrency.lockutils [req-a9aaa652-e5e0-43b9-9b5b-1467ed8026d2 req-120904c0-2091-4dca-9b40-8451cad2bcac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:27:53 np0005603622 nova_compute[221301]: 2026-01-31 08:27:53.107 221324 DEBUG nova.compute.manager [req-a9aaa652-e5e0-43b9-9b5b-1467ed8026d2 req-120904c0-2091-4dca-9b40-8451cad2bcac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:27:53 np0005603622 nova_compute[221301]: 2026-01-31 08:27:53.108 221324 WARNING nova.compute.manager [req-a9aaa652-e5e0-43b9-9b5b-1467ed8026d2 req-120904c0-2091-4dca-9b40-8451cad2bcac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:27:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:53.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:53.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:55.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:55 np0005603622 nova_compute[221301]: 2026-01-31 08:27:55.370 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:55.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:27:56 np0005603622 podman[269207]: 2026-01-31 08:27:56.138235882 +0000 UTC m=+0.059592303 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 03:27:56 np0005603622 podman[269206]: 2026-01-31 08:27:56.165529276 +0000 UTC m=+0.087146604 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:27:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e299 e299: 3 total, 3 up, 3 in
Jan 31 03:27:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:27:56.996 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:27:57 np0005603622 nova_compute[221301]: 2026-01-31 08:27:57.023 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:27:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:57.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:57.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:27:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:27:59.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:27:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:27:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:27:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:27:59.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:27:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e300 e300: 3 total, 3 up, 3 in
Jan 31 03:28:00 np0005603622 nova_compute[221301]: 2026-01-31 08:28:00.373 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:01.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:01.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:02 np0005603622 nova_compute[221301]: 2026-01-31 08:28:02.073 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:03.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:03.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:04 np0005603622 nova_compute[221301]: 2026-01-31 08:28:04.032 221324 DEBUG nova.compute.manager [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-changed-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:04 np0005603622 nova_compute[221301]: 2026-01-31 08:28:04.033 221324 DEBUG nova.compute.manager [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Refreshing instance network info cache due to event network-changed-f854c0da-0188-4d67-8753-5ebedf3523f4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:28:04 np0005603622 nova_compute[221301]: 2026-01-31 08:28:04.033 221324 DEBUG oslo_concurrency.lockutils [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:28:04 np0005603622 nova_compute[221301]: 2026-01-31 08:28:04.033 221324 DEBUG oslo_concurrency.lockutils [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:28:04 np0005603622 nova_compute[221301]: 2026-01-31 08:28:04.034 221324 DEBUG nova.network.neutron [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Refreshing network info cache for port f854c0da-0188-4d67-8753-5ebedf3523f4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:28:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e301 e301: 3 total, 3 up, 3 in
Jan 31 03:28:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:05.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:05 np0005603622 nova_compute[221301]: 2026-01-31 08:28:05.377 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:05.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e301 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:05 np0005603622 nova_compute[221301]: 2026-01-31 08:28:05.938 221324 DEBUG nova.network.neutron [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updated VIF entry in instance network info cache for port f854c0da-0188-4d67-8753-5ebedf3523f4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:28:05 np0005603622 nova_compute[221301]: 2026-01-31 08:28:05.939 221324 DEBUG nova.network.neutron [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:28:06 np0005603622 nova_compute[221301]: 2026-01-31 08:28:06.025 221324 DEBUG oslo_concurrency.lockutils [req-c90df2c5-315d-438c-8066-faf7d8256c74 req-690a4ad2-3925-4f0b-bc93-2dc118fade39 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:28:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e302 e302: 3 total, 3 up, 3 in
Jan 31 03:28:07 np0005603622 nova_compute[221301]: 2026-01-31 08:28:07.076 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:07.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:07.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:07Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:86:4f 10.100.0.10
Jan 31 03:28:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:07Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:86:4f 10.100.0.10
Jan 31 03:28:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:08Z|00559|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:28:08 np0005603622 nova_compute[221301]: 2026-01-31 08:28:08.387 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:09.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:09 np0005603622 podman[269426]: 2026-01-31 08:28:09.510419926 +0000 UTC m=+0.307146417 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Jan 31 03:28:09 np0005603622 podman[269426]: 2026-01-31 08:28:09.616687043 +0000 UTC m=+0.413413484 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Jan 31 03:28:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:09.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.656470) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848089656583, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1937, "num_deletes": 272, "total_data_size": 4568668, "memory_usage": 4634768, "flush_reason": "Manual Compaction"}
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848089716131, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 2999456, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 52556, "largest_seqno": 54488, "table_properties": {"data_size": 2990892, "index_size": 5314, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17994, "raw_average_key_size": 20, "raw_value_size": 2973760, "raw_average_value_size": 3461, "num_data_blocks": 230, "num_entries": 859, "num_filter_entries": 859, "num_deletions": 272, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769847925, "oldest_key_time": 1769847925, "file_creation_time": 1769848089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 59713 microseconds, and 11426 cpu microseconds.
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.716202) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 2999456 bytes OK
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.716225) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.735456) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.735492) EVENT_LOG_v1 {"time_micros": 1769848089735483, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.735518) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 4559709, prev total WAL file size 4559709, number of live WAL files 2.
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.736587) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373537' seq:72057594037927935, type:22 .. '6C6F676D0032303133' seq:0, type:0; will stop at (end)
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(2929KB)], [102(11MB)]
Jan 31 03:28:09 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848089736641, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 14540789, "oldest_snapshot_seqno": -1}
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 8000 keys, 14407897 bytes, temperature: kUnknown
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848090006965, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 14407897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14351348, "index_size": 35445, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20037, "raw_key_size": 206429, "raw_average_key_size": 25, "raw_value_size": 14206007, "raw_average_value_size": 1775, "num_data_blocks": 1406, "num_entries": 8000, "num_filter_entries": 8000, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.007369) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 14407897 bytes
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.019037) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 53.8 rd, 53.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 11.0 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(9.7) write-amplify(4.8) OK, records in: 8552, records dropped: 552 output_compression: NoCompression
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.019082) EVENT_LOG_v1 {"time_micros": 1769848090019066, "job": 64, "event": "compaction_finished", "compaction_time_micros": 270427, "compaction_time_cpu_micros": 43975, "output_level": 6, "num_output_files": 1, "total_output_size": 14407897, "num_input_records": 8552, "num_output_records": 8000, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848090019498, "job": 64, "event": "table_file_deletion", "file_number": 104}
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848090021024, "job": 64, "event": "table_file_deletion", "file_number": 102}
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:09.736509) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.021084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.021091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.021095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.021098) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:28:10.021101) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:28:10 np0005603622 nova_compute[221301]: 2026-01-31 08:28:10.381 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:28:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:11.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:28:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:11.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:12 np0005603622 nova_compute[221301]: 2026-01-31 08:28:12.078 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:12 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:13.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:13.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:28:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:28:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:28:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1473295762' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:28:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:28:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1473295762' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:28:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 e303: 3 total, 3 up, 3 in
Jan 31 03:28:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:15.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:15 np0005603622 nova_compute[221301]: 2026-01-31 08:28:15.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:28:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:28:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:15.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.080 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:28:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:17.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.523 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.523 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.560 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:28:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:17.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.693 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.693 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.703 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.705 221324 INFO nova.compute.claims [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:28:17 np0005603622 nova_compute[221301]: 2026-01-31 08:28:17.859 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:28:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:28:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2210879630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.344 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.379 221324 DEBUG nova.compute.provider_tree [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.478 221324 DEBUG nova.scheduler.client.report [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.608 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.914s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.609 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.737 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.738 221324 DEBUG nova.network.neutron [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.811 221324 INFO nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.933 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:28:18 np0005603622 nova_compute[221301]: 2026-01-31 08:28:18.976 221324 DEBUG nova.policy [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef51681d234a4abc88ff433d0640b6e7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '953a213fa5cb435ab3c04ad96152685f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.225 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.227 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.228 221324 INFO nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Creating image(s)
Jan 31 03:28:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:19.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.280 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.336 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.367 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.371 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.449 221324 DEBUG oslo_concurrency.lockutils [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.449 221324 DEBUG oslo_concurrency.lockutils [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.450 221324 DEBUG nova.compute.manager [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.454 221324 DEBUG nova.compute.manager [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.455 221324 DEBUG nova.objects.instance [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'flavor' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.457 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.458 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.458 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.459 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.487 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.492 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 884be038-2a08-4b69-a345-5b6ac2661dfc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:28:19 np0005603622 nova_compute[221301]: 2026-01-31 08:28:19.523 221324 DEBUG nova.virt.libvirt.driver [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 31 03:28:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:19.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:20 np0005603622 nova_compute[221301]: 2026-01-31 08:28:20.027 221324 DEBUG nova.network.neutron [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Successfully created port: dcdd9995-1bf7-4e6c-843e-fa270ad890d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:28:20 np0005603622 nova_compute[221301]: 2026-01-31 08:28:20.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:28:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:28:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:21.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.287 221324 DEBUG nova.network.neutron [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Successfully updated port: dcdd9995-1bf7-4e6c-843e-fa270ad890d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.401 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.402 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquired lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.402 221324 DEBUG nova.network.neutron [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.468 221324 DEBUG nova.compute.manager [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received event network-changed-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.469 221324 DEBUG nova.compute.manager [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Refreshing instance network info cache due to event network-changed-dcdd9995-1bf7-4e6c-843e-fa270ad890d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.469 221324 DEBUG oslo_concurrency.lockutils [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:28:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:21.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.652 221324 DEBUG nova.network.neutron [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:28:21 np0005603622 nova_compute[221301]: 2026-01-31 08:28:21.972 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 884be038-2a08-4b69-a345-5b6ac2661dfc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:28:22 np0005603622 nova_compute[221301]: 2026-01-31 08:28:22.130 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] resizing rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:28:23 np0005603622 nova_compute[221301]: 2026-01-31 08:28:23.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:28:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:23.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:23.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.341 221324 DEBUG nova.objects.instance [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'migration_context' on Instance uuid 884be038-2a08-4b69-a345-5b6ac2661dfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.344 221324 INFO nova.virt.libvirt.driver [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance shutdown successfully after 4 seconds.
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.376 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.377 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Ensure instance console log exists: /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.378 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.378 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.379 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.484 221324 DEBUG nova.network.neutron [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updating instance_info_cache with network_info: [{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.595 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Releasing lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.596 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Instance network_info: |[{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.597 221324 DEBUG oslo_concurrency.lockutils [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.598 221324 DEBUG nova.network.neutron [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Refreshing network info cache for port dcdd9995-1bf7-4e6c-843e-fa270ad890d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.603 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Start _get_guest_xml network_info=[{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.610 221324 WARNING nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.617 221324 DEBUG nova.virt.libvirt.host [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.618 221324 DEBUG nova.virt.libvirt.host [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.622 221324 DEBUG nova.virt.libvirt.host [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.623 221324 DEBUG nova.virt.libvirt.host [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.626 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.627 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.628 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.628 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.629 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.629 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.630 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.630 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.630 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.631 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.631 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.632 221324 DEBUG nova.virt.hardware [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.637 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:24 np0005603622 kernel: tapf854c0da-01 (unregistering): left promiscuous mode
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.849 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:24 np0005603622 NetworkManager[49080]: <info>  [1769848104.8521] device (tapf854c0da-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:24Z|00560|binding|INFO|Releasing lport f854c0da-0188-4d67-8753-5ebedf3523f4 from this chassis (sb_readonly=0)
Jan 31 03:28:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:24Z|00561|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 down in Southbound
Jan 31 03:28:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:24Z|00562|binding|INFO|Removing iface tapf854c0da-01 ovn-installed in OVS
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.866 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.875 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:24 np0005603622 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Jan 31 03:28:24 np0005603622 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Consumed 14.762s CPU time.
Jan 31 03:28:24 np0005603622 systemd-machined[190406]: Machine qemu-56-instance-0000007a terminated.
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.982 221324 INFO nova.virt.libvirt.driver [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance destroyed successfully.#033[00m
Jan 31 03:28:24 np0005603622 nova_compute[221301]: 2026-01-31 08:28:24.983 221324 DEBUG nova.objects.instance [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'numa_topology' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:25.012 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:86:4f 10.100.0.10'], port_security=['fa:16:3e:24:86:4f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f854c0da-0188-4d67-8753-5ebedf3523f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:25.013 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f854c0da-0188-4d67-8753-5ebedf3523f4 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:28:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:25.015 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:28:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:25.016 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[951f09c2-a2c6-4520-947d-3d1e5a657e16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:25.017 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:28:25 np0005603622 nova_compute[221301]: 2026-01-31 08:28:25.030 221324 DEBUG nova.compute.manager [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:25 np0005603622 nova_compute[221301]: 2026-01-31 08:28:25.221 221324 DEBUG oslo_concurrency.lockutils [None req-694d22d2-373c-4877-b9d1-6dabe1b6a679 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 5.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:25.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:25 np0005603622 nova_compute[221301]: 2026-01-31 08:28:25.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:25 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[269191]: [NOTICE]   (269195) : haproxy version is 2.8.14-c23fe91
Jan 31 03:28:25 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[269191]: [NOTICE]   (269195) : path to executable is /usr/sbin/haproxy
Jan 31 03:28:25 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[269191]: [WARNING]  (269195) : Exiting Master process...
Jan 31 03:28:25 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[269191]: [ALERT]    (269195) : Current worker (269197) exited with code 143 (Terminated)
Jan 31 03:28:25 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[269191]: [WARNING]  (269195) : All workers exited. Exiting... (0)
Jan 31 03:28:25 np0005603622 systemd[1]: libpod-9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524.scope: Deactivated successfully.
Jan 31 03:28:25 np0005603622 podman[269945]: 2026-01-31 08:28:25.597248154 +0000 UTC m=+0.456215744 container died 9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:28:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:25.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:28:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3219246525' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:28:25 np0005603622 nova_compute[221301]: 2026-01-31 08:28:25.801 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:25 np0005603622 nova_compute[221301]: 2026-01-31 08:28:25.844 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:28:25 np0005603622 nova_compute[221301]: 2026-01-31 08:28:25.850 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.019 221324 DEBUG nova.compute.manager [req-1794eb7b-0d3c-4ee7-9e03-c53eef47f033 req-25eb5ba6-16d5-4ce4-83bb-8845e9803d24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.020 221324 DEBUG oslo_concurrency.lockutils [req-1794eb7b-0d3c-4ee7-9e03-c53eef47f033 req-25eb5ba6-16d5-4ce4-83bb-8845e9803d24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.021 221324 DEBUG oslo_concurrency.lockutils [req-1794eb7b-0d3c-4ee7-9e03-c53eef47f033 req-25eb5ba6-16d5-4ce4-83bb-8845e9803d24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.021 221324 DEBUG oslo_concurrency.lockutils [req-1794eb7b-0d3c-4ee7-9e03-c53eef47f033 req-25eb5ba6-16d5-4ce4-83bb-8845e9803d24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.022 221324 DEBUG nova.compute.manager [req-1794eb7b-0d3c-4ee7-9e03-c53eef47f033 req-25eb5ba6-16d5-4ce4-83bb-8845e9803d24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.023 221324 WARNING nova.compute.manager [req-1794eb7b-0d3c-4ee7-9e03-c53eef47f033 req-25eb5ba6-16d5-4ce4-83bb-8845e9803d24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:28:26 np0005603622 systemd[1]: var-lib-containers-storage-overlay-aca987e8b201779335e633ba0e8922bea4308ec12488137372e1bfcd22e0596a-merged.mount: Deactivated successfully.
Jan 31 03:28:26 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524-userdata-shm.mount: Deactivated successfully.
Jan 31 03:28:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.465 221324 DEBUG nova.objects.instance [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'flavor' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:26 np0005603622 podman[269945]: 2026-01-31 08:28:26.472986133 +0000 UTC m=+1.331953753 container cleanup 9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:28:26 np0005603622 systemd[1]: libpod-conmon-9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524.scope: Deactivated successfully.
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.554 221324 DEBUG oslo_concurrency.lockutils [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.554 221324 DEBUG oslo_concurrency.lockutils [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.555 221324 DEBUG nova.network.neutron [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.555 221324 DEBUG nova.objects.instance [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'info_cache' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:28:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2539351269' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.996 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:26 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.998 221324 DEBUG nova.virt.libvirt.vif [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:28:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-67549667',display_name='tempest-ServerActionsTestOtherB-server-67549667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-67549667',id=126,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-ph9lb0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:28:19Z,user_data=None,user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=884be038-2a08-4b69-a345-5b6ac2661dfc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:26.999 221324 DEBUG nova.network.os_vif_util [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.002 221324 DEBUG nova.network.os_vif_util [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:97:89,bridge_name='br-int',has_traffic_filtering=True,id=dcdd9995-1bf7-4e6c-843e-fa270ad890d0,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcdd9995-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.003 221324 DEBUG nova.objects.instance [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'pci_devices' on Instance uuid 884be038-2a08-4b69-a345-5b6ac2661dfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.062 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <uuid>884be038-2a08-4b69-a345-5b6ac2661dfc</uuid>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <name>instance-0000007e</name>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestOtherB-server-67549667</nova:name>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:28:24</nova:creationTime>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:user uuid="ef51681d234a4abc88ff433d0640b6e7">tempest-ServerActionsTestOtherB-1048458052-project-member</nova:user>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:project uuid="953a213fa5cb435ab3c04ad96152685f">tempest-ServerActionsTestOtherB-1048458052</nova:project>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <nova:port uuid="dcdd9995-1bf7-4e6c-843e-fa270ad890d0">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <entry name="serial">884be038-2a08-4b69-a345-5b6ac2661dfc</entry>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <entry name="uuid">884be038-2a08-4b69-a345-5b6ac2661dfc</entry>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/884be038-2a08-4b69-a345-5b6ac2661dfc_disk">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/884be038-2a08-4b69-a345-5b6ac2661dfc_disk.config">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:df:97:89"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <target dev="tapdcdd9995-1b"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/console.log" append="off"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:28:27 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:28:27 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:28:27 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:28:27 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.062 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Preparing to wait for external event network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.063 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.063 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.063 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.064 221324 DEBUG nova.virt.libvirt.vif [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:28:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-67549667',display_name='tempest-ServerActionsTestOtherB-server-67549667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-67549667',id=126,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-ph9lb0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-Server
ActionsTestOtherB-1048458052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:28:19Z,user_data=None,user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=884be038-2a08-4b69-a345-5b6ac2661dfc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.064 221324 DEBUG nova.network.os_vif_util [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.065 221324 DEBUG nova.network.os_vif_util [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:97:89,bridge_name='br-int',has_traffic_filtering=True,id=dcdd9995-1bf7-4e6c-843e-fa270ad890d0,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcdd9995-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.065 221324 DEBUG os_vif [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:97:89,bridge_name='br-int',has_traffic_filtering=True,id=dcdd9995-1bf7-4e6c-843e-fa270ad890d0,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcdd9995-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.066 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.066 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.067 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.071 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcdd9995-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.072 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdcdd9995-1b, col_values=(('external_ids', {'iface-id': 'dcdd9995-1bf7-4e6c-843e-fa270ad890d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:97:89', 'vm-uuid': '884be038-2a08-4b69-a345-5b6ac2661dfc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:27 np0005603622 NetworkManager[49080]: <info>  [1769848107.1210] manager: (tapdcdd9995-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.120 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.125 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.133 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.136 221324 INFO os_vif [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:97:89,bridge_name='br-int',has_traffic_filtering=True,id=dcdd9995-1bf7-4e6c-843e-fa270ad890d0,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcdd9995-1b')#033[00m
Jan 31 03:28:27 np0005603622 podman[270044]: 2026-01-31 08:28:27.205239816 +0000 UTC m=+0.707311704 container remove 9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.210 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ee24af6a-90c8-453e-a2e7-2202e02afbf0]: (4, ('Sat Jan 31 08:28:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524)\n9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524\nSat Jan 31 08:28:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524)\n9b2e82b028f51d0f037567e253a2d051d5e9e35ec902fe7ab4909e33c12ea524\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.213 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[188da375-5a2e-4715-b1ae-194071a080a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.215 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:27 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.218 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.237 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c6e3cd-43de-4c8f-a2e1-1e71ee86b53a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.253 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2fae7676-d74d-456f-97c4-2885ac8d7086]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.255 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[272e8ca0-8a2d-4880-b515-3501b8e0793e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.277 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d34108be-5de8-49cf-bde2-105ff70acb00]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732194, 'reachable_time': 31282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270102, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:27 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.282 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:28:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:27.282 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[f6431bc8-051a-48c0-bfdb-8fa36d2e38c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:27.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:27 np0005603622 podman[270051]: 2026-01-31 08:28:27.285199556 +0000 UTC m=+0.769396463 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:28:27 np0005603622 podman[270045]: 2026-01-31 08:28:27.300095096 +0000 UTC m=+0.783939124 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.541 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.542 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.542 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] No VIF found with MAC fa:16:3e:df:97:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.543 221324 INFO nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Using config drive#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.593 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:28:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:27.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.921 221324 DEBUG nova.network.neutron [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updated VIF entry in instance network info cache for port dcdd9995-1bf7-4e6c-843e-fa270ad890d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:28:27 np0005603622 nova_compute[221301]: 2026-01-31 08:28:27.922 221324 DEBUG nova.network.neutron [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updating instance_info_cache with network_info: [{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.034 221324 DEBUG oslo_concurrency.lockutils [req-17bd0a0b-a574-463f-a6f3-ad0ae0716137 req-6acd60a0-3918-469a-9695-6a61aeb47da5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.081 221324 INFO nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Creating config drive at /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/disk.config#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.087 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4do7ueda execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.128 221324 DEBUG nova.compute.manager [req-3c1c0e63-43bb-4401-a7e4-f1356e2a8f77 req-c1f3bcab-ae3c-414b-a03d-7cc27655f3f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.129 221324 DEBUG oslo_concurrency.lockutils [req-3c1c0e63-43bb-4401-a7e4-f1356e2a8f77 req-c1f3bcab-ae3c-414b-a03d-7cc27655f3f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.131 221324 DEBUG oslo_concurrency.lockutils [req-3c1c0e63-43bb-4401-a7e4-f1356e2a8f77 req-c1f3bcab-ae3c-414b-a03d-7cc27655f3f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.131 221324 DEBUG oslo_concurrency.lockutils [req-3c1c0e63-43bb-4401-a7e4-f1356e2a8f77 req-c1f3bcab-ae3c-414b-a03d-7cc27655f3f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.132 221324 DEBUG nova.compute.manager [req-3c1c0e63-43bb-4401-a7e4-f1356e2a8f77 req-c1f3bcab-ae3c-414b-a03d-7cc27655f3f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.133 221324 WARNING nova.compute.manager [req-3c1c0e63-43bb-4401-a7e4-f1356e2a8f77 req-c1f3bcab-ae3c-414b-a03d-7cc27655f3f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.214 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4do7ueda" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.256 221324 DEBUG nova.storage.rbd_utils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] rbd image 884be038-2a08-4b69-a345-5b6ac2661dfc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:28:28 np0005603622 nova_compute[221301]: 2026-01-31 08:28:28.262 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/disk.config 884be038-2a08-4b69-a345-5b6ac2661dfc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:29.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.437 221324 DEBUG nova.network.neutron [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.566 221324 DEBUG oslo_concurrency.lockutils [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.619 221324 INFO nova.virt.libvirt.driver [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance destroyed successfully.#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.619 221324 DEBUG nova.objects.instance [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'numa_topology' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:29.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.743 221324 DEBUG nova.objects.instance [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.803 221324 DEBUG nova.virt.libvirt.vif [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:27:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2088246896',display_name='tempest-ServerActionsTestJSON-server-2088246896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2088246896',id=122,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:27:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-tt67a630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:28:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=5b114bfd-d1bc-4c02-bb60-5ba6b4395e28,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.804 221324 DEBUG nova.network.os_vif_util [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.805 221324 DEBUG nova.network.os_vif_util [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.806 221324 DEBUG os_vif [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.810 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.811 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf854c0da-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.814 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.819 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.821 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.826 221324 INFO os_vif [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01')#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.838 221324 DEBUG nova.virt.libvirt.driver [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Start _get_guest_xml network_info=[{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.844 221324 WARNING nova.virt.libvirt.driver [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.852 221324 DEBUG nova.virt.libvirt.host [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.853 221324 DEBUG nova.virt.libvirt.host [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.857 221324 DEBUG nova.virt.libvirt.host [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.858 221324 DEBUG nova.virt.libvirt.host [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.859 221324 DEBUG nova.virt.libvirt.driver [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.860 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.861 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.861 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.862 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.862 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.862 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.863 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.863 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.864 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.864 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.864 221324 DEBUG nova.virt.hardware [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.865 221324 DEBUG nova.objects.instance [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:29 np0005603622 nova_compute[221301]: 2026-01-31 08:28:29.902 221324 DEBUG oslo_concurrency.processutils [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.053 221324 DEBUG oslo_concurrency.processutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/disk.config 884be038-2a08-4b69-a345-5b6ac2661dfc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.791s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.055 221324 INFO nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Deleting local config drive /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc/disk.config because it was imported into RBD.#033[00m
Jan 31 03:28:30 np0005603622 kernel: tapdcdd9995-1b: entered promiscuous mode
Jan 31 03:28:30 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:30Z|00563|binding|INFO|Claiming lport dcdd9995-1bf7-4e6c-843e-fa270ad890d0 for this chassis.
Jan 31 03:28:30 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:30Z|00564|binding|INFO|dcdd9995-1bf7-4e6c-843e-fa270ad890d0: Claiming fa:16:3e:df:97:89 10.100.0.5
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.130 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:30 np0005603622 NetworkManager[49080]: <info>  [1769848110.1326] manager: (tapdcdd9995-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.147 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.148 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:97:89 10.100.0.5'], port_security=['fa:16:3e:df:97:89 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '884be038-2a08-4b69-a345-5b6ac2661dfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8c881e0-722d-4784-9f91-71ffaeb0ba02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=dcdd9995-1bf7-4e6c-843e-fa270ad890d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.150 139793 INFO neutron.agent.ovn.metadata.agent [-] Port dcdd9995-1bf7-4e6c-843e-fa270ad890d0 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 bound to our chassis#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.152 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44469d8b-ad30-4270-88fa-e67c568f3150#033[00m
Jan 31 03:28:30 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:30Z|00565|binding|INFO|Setting lport dcdd9995-1bf7-4e6c-843e-fa270ad890d0 ovn-installed in OVS
Jan 31 03:28:30 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:30Z|00566|binding|INFO|Setting lport dcdd9995-1bf7-4e6c-843e-fa270ad890d0 up in Southbound
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.165 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.166 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[60de00c5-233b-4ec5-9108-dbdd404e06f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.169 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44469d8b-a1 in ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.172 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44469d8b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.172 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd46ff7-4b6a-40e1-983f-c5e65791e827]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.173 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[06ccb6be-62e7-4477-ae27-df5346fb7e10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.183 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[659048f2-55bb-48be-afc6-8f803fd2b580]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 systemd-machined[190406]: New machine qemu-57-instance-0000007e.
Jan 31 03:28:30 np0005603622 systemd[1]: Started Virtual Machine qemu-57-instance-0000007e.
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.202 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cee985-61f4-4d4b-a9da-020ad36ca027]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 systemd-udevd[270205]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:28:30 np0005603622 NetworkManager[49080]: <info>  [1769848110.2398] device (tapdcdd9995-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:28:30 np0005603622 NetworkManager[49080]: <info>  [1769848110.2404] device (tapdcdd9995-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.253 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[20b44f02-f2ab-4c86-802a-66ab90d63103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 systemd-udevd[270210]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:28:30 np0005603622 NetworkManager[49080]: <info>  [1769848110.2647] manager: (tap44469d8b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/264)
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.267 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5dc4650b-b20d-4bc0-b701-150c6905191c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.307 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[790e1449-8e81-4781-89dc-0f3f6380cf6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.313 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[820f5062-a7cf-4f80-8ad3-bec8cb6a4a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 NetworkManager[49080]: <info>  [1769848110.3382] device (tap44469d8b-a0): carrier: link connected
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.345 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[20bd8a56-8ee3-4fe5-a5eb-fcaf58187ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.364 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[78f07725-35d4-4023-ac1c-e5e8f9ce5955]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736167, 'reachable_time': 38080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270235, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1075187177' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.380 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c8083c50-36b4-40d9-89d6-ee80b712187d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:9820'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736167, 'tstamp': 736167}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270238, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.382 221324 DEBUG oslo_concurrency.processutils [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.397 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cc07d997-1b39-4b5b-ae16-ec3293497b2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44469d8b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:98:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736167, 'reachable_time': 38080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270239, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.431 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[abe11abc-9868-4f24-a6ee-6956319b99e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.496 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e6fdaf13-c85d-459c-94c2-5b881e09d943]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.499 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44469d8b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:30 np0005603622 kernel: tap44469d8b-a0: entered promiscuous mode
Jan 31 03:28:30 np0005603622 NetworkManager[49080]: <info>  [1769848110.5501] manager: (tap44469d8b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.557 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44469d8b-a0, col_values=(('external_ids', {'iface-id': '7e288124-e200-4c03-8a4a-baab3e3f3d7a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:30 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:30Z|00567|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.569 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.570 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d5cc14f2-a4ea-4659-acb6-e601da37af42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.571 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/44469d8b-ad30-4270-88fa-e67c568f3150.pid.haproxy
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 44469d8b-ad30-4270-88fa-e67c568f3150
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.573 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'env', 'PROCESS_TAG=haproxy-44469d8b-ad30-4270-88fa-e67c568f3150', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44469d8b-ad30-4270-88fa-e67c568f3150.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.585 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:30 np0005603622 nova_compute[221301]: 2026-01-31 08:28:30.596 221324 DEBUG oslo_concurrency.processutils [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.883 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.884 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:30.884 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:31 np0005603622 podman[270345]: 2026-01-31 08:28:30.941441386 +0000 UTC m=+0.030486540 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.035 221324 DEBUG nova.compute.manager [req-e1d5e318-ef50-4fe4-af8a-091b0dd67bd2 req-c3c151be-2a76-405a-82d8-891fe6f6e3c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received event network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.035 221324 DEBUG oslo_concurrency.lockutils [req-e1d5e318-ef50-4fe4-af8a-091b0dd67bd2 req-c3c151be-2a76-405a-82d8-891fe6f6e3c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.036 221324 DEBUG oslo_concurrency.lockutils [req-e1d5e318-ef50-4fe4-af8a-091b0dd67bd2 req-c3c151be-2a76-405a-82d8-891fe6f6e3c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.036 221324 DEBUG oslo_concurrency.lockutils [req-e1d5e318-ef50-4fe4-af8a-091b0dd67bd2 req-c3c151be-2a76-405a-82d8-891fe6f6e3c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.037 221324 DEBUG nova.compute.manager [req-e1d5e318-ef50-4fe4-af8a-091b0dd67bd2 req-c3c151be-2a76-405a-82d8-891fe6f6e3c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Processing event network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:28:31 np0005603622 podman[270345]: 2026-01-31 08:28:31.062357357 +0000 UTC m=+0.151402451 container create de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.076 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848111.0763404, 884be038-2a08-4b69-a345-5b6ac2661dfc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.077 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] VM Started (Lifecycle Event)#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.081 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.087 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.091 221324 INFO nova.virt.libvirt.driver [-] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Instance spawned successfully.#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.092 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.102 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.107 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.125 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.126 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.127 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.127 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.128 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.128 221324 DEBUG nova.virt.libvirt.driver [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.141 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.142 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848111.0775151, 884be038-2a08-4b69-a345-5b6ac2661dfc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.142 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.174 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:31 np0005603622 systemd[1]: Started libpod-conmon-de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2.scope.
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.178 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848111.0853136, 884be038-2a08-4b69-a345-5b6ac2661dfc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.178 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:28:31 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:28:31 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c18bc93d1d11aaf4d33a9ba6576717a6d32b1969c62ce468409f87804c11d8e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.206 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.215 221324 INFO nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Took 11.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.215 221324 DEBUG nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.216 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:28:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3375457080' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.247 221324 DEBUG oslo_concurrency.processutils [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.651s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.249 221324 DEBUG nova.virt.libvirt.vif [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:27:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2088246896',display_name='tempest-ServerActionsTestJSON-server-2088246896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2088246896',id=122,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:27:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-tt67a630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:28:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=5b114bfd-d1bc-4c02-bb60-5ba6b4395e28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.249 221324 DEBUG nova.network.os_vif_util [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.250 221324 DEBUG nova.network.os_vif_util [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.251 221324 DEBUG nova.objects.instance [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.253 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.269 221324 DEBUG nova.virt.libvirt.driver [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <uuid>5b114bfd-d1bc-4c02-bb60-5ba6b4395e28</uuid>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <name>instance-0000007a</name>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerActionsTestJSON-server-2088246896</nova:name>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:28:29</nova:creationTime>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:user uuid="1d03198d8ab846bda092e089b2d5a6c7">tempest-ServerActionsTestJSON-1873947453-project-member</nova:user>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:project uuid="5b87da3b3f42494f96baeeeaf60b54df">tempest-ServerActionsTestJSON-1873947453</nova:project>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <nova:port uuid="f854c0da-0188-4d67-8753-5ebedf3523f4">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <entry name="serial">5b114bfd-d1bc-4c02-bb60-5ba6b4395e28</entry>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <entry name="uuid">5b114bfd-d1bc-4c02-bb60-5ba6b4395e28</entry>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_disk.config">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:24:86:4f"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <target dev="tapf854c0da-01"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28/console.log" append="off"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:28:31 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:28:31 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:28:31 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:28:31 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.272 221324 DEBUG nova.virt.libvirt.driver [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.273 221324 DEBUG nova.virt.libvirt.driver [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.275 221324 DEBUG nova.virt.libvirt.vif [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:27:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2088246896',display_name='tempest-ServerActionsTestJSON-server-2088246896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2088246896',id=122,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:27:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-tt67a630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:28:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=5b114bfd-d1bc-4c02-bb60-5ba6b4395e28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.276 221324 DEBUG nova.network.os_vif_util [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.277 221324 DEBUG nova.network.os_vif_util [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.277 221324 DEBUG os_vif [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.279 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.279 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.280 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.289 221324 INFO nova.compute.manager [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Took 13.62 seconds to build instance.#033[00m
Jan 31 03:28:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:31.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.292 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.292 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf854c0da-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.293 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf854c0da-01, col_values=(('external_ids', {'iface-id': 'f854c0da-0188-4d67-8753-5ebedf3523f4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:86:4f', 'vm-uuid': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.296 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:28:31 np0005603622 NetworkManager[49080]: <info>  [1769848111.2965] manager: (tapf854c0da-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.303 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.303 221324 INFO os_vif [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01')#033[00m
Jan 31 03:28:31 np0005603622 podman[270345]: 2026-01-31 08:28:31.311860533 +0000 UTC m=+0.400905607 container init de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.312 221324 DEBUG oslo_concurrency.lockutils [None req-36de6d87-246b-4b84-b470-845aa88fb3e3 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:31 np0005603622 podman[270345]: 2026-01-31 08:28:31.320079074 +0000 UTC m=+0.409124128 container start de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:28:31 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [NOTICE]   (270376) : New worker (270378) forked
Jan 31 03:28:31 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [NOTICE]   (270376) : Loading success.
Jan 31 03:28:31 np0005603622 NetworkManager[49080]: <info>  [1769848111.5113] manager: (tapf854c0da-01): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 31 03:28:31 np0005603622 kernel: tapf854c0da-01: entered promiscuous mode
Jan 31 03:28:31 np0005603622 systemd-udevd[270220]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:28:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:31Z|00568|binding|INFO|Claiming lport f854c0da-0188-4d67-8753-5ebedf3523f4 for this chassis.
Jan 31 03:28:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:31Z|00569|binding|INFO|f854c0da-0188-4d67-8753-5ebedf3523f4: Claiming fa:16:3e:24:86:4f 10.100.0.10
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.519 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:31Z|00570|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 ovn-installed in OVS
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.532 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 NetworkManager[49080]: <info>  [1769848111.5370] device (tapf854c0da-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.536 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:86:4f 10.100.0.10'], port_security=['fa:16:3e:24:86:4f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f854c0da-0188-4d67-8753-5ebedf3523f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:31 np0005603622 NetworkManager[49080]: <info>  [1769848111.5378] device (tapf854c0da-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:28:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:31Z|00571|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 up in Southbound
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.545 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 systemd-machined[190406]: New machine qemu-58-instance-0000007a.
Jan 31 03:28:31 np0005603622 systemd[1]: Started Virtual Machine qemu-58-instance-0000007a.
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.612 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f854c0da-0188-4d67-8753-5ebedf3523f4 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.615 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.622 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a02b92e2-032e-42ce-abc9-015fc24f1b58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.622 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.625 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.625 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[06debdad-76a0-47e9-b722-e4eda3b8de99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.626 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3a3fd0-9496-4097-8b05-dc375c70d55a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.637 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[52f058fb-1f40-41f9-a44c-a42e9f151085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.659 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb100a3-29b0-424c-b351-39c76e066888]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:31.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.680 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fccec8c4-a5c9-4e81-850b-d8b636a7fdf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 NetworkManager[49080]: <info>  [1769848111.6852] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.684 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0781d8-205f-4991-9a27-20c0649ca3da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.713 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f75620b9-c177-4598-806b-0a0ba691d1dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.716 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[964c19be-d56f-45ee-833e-c45f37972f19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 NetworkManager[49080]: <info>  [1769848111.7384] device (tap1186b71b-00): carrier: link connected
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.744 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8ca67b-8def-4144-ae51-c96703b7ccba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.761 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1314d19b-b6f5-41c5-baca-8e011b7d27a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736307, 'reachable_time': 29759, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270416, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.776 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5f13c6a3-5143-4efa-a76e-d1182ab69fc9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736307, 'tstamp': 736307}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270417, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.787 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9f7f6ece-095b-4a89-a86c-a1c8ad3e7b3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736307, 'reachable_time': 29759, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270418, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.802 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[545923d2-0cbb-4fe3-bb98-3f87025243b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.834 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[48d931f4-5f8f-4da9-a9e8-b1d7b1761a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.835 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.835 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.836 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.838 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 NetworkManager[49080]: <info>  [1769848111.8390] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 31 03:28:31 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.845 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.846 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:31Z|00572|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.851 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:28:31 np0005603622 nova_compute[221301]: 2026-01-31 08:28:31.852 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.853 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ec13723f-a0d7-4f53-8505-88fd836c9bd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.853 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:28:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:31.854 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.065 221324 DEBUG nova.compute.manager [req-ca5c04a6-5e66-43b5-9432-a84077e7913a req-915c9e42-aeea-45f2-ae8e-2e775822552e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.066 221324 DEBUG oslo_concurrency.lockutils [req-ca5c04a6-5e66-43b5-9432-a84077e7913a req-915c9e42-aeea-45f2-ae8e-2e775822552e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.066 221324 DEBUG oslo_concurrency.lockutils [req-ca5c04a6-5e66-43b5-9432-a84077e7913a req-915c9e42-aeea-45f2-ae8e-2e775822552e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.067 221324 DEBUG oslo_concurrency.lockutils [req-ca5c04a6-5e66-43b5-9432-a84077e7913a req-915c9e42-aeea-45f2-ae8e-2e775822552e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.068 221324 DEBUG nova.compute.manager [req-ca5c04a6-5e66-43b5-9432-a84077e7913a req-915c9e42-aeea-45f2-ae8e-2e775822552e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.068 221324 WARNING nova.compute.manager [req-ca5c04a6-5e66-43b5-9432-a84077e7913a req-915c9e42-aeea-45f2-ae8e-2e775822552e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.269 221324 DEBUG nova.compute.manager [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.270 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.271 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848112.2681909, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.271 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.277 221324 INFO nova.virt.libvirt.driver [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance rebooted successfully.#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.278 221324 DEBUG nova.compute.manager [None req-ecb8a7d2-a718-4155-a791-d79957e54fc9 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.294 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.300 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:32 np0005603622 podman[270485]: 2026-01-31 08:28:32.221830734 +0000 UTC m=+0.034449657 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.340 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.341 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848112.2684433, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.341 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Started (Lifecycle Event)#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.367 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.370 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:32 np0005603622 podman[270485]: 2026-01-31 08:28:32.74993634 +0000 UTC m=+0.562555223 container create 02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.798 221324 INFO nova.compute.manager [None req-0c99609c-e081-4699-90b3-ee15ecfc6936 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Get console output#033[00m
Jan 31 03:28:32 np0005603622 nova_compute[221301]: 2026-01-31 08:28:32.808 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:28:32 np0005603622 systemd[1]: Started libpod-conmon-02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177.scope.
Jan 31 03:28:32 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:28:32 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/776b3aaf8d441e51e4033c2a7501d8d3fa82377bc9ae0c84c7d57692ae4aee56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:28:33 np0005603622 nova_compute[221301]: 2026-01-31 08:28:33.117 221324 DEBUG nova.compute.manager [req-4e24351e-2d74-4021-b2a5-c2963ec45be1 req-be2304ea-ad2a-40cf-98f3-58653dfc2bef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received event network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:33 np0005603622 nova_compute[221301]: 2026-01-31 08:28:33.117 221324 DEBUG oslo_concurrency.lockutils [req-4e24351e-2d74-4021-b2a5-c2963ec45be1 req-be2304ea-ad2a-40cf-98f3-58653dfc2bef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:33 np0005603622 nova_compute[221301]: 2026-01-31 08:28:33.118 221324 DEBUG oslo_concurrency.lockutils [req-4e24351e-2d74-4021-b2a5-c2963ec45be1 req-be2304ea-ad2a-40cf-98f3-58653dfc2bef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:33 np0005603622 nova_compute[221301]: 2026-01-31 08:28:33.118 221324 DEBUG oslo_concurrency.lockutils [req-4e24351e-2d74-4021-b2a5-c2963ec45be1 req-be2304ea-ad2a-40cf-98f3-58653dfc2bef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:33 np0005603622 nova_compute[221301]: 2026-01-31 08:28:33.118 221324 DEBUG nova.compute.manager [req-4e24351e-2d74-4021-b2a5-c2963ec45be1 req-be2304ea-ad2a-40cf-98f3-58653dfc2bef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] No waiting events found dispatching network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:33 np0005603622 nova_compute[221301]: 2026-01-31 08:28:33.119 221324 WARNING nova.compute.manager [req-4e24351e-2d74-4021-b2a5-c2963ec45be1 req-be2304ea-ad2a-40cf-98f3-58653dfc2bef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received unexpected event network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:28:33 np0005603622 podman[270485]: 2026-01-31 08:28:33.16811285 +0000 UTC m=+0.980731693 container init 02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:28:33 np0005603622 podman[270485]: 2026-01-31 08:28:33.175119839 +0000 UTC m=+0.987738692 container start 02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:28:33 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [NOTICE]   (270509) : New worker (270511) forked
Jan 31 03:28:33 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [NOTICE]   (270509) : Loading success.
Jan 31 03:28:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:33.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:33.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:33 np0005603622 nova_compute[221301]: 2026-01-31 08:28:33.974 221324 INFO nova.compute.manager [None req-67e5e3bd-1e9f-47a9-b114-aa10e536504e ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Get console output#033[00m
Jan 31 03:28:34 np0005603622 nova_compute[221301]: 2026-01-31 08:28:34.194 221324 DEBUG nova.compute.manager [req-267ac656-fea3-4817-8c00-964fb75fc4c8 req-0d272c5e-5c35-4970-a915-363612d74bf4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:34 np0005603622 nova_compute[221301]: 2026-01-31 08:28:34.194 221324 DEBUG oslo_concurrency.lockutils [req-267ac656-fea3-4817-8c00-964fb75fc4c8 req-0d272c5e-5c35-4970-a915-363612d74bf4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:34 np0005603622 nova_compute[221301]: 2026-01-31 08:28:34.195 221324 DEBUG oslo_concurrency.lockutils [req-267ac656-fea3-4817-8c00-964fb75fc4c8 req-0d272c5e-5c35-4970-a915-363612d74bf4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:34 np0005603622 nova_compute[221301]: 2026-01-31 08:28:34.195 221324 DEBUG oslo_concurrency.lockutils [req-267ac656-fea3-4817-8c00-964fb75fc4c8 req-0d272c5e-5c35-4970-a915-363612d74bf4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:34 np0005603622 nova_compute[221301]: 2026-01-31 08:28:34.196 221324 DEBUG nova.compute.manager [req-267ac656-fea3-4817-8c00-964fb75fc4c8 req-0d272c5e-5c35-4970-a915-363612d74bf4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:34 np0005603622 nova_compute[221301]: 2026-01-31 08:28:34.196 221324 WARNING nova.compute.manager [req-267ac656-fea3-4817-8c00-964fb75fc4c8 req-0d272c5e-5c35-4970-a915-363612d74bf4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:28:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:35.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:35.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:36 np0005603622 nova_compute[221301]: 2026-01-31 08:28:36.295 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:37 np0005603622 nova_compute[221301]: 2026-01-31 08:28:37.110 221324 DEBUG nova.objects.instance [None req-99faeafb-a26f-471f-9f2c-16d2283cfb1e 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:37 np0005603622 nova_compute[221301]: 2026-01-31 08:28:37.135 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848117.1350908, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:37 np0005603622 nova_compute[221301]: 2026-01-31 08:28:37.136 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:28:37 np0005603622 nova_compute[221301]: 2026-01-31 08:28:37.177 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:37 np0005603622 nova_compute[221301]: 2026-01-31 08:28:37.183 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:37 np0005603622 nova_compute[221301]: 2026-01-31 08:28:37.202 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:37 np0005603622 nova_compute[221301]: 2026-01-31 08:28:37.227 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 31 03:28:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:37.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:37.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:38 np0005603622 kernel: tapf854c0da-01 (unregistering): left promiscuous mode
Jan 31 03:28:38 np0005603622 NetworkManager[49080]: <info>  [1769848118.1800] device (tapf854c0da-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:28:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:38Z|00573|binding|INFO|Releasing lport f854c0da-0188-4d67-8753-5ebedf3523f4 from this chassis (sb_readonly=0)
Jan 31 03:28:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:38Z|00574|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 down in Southbound
Jan 31 03:28:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:38Z|00575|binding|INFO|Removing iface tapf854c0da-01 ovn-installed in OVS
Jan 31 03:28:38 np0005603622 nova_compute[221301]: 2026-01-31 08:28:38.189 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:38 np0005603622 nova_compute[221301]: 2026-01-31 08:28:38.207 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:38 np0005603622 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Jan 31 03:28:38 np0005603622 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007a.scope: Consumed 5.634s CPU time.
Jan 31 03:28:38 np0005603622 systemd-machined[190406]: Machine qemu-58-instance-0000007a terminated.
Jan 31 03:28:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:38.338 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:86:4f 10.100.0.10'], port_security=['fa:16:3e:24:86:4f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f854c0da-0188-4d67-8753-5ebedf3523f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:38.342 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f854c0da-0188-4d67-8753-5ebedf3523f4 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:28:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:38.344 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:28:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:38.345 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8f7843-f291-4ad2-afdc-e9cdca5d5082]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:38.346 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:28:38 np0005603622 nova_compute[221301]: 2026-01-31 08:28:38.435 221324 DEBUG nova.compute.manager [None req-99faeafb-a26f-471f-9f2c-16d2283cfb1e 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [NOTICE]   (270509) : haproxy version is 2.8.14-c23fe91
Jan 31 03:28:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [NOTICE]   (270509) : path to executable is /usr/sbin/haproxy
Jan 31 03:28:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [WARNING]  (270509) : Exiting Master process...
Jan 31 03:28:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [WARNING]  (270509) : Exiting Master process...
Jan 31 03:28:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [ALERT]    (270509) : Current worker (270511) exited with code 143 (Terminated)
Jan 31 03:28:38 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270505]: [WARNING]  (270509) : All workers exited. Exiting... (0)
Jan 31 03:28:38 np0005603622 systemd[1]: libpod-02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177.scope: Deactivated successfully.
Jan 31 03:28:38 np0005603622 podman[270557]: 2026-01-31 08:28:38.680414891 +0000 UTC m=+0.197734006 container died 02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:28:39 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177-userdata-shm.mount: Deactivated successfully.
Jan 31 03:28:39 np0005603622 systemd[1]: var-lib-containers-storage-overlay-776b3aaf8d441e51e4033c2a7501d8d3fa82377bc9ae0c84c7d57692ae4aee56-merged.mount: Deactivated successfully.
Jan 31 03:28:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:39.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:39 np0005603622 nova_compute[221301]: 2026-01-31 08:28:39.324 221324 DEBUG nova.compute.manager [req-f28dc6dd-64ae-4407-bdd7-035f82cb7370 req-c0643343-de8d-4e79-819f-e5b0ec42a381 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:39 np0005603622 nova_compute[221301]: 2026-01-31 08:28:39.325 221324 DEBUG oslo_concurrency.lockutils [req-f28dc6dd-64ae-4407-bdd7-035f82cb7370 req-c0643343-de8d-4e79-819f-e5b0ec42a381 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:39 np0005603622 nova_compute[221301]: 2026-01-31 08:28:39.325 221324 DEBUG oslo_concurrency.lockutils [req-f28dc6dd-64ae-4407-bdd7-035f82cb7370 req-c0643343-de8d-4e79-819f-e5b0ec42a381 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:39 np0005603622 nova_compute[221301]: 2026-01-31 08:28:39.326 221324 DEBUG oslo_concurrency.lockutils [req-f28dc6dd-64ae-4407-bdd7-035f82cb7370 req-c0643343-de8d-4e79-819f-e5b0ec42a381 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:39 np0005603622 nova_compute[221301]: 2026-01-31 08:28:39.326 221324 DEBUG nova.compute.manager [req-f28dc6dd-64ae-4407-bdd7-035f82cb7370 req-c0643343-de8d-4e79-819f-e5b0ec42a381 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:39 np0005603622 nova_compute[221301]: 2026-01-31 08:28:39.327 221324 WARNING nova.compute.manager [req-f28dc6dd-64ae-4407-bdd7-035f82cb7370 req-c0643343-de8d-4e79-819f-e5b0ec42a381 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:28:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:39.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:40 np0005603622 podman[270557]: 2026-01-31 08:28:40.948545049 +0000 UTC m=+2.465864144 container cleanup 02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:28:40 np0005603622 systemd[1]: libpod-conmon-02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177.scope: Deactivated successfully.
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.293 221324 INFO nova.compute.manager [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Resuming#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.294 221324 DEBUG nova.objects.instance [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'flavor' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.297 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:41.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.435 221324 DEBUG oslo_concurrency.lockutils [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.436 221324 DEBUG oslo_concurrency.lockutils [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquired lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.437 221324 DEBUG nova.network.neutron [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.457 221324 DEBUG nova.compute.manager [req-24434e9a-0bf1-4637-8fd3-99f84760d1b7 req-b9af9a90-4c37-4c87-82b7-a4ddcd1ecc2b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.458 221324 DEBUG oslo_concurrency.lockutils [req-24434e9a-0bf1-4637-8fd3-99f84760d1b7 req-b9af9a90-4c37-4c87-82b7-a4ddcd1ecc2b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.458 221324 DEBUG oslo_concurrency.lockutils [req-24434e9a-0bf1-4637-8fd3-99f84760d1b7 req-b9af9a90-4c37-4c87-82b7-a4ddcd1ecc2b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.458 221324 DEBUG oslo_concurrency.lockutils [req-24434e9a-0bf1-4637-8fd3-99f84760d1b7 req-b9af9a90-4c37-4c87-82b7-a4ddcd1ecc2b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.459 221324 DEBUG nova.compute.manager [req-24434e9a-0bf1-4637-8fd3-99f84760d1b7 req-b9af9a90-4c37-4c87-82b7-a4ddcd1ecc2b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.459 221324 WARNING nova.compute.manager [req-24434e9a-0bf1-4637-8fd3-99f84760d1b7 req-b9af9a90-4c37-4c87-82b7-a4ddcd1ecc2b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 31 03:28:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:28:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:41.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:28:41 np0005603622 podman[270589]: 2026-01-31 08:28:41.701894679 +0000 UTC m=+0.732108090 container remove 02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.708 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4edb17e2-cdbd-4d77-8162-fad3c73d0453]: (4, ('Sat Jan 31 08:28:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177)\n02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177\nSat Jan 31 08:28:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177)\n02e4059f066a59697159162cb266c244fac5c2530823e87292036594e4122177\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.709 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88a88f61-8485-46bb-b5d2-be8d7d915025]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.709 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:41 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.716 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:41 np0005603622 nova_compute[221301]: 2026-01-31 08:28:41.726 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.728 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0694dc0e-0860-4794-931e-46df800f71e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.742 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0af99a-4488-4f79-b98a-baef4e465e39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.743 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[84c4a6cd-00ef-433b-8e72-9da74efeb410]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.762 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[350542c5-7bc1-48e9-a432-289cb50aa021]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736301, 'reachable_time': 28298, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270606, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:41 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.766 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:28:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:41.766 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[70fd034f-7a81-41be-8db6-d941eb0d2151]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:42 np0005603622 nova_compute[221301]: 2026-01-31 08:28:42.228 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:43.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:43 np0005603622 nova_compute[221301]: 2026-01-31 08:28:43.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:43 np0005603622 nova_compute[221301]: 2026-01-31 08:28:43.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:43 np0005603622 nova_compute[221301]: 2026-01-31 08:28:43.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:28:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:28:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:43.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.204 221324 DEBUG nova.network.neutron [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.407 221324 DEBUG oslo_concurrency.lockutils [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Releasing lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.416 221324 DEBUG nova.virt.libvirt.vif [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:27:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2088246896',display_name='tempest-ServerActionsTestJSON-server-2088246896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2088246896',id=122,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:27:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-tt67a630',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:28:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=5b114bfd-d1bc-4c02-bb60-5ba6b4395e28,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.417 221324 DEBUG nova.network.os_vif_util [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.418 221324 DEBUG nova.network.os_vif_util [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.419 221324 DEBUG os_vif [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.420 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.420 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.421 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.424 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.424 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf854c0da-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.425 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf854c0da-01, col_values=(('external_ids', {'iface-id': 'f854c0da-0188-4d67-8753-5ebedf3523f4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:86:4f', 'vm-uuid': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.426 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.426 221324 INFO os_vif [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01')#033[00m
Jan 31 03:28:44 np0005603622 nova_compute[221301]: 2026-01-31 08:28:44.613 221324 DEBUG nova.objects.instance [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'numa_topology' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:45.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:45 np0005603622 kernel: tapf854c0da-01: entered promiscuous mode
Jan 31 03:28:45 np0005603622 NetworkManager[49080]: <info>  [1769848125.6015] manager: (tapf854c0da-01): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.601 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:45Z|00576|binding|INFO|Claiming lport f854c0da-0188-4d67-8753-5ebedf3523f4 for this chassis.
Jan 31 03:28:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:45Z|00577|binding|INFO|f854c0da-0188-4d67-8753-5ebedf3523f4: Claiming fa:16:3e:24:86:4f 10.100.0.10
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.604 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:45Z|00578|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 ovn-installed in OVS
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.610 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.614 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 systemd-machined[190406]: New machine qemu-59-instance-0000007a.
Jan 31 03:28:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:45Z|00579|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 up in Southbound
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.648 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:86:4f 10.100.0.10'], port_security=['fa:16:3e:24:86:4f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '7', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f854c0da-0188-4d67-8753-5ebedf3523f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.650 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f854c0da-0188-4d67-8753-5ebedf3523f4 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 bound to our chassis#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.652 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1186b71b-0c4b-47f0-a55d-4433241e46e7#033[00m
Jan 31 03:28:45 np0005603622 systemd[1]: Started Virtual Machine qemu-59-instance-0000007a.
Jan 31 03:28:45 np0005603622 systemd-udevd[270629]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.665 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8c6efa5c-f538-487a-9b73-0bd564794d24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.666 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1186b71b-01 in ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.668 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1186b71b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.668 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fd19cd5b-a0df-470c-b591-d7e26ee19468]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.669 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ba64a906-b83b-43d5-9286-16165eabe8dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.678 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d06bf7-820d-4e69-9286-1146aca63f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 NetworkManager[49080]: <info>  [1769848125.6838] device (tapf854c0da-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:28:45 np0005603622 NetworkManager[49080]: <info>  [1769848125.6846] device (tapf854c0da-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:28:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:45.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.689 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[53ae5041-1aa4-46bd-9454-3237102c0906]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.715 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b47509e5-d1ea-4552-8549-0a936224aa1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.721 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[92e2b183-f4e6-489e-99f0-88dd6323b882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 systemd-udevd[270634]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:28:45 np0005603622 NetworkManager[49080]: <info>  [1769848125.7230] manager: (tap1186b71b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/271)
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.748 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ebe478-c281-4d7d-9731-97eed10add63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.751 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[cade134f-fea1-474a-bf34-2de170c82b67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 NetworkManager[49080]: <info>  [1769848125.7746] device (tap1186b71b-00): carrier: link connected
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.777 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8b50a291-d1e4-4729-a2c6-86dbe7e18053]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.795 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4513e1b4-a082-41c2-9683-6606677b968d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737711, 'reachable_time': 15205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270661, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.806 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[73138f4e-a2e1-4ff3-8ce4-34a857b5df07]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:37ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 737711, 'tstamp': 737711}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270662, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.819 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2213e5f0-3397-48fe-a406-57d3a1f1c8e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1186b71b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737711, 'reachable_time': 15205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270663, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.838 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4938d0a7-87a1-493c-a8fb-d894b85a1f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.885 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2be7e003-0ff8-4435-8ba6-43b540640bbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.886 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.886 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.887 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1186b71b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 NetworkManager[49080]: <info>  [1769848125.8895] manager: (tap1186b71b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Jan 31 03:28:45 np0005603622 kernel: tap1186b71b-00: entered promiscuous mode
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.891 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.892 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1186b71b-00, col_values=(('external_ids', {'iface-id': '4375f262-ce22-40bf-bf9b-24f6862763a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:45Z|00580|binding|INFO|Releasing lport 4375f262-ce22-40bf-bf9b-24f6862763a2 from this chassis (sb_readonly=0)
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.894 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.895 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.896 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[efc8393e-1099-4480-8a96-9c2365714c70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.896 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1186b71b-0c4b-47f0-a55d-4433241e46e7.pid.haproxy
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1186b71b-0c4b-47f0-a55d-4433241e46e7
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:28:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:45.897 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'env', 'PROCESS_TAG=haproxy-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1186b71b-0c4b-47f0-a55d-4433241e46e7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:28:45 np0005603622 nova_compute[221301]: 2026-01-31 08:28:45.900 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.289 221324 DEBUG nova.compute.manager [req-1ed6988a-2bf8-4553-adc0-002d86b69630 req-5d45d000-17ea-4617-9e8d-a9ed55232579 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.290 221324 DEBUG oslo_concurrency.lockutils [req-1ed6988a-2bf8-4553-adc0-002d86b69630 req-5d45d000-17ea-4617-9e8d-a9ed55232579 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.295 221324 DEBUG oslo_concurrency.lockutils [req-1ed6988a-2bf8-4553-adc0-002d86b69630 req-5d45d000-17ea-4617-9e8d-a9ed55232579 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.295 221324 DEBUG oslo_concurrency.lockutils [req-1ed6988a-2bf8-4553-adc0-002d86b69630 req-5d45d000-17ea-4617-9e8d-a9ed55232579 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.295 221324 DEBUG nova.compute.manager [req-1ed6988a-2bf8-4553-adc0-002d86b69630 req-5d45d000-17ea-4617-9e8d-a9ed55232579 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.295 221324 WARNING nova.compute.manager [req-1ed6988a-2bf8-4553-adc0-002d86b69630 req-5d45d000-17ea-4617-9e8d-a9ed55232579 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.299 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:46 np0005603622 podman[270736]: 2026-01-31 08:28:46.21933254 +0000 UTC m=+0.024245172 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.345 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.346 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848126.3452098, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.346 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Started (Lifecycle Event)#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.358 221324 DEBUG nova.compute.manager [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.358 221324 DEBUG nova.objects.instance [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.413 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.418 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.455 221324 INFO nova.virt.libvirt.driver [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance running successfully.#033[00m
Jan 31 03:28:46 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.458 221324 DEBUG nova.virt.libvirt.guest [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.459 221324 DEBUG nova.compute.manager [None req-7d3ff3df-60e1-431f-9ec0-46a33543546f 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.516 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.517 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848126.3511558, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.517 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:28:46 np0005603622 podman[270736]: 2026-01-31 08:28:46.596218901 +0000 UTC m=+0.401131533 container create a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.659 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:28:46 np0005603622 nova_compute[221301]: 2026-01-31 08:28:46.663 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:28:46 np0005603622 systemd[1]: Started libpod-conmon-a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20.scope.
Jan 31 03:28:46 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:28:46 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92e7fa271c1d2532a6ffab6a3729fdd2aa56f65f08a55e659a51a693b2a2e11c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:28:46 np0005603622 podman[270736]: 2026-01-31 08:28:46.898534118 +0000 UTC m=+0.703446750 container init a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 03:28:46 np0005603622 podman[270736]: 2026-01-31 08:28:46.903572343 +0000 UTC m=+0.708484955 container start a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:28:46 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270751]: [NOTICE]   (270755) : New worker (270757) forked
Jan 31 03:28:46 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270751]: [NOTICE]   (270755) : Loading success.
Jan 31 03:28:47 np0005603622 nova_compute[221301]: 2026-01-31 08:28:47.278 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:47.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:47Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:df:97:89 10.100.0.5
Jan 31 03:28:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:47Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:97:89 10.100.0.5
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.410 221324 DEBUG nova.compute.manager [req-8d6fa2d7-4fa3-4e56-bbd5-576304190f45 req-48846326-1a74-48ef-a3ee-30f72915bd02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.411 221324 DEBUG oslo_concurrency.lockutils [req-8d6fa2d7-4fa3-4e56-bbd5-576304190f45 req-48846326-1a74-48ef-a3ee-30f72915bd02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.411 221324 DEBUG oslo_concurrency.lockutils [req-8d6fa2d7-4fa3-4e56-bbd5-576304190f45 req-48846326-1a74-48ef-a3ee-30f72915bd02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.412 221324 DEBUG oslo_concurrency.lockutils [req-8d6fa2d7-4fa3-4e56-bbd5-576304190f45 req-48846326-1a74-48ef-a3ee-30f72915bd02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.412 221324 DEBUG nova.compute.manager [req-8d6fa2d7-4fa3-4e56-bbd5-576304190f45 req-48846326-1a74-48ef-a3ee-30f72915bd02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.412 221324 WARNING nova.compute.manager [req-8d6fa2d7-4fa3-4e56-bbd5-576304190f45 req-48846326-1a74-48ef-a3ee-30f72915bd02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.897 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.898 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.898 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:28:48 np0005603622 nova_compute[221301]: 2026-01-31 08:28:48.898 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:49.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.502 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.502 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.503 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.503 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.504 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.505 221324 INFO nova.compute.manager [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Terminating instance#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.507 221324 DEBUG nova.compute.manager [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:28:49 np0005603622 kernel: tapf854c0da-01 (unregistering): left promiscuous mode
Jan 31 03:28:49 np0005603622 NetworkManager[49080]: <info>  [1769848129.5638] device (tapf854c0da-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:28:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:49Z|00581|binding|INFO|Releasing lport f854c0da-0188-4d67-8753-5ebedf3523f4 from this chassis (sb_readonly=0)
Jan 31 03:28:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:49Z|00582|binding|INFO|Setting lport f854c0da-0188-4d67-8753-5ebedf3523f4 down in Southbound
Jan 31 03:28:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:28:49Z|00583|binding|INFO|Removing iface tapf854c0da-01 ovn-installed in OVS
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.570 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.573 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.579 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:49 np0005603622 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Jan 31 03:28:49 np0005603622 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007a.scope: Consumed 3.868s CPU time.
Jan 31 03:28:49 np0005603622 systemd-machined[190406]: Machine qemu-59-instance-0000007a terminated.
Jan 31 03:28:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:49.692 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:86:4f 10.100.0.10'], port_security=['fa:16:3e:24:86:4f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5b114bfd-d1bc-4c02-bb60-5ba6b4395e28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b87da3b3f42494f96baeeeaf60b54df', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9fcc0f91-c2a1-4d1a-a56d-473f8cfe93e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e51ea0db-c93c-43cf-bbdf-25868bfa3347, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=f854c0da-0188-4d67-8753-5ebedf3523f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:28:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:49.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:49.694 139793 INFO neutron.agent.ovn.metadata.agent [-] Port f854c0da-0188-4d67-8753-5ebedf3523f4 in datapath 1186b71b-0c4b-47f0-a55d-4433241e46e7 unbound from our chassis#033[00m
Jan 31 03:28:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:49.696 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1186b71b-0c4b-47f0-a55d-4433241e46e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:28:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:49.698 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b02c2fd8-8cd5-44e3-978f-9ee830a13da7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:49.699 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 namespace which is not needed anymore#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.752 221324 INFO nova.virt.libvirt.driver [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Instance destroyed successfully.#033[00m
Jan 31 03:28:49 np0005603622 nova_compute[221301]: 2026-01-31 08:28:49.753 221324 DEBUG nova.objects.instance [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lazy-loading 'resources' on Instance uuid 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:28:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e304 e304: 3 total, 3 up, 3 in
Jan 31 03:28:50 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270751]: [NOTICE]   (270755) : haproxy version is 2.8.14-c23fe91
Jan 31 03:28:50 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270751]: [NOTICE]   (270755) : path to executable is /usr/sbin/haproxy
Jan 31 03:28:50 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270751]: [WARNING]  (270755) : Exiting Master process...
Jan 31 03:28:50 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270751]: [ALERT]    (270755) : Current worker (270757) exited with code 143 (Terminated)
Jan 31 03:28:50 np0005603622 neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7[270751]: [WARNING]  (270755) : All workers exited. Exiting... (0)
Jan 31 03:28:50 np0005603622 systemd[1]: libpod-a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20.scope: Deactivated successfully.
Jan 31 03:28:50 np0005603622 podman[270802]: 2026-01-31 08:28:50.112038387 +0000 UTC m=+0.296956583 container died a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.114 221324 DEBUG nova.virt.libvirt.vif [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:27:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2088246896',display_name='tempest-ServerActionsTestJSON-server-2088246896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2088246896',id=122,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEMX7xtBP2HhBX3pWbzAif7M7f9s0Q4GDiATIrADvKnAB332Jld7UULtDNydBeR2j8JALy9nZflzvnNilxAIZziUDmv2Qmiq71a1XDr8XCpUlq2U0Bv+mSaDFTQKW8Mu0g==',key_name='tempest-keypair-496810804',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:27:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b87da3b3f42494f96baeeeaf60b54df',ramdisk_id='',reservation_id='r-tt67a630',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1873947453',owner_user_name='tempest-ServerActionsTestJSON-1873947453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:28:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1d03198d8ab846bda092e089b2d5a6c7',uuid=5b114bfd-d1bc-4c02-bb60-5ba6b4395e28,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.115 221324 DEBUG nova.network.os_vif_util [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converting VIF {"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.116 221324 DEBUG nova.network.os_vif_util [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.117 221324 DEBUG os_vif [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.119 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.120 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf854c0da-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.125 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.129 221324 INFO os_vif [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:86:4f,bridge_name='br-int',has_traffic_filtering=True,id=f854c0da-0188-4d67-8753-5ebedf3523f4,network=Network(1186b71b-0c4b-47f0-a55d-4433241e46e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf854c0da-01')#033[00m
Jan 31 03:28:50 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20-userdata-shm.mount: Deactivated successfully.
Jan 31 03:28:50 np0005603622 systemd[1]: var-lib-containers-storage-overlay-92e7fa271c1d2532a6ffab6a3729fdd2aa56f65f08a55e659a51a693b2a2e11c-merged.mount: Deactivated successfully.
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.636 221324 DEBUG nova.compute.manager [req-05aedbf0-ac4c-4a04-99d1-43fe630c8b18 req-a76e7864-f9eb-4fe5-a7e3-4c77e837967f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.637 221324 DEBUG oslo_concurrency.lockutils [req-05aedbf0-ac4c-4a04-99d1-43fe630c8b18 req-a76e7864-f9eb-4fe5-a7e3-4c77e837967f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.637 221324 DEBUG oslo_concurrency.lockutils [req-05aedbf0-ac4c-4a04-99d1-43fe630c8b18 req-a76e7864-f9eb-4fe5-a7e3-4c77e837967f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.637 221324 DEBUG oslo_concurrency.lockutils [req-05aedbf0-ac4c-4a04-99d1-43fe630c8b18 req-a76e7864-f9eb-4fe5-a7e3-4c77e837967f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.638 221324 DEBUG nova.compute.manager [req-05aedbf0-ac4c-4a04-99d1-43fe630c8b18 req-a76e7864-f9eb-4fe5-a7e3-4c77e837967f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:50 np0005603622 nova_compute[221301]: 2026-01-31 08:28:50.638 221324 DEBUG nova.compute.manager [req-05aedbf0-ac4c-4a04-99d1-43fe630c8b18 req-a76e7864-f9eb-4fe5-a7e3-4c77e837967f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-unplugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:28:50 np0005603622 podman[270802]: 2026-01-31 08:28:50.77837973 +0000 UTC m=+0.963297866 container cleanup a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:28:50 np0005603622 systemd[1]: libpod-conmon-a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20.scope: Deactivated successfully.
Jan 31 03:28:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e304 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:51.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:51 np0005603622 podman[270851]: 2026-01-31 08:28:51.364623018 +0000 UTC m=+0.560546199 container remove a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.371 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[797e2c90-7295-472e-a258-6ac1ffbcb156]: (4, ('Sat Jan 31 08:28:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20)\na44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20\nSat Jan 31 08:28:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 (a44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20)\na44279ab2265aaa80967ebfc5d9537e62bf65dd456badddbe71090440e0b1c20\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.374 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1d24aeb8-9a0e-43d1-b323-82cec7ec903d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.375 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1186b71b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:28:51 np0005603622 kernel: tap1186b71b-00: left promiscuous mode
Jan 31 03:28:51 np0005603622 nova_compute[221301]: 2026-01-31 08:28:51.378 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.385 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee77590-b520-4b42-9e76-a6e93af4c677]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:51 np0005603622 nova_compute[221301]: 2026-01-31 08:28:51.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.401 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3093bf-042d-4c79-8ffb-e6d298db3309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.404 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c2f4171a-c69e-44b5-91a7-474097df5feb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.417 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c434fd44-bd66-4334-9b23-95653e5a14b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737705, 'reachable_time': 17605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270867, 'error': None, 'target': 'ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.420 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1186b71b-0c4b-47f0-a55d-4433241e46e7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:28:51 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1186b71b\x2d0c4b\x2d47f0\x2da55d\x2d4433241e46e7.mount: Deactivated successfully.
Jan 31 03:28:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:28:51.420 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[80177912-38a2-48b6-b436-7e14d3fc2e9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:28:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:51.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.212 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [{"id": "f854c0da-0188-4d67-8753-5ebedf3523f4", "address": "fa:16:3e:24:86:4f", "network": {"id": "1186b71b-0c4b-47f0-a55d-4433241e46e7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-924085117-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b87da3b3f42494f96baeeeaf60b54df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf854c0da-01", "ovs_interfaceid": "f854c0da-0188-4d67-8753-5ebedf3523f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.331 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.562 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.563 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.564 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.564 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.564 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.565 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.692 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.692 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.693 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.693 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.693 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.917 221324 DEBUG nova.compute.manager [req-69e52c64-2c29-4e96-9d32-2cc7cdc28ab6 req-f2a2385f-fc67-419a-9dae-addbfc91dc28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.917 221324 DEBUG oslo_concurrency.lockutils [req-69e52c64-2c29-4e96-9d32-2cc7cdc28ab6 req-f2a2385f-fc67-419a-9dae-addbfc91dc28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.918 221324 DEBUG oslo_concurrency.lockutils [req-69e52c64-2c29-4e96-9d32-2cc7cdc28ab6 req-f2a2385f-fc67-419a-9dae-addbfc91dc28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.918 221324 DEBUG oslo_concurrency.lockutils [req-69e52c64-2c29-4e96-9d32-2cc7cdc28ab6 req-f2a2385f-fc67-419a-9dae-addbfc91dc28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.919 221324 DEBUG nova.compute.manager [req-69e52c64-2c29-4e96-9d32-2cc7cdc28ab6 req-f2a2385f-fc67-419a-9dae-addbfc91dc28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] No waiting events found dispatching network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:28:52 np0005603622 nova_compute[221301]: 2026-01-31 08:28:52.919 221324 WARNING nova.compute.manager [req-69e52c64-2c29-4e96-9d32-2cc7cdc28ab6 req-f2a2385f-fc67-419a-9dae-addbfc91dc28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received unexpected event network-vif-plugged-f854c0da-0188-4d67-8753-5ebedf3523f4 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:28:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:28:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1339445393' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.145 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:53.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.351 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.352 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.357 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.357 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.499 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.501 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4233MB free_disk=20.715423583984375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.501 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.501 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:28:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:53.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.828 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.829 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 884be038-2a08-4b69-a345-5b6ac2661dfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.830 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.831 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:28:53 np0005603622 nova_compute[221301]: 2026-01-31 08:28:53.895 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:28:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:28:54 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3981231520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:28:54 np0005603622 nova_compute[221301]: 2026-01-31 08:28:54.371 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:28:54 np0005603622 nova_compute[221301]: 2026-01-31 08:28:54.377 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:28:54 np0005603622 nova_compute[221301]: 2026-01-31 08:28:54.447 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:28:54 np0005603622 nova_compute[221301]: 2026-01-31 08:28:54.496 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:28:54 np0005603622 nova_compute[221301]: 2026-01-31 08:28:54.497 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:28:55 np0005603622 nova_compute[221301]: 2026-01-31 08:28:55.124 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e305 e305: 3 total, 3 up, 3 in
Jan 31 03:28:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:55.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:55.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:28:57 np0005603622 nova_compute[221301]: 2026-01-31 08:28:57.335 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:28:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:57.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:57.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:28:57 np0005603622 nova_compute[221301]: 2026-01-31 08:28:57.854 221324 INFO nova.virt.libvirt.driver [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Deleting instance files /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_del#033[00m
Jan 31 03:28:57 np0005603622 nova_compute[221301]: 2026-01-31 08:28:57.856 221324 INFO nova.virt.libvirt.driver [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Deletion of /var/lib/nova/instances/5b114bfd-d1bc-4c02-bb60-5ba6b4395e28_del complete#033[00m
Jan 31 03:28:57 np0005603622 nova_compute[221301]: 2026-01-31 08:28:57.949 221324 INFO nova.compute.manager [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Took 8.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:28:57 np0005603622 nova_compute[221301]: 2026-01-31 08:28:57.950 221324 DEBUG oslo.service.loopingcall [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:28:57 np0005603622 nova_compute[221301]: 2026-01-31 08:28:57.950 221324 DEBUG nova.compute.manager [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:28:57 np0005603622 nova_compute[221301]: 2026-01-31 08:28:57.951 221324 DEBUG nova.network.neutron [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:28:58 np0005603622 podman[270917]: 2026-01-31 08:28:58.150476894 +0000 UTC m=+0.064249108 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 03:28:58 np0005603622 podman[270916]: 2026-01-31 08:28:58.183340058 +0000 UTC m=+0.102275100 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:28:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e306 e306: 3 total, 3 up, 3 in
Jan 31 03:28:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:28:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:28:59.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:28:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:28:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:28:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:28:59.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:00 np0005603622 nova_compute[221301]: 2026-01-31 08:29:00.128 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:00 np0005603622 nova_compute[221301]: 2026-01-31 08:29:00.514 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:29:00.514 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:29:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:29:00.516 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:29:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:29:00Z|00584|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:29:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:01.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:01.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:02 np0005603622 nova_compute[221301]: 2026-01-31 08:29:02.382 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:03.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:03.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:04 np0005603622 nova_compute[221301]: 2026-01-31 08:29:04.493 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:04 np0005603622 nova_compute[221301]: 2026-01-31 08:29:04.750 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848129.7485292, 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:29:04 np0005603622 nova_compute[221301]: 2026-01-31 08:29:04.750 221324 INFO nova.compute.manager [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:29:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e307 e307: 3 total, 3 up, 3 in
Jan 31 03:29:04 np0005603622 nova_compute[221301]: 2026-01-31 08:29:04.991 221324 DEBUG nova.compute.manager [None req-597c060c-a66a-47e9-a29d-c05bc2633ecf - - - - - -] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:29:05 np0005603622 nova_compute[221301]: 2026-01-31 08:29:05.131 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:29:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:05.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:29:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:29:05.519 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:29:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:05.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.014 221324 DEBUG nova.network.neutron [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.257 221324 DEBUG nova.compute.manager [req-296a8f39-5fd0-42a8-a04e-17ece1cef100 req-9fcd60a2-47aa-4a46-b837-e016bb8907b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Received event network-vif-deleted-f854c0da-0188-4d67-8753-5ebedf3523f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.257 221324 INFO nova.compute.manager [req-296a8f39-5fd0-42a8-a04e-17ece1cef100 req-9fcd60a2-47aa-4a46-b837-e016bb8907b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Neutron deleted interface f854c0da-0188-4d67-8753-5ebedf3523f4; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.258 221324 DEBUG nova.network.neutron [req-296a8f39-5fd0-42a8-a04e-17ece1cef100 req-9fcd60a2-47aa-4a46-b837-e016bb8907b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.286 221324 INFO nova.compute.manager [-] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Took 8.34 seconds to deallocate network for instance.#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.590 221324 DEBUG nova.compute.manager [req-296a8f39-5fd0-42a8-a04e-17ece1cef100 req-9fcd60a2-47aa-4a46-b837-e016bb8907b1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28] Detach interface failed, port_id=f854c0da-0188-4d67-8753-5ebedf3523f4, reason: Instance 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.742 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.743 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:06 np0005603622 nova_compute[221301]: 2026-01-31 08:29:06.814 221324 DEBUG oslo_concurrency.processutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:29:07 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3910124881' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:29:07 np0005603622 nova_compute[221301]: 2026-01-31 08:29:07.275 221324 DEBUG oslo_concurrency.processutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:07 np0005603622 nova_compute[221301]: 2026-01-31 08:29:07.283 221324 DEBUG nova.compute.provider_tree [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:29:07 np0005603622 nova_compute[221301]: 2026-01-31 08:29:07.344 221324 DEBUG nova.scheduler.client.report [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:29:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:07.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:07 np0005603622 nova_compute[221301]: 2026-01-31 08:29:07.383 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:07 np0005603622 nova_compute[221301]: 2026-01-31 08:29:07.548 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:07 np0005603622 nova_compute[221301]: 2026-01-31 08:29:07.628 221324 INFO nova.scheduler.client.report [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Deleted allocations for instance 5b114bfd-d1bc-4c02-bb60-5ba6b4395e28#033[00m
Jan 31 03:29:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:07.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:08 np0005603622 nova_compute[221301]: 2026-01-31 08:29:08.219 221324 DEBUG oslo_concurrency.lockutils [None req-dd919368-f63b-4867-968e-136619883b51 1d03198d8ab846bda092e089b2d5a6c7 5b87da3b3f42494f96baeeeaf60b54df - - default default] Lock "5b114bfd-d1bc-4c02-bb60-5ba6b4395e28" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 18.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:09.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:09.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:10 np0005603622 nova_compute[221301]: 2026-01-31 08:29:10.134 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.038687) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151038787, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 950, "num_deletes": 253, "total_data_size": 1829100, "memory_usage": 1860448, "flush_reason": "Manual Compaction"}
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151125102, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 1205899, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54493, "largest_seqno": 55438, "table_properties": {"data_size": 1201456, "index_size": 2095, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10360, "raw_average_key_size": 20, "raw_value_size": 1192318, "raw_average_value_size": 2337, "num_data_blocks": 91, "num_entries": 510, "num_filter_entries": 510, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848090, "oldest_key_time": 1769848090, "file_creation_time": 1769848151, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 86480 microseconds, and 3970 cpu microseconds.
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.125179) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 1205899 bytes OK
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.125209) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.131349) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.131413) EVENT_LOG_v1 {"time_micros": 1769848151131397, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.131452) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 1824273, prev total WAL file size 1824273, number of live WAL files 2.
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.132671) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(1177KB)], [105(13MB)]
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151132811, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 15613796, "oldest_snapshot_seqno": -1}
Jan 31 03:29:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:11.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 7984 keys, 13671751 bytes, temperature: kUnknown
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151600519, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 13671751, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13615836, "index_size": 34875, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19973, "raw_key_size": 206876, "raw_average_key_size": 25, "raw_value_size": 13471300, "raw_average_value_size": 1687, "num_data_blocks": 1375, "num_entries": 7984, "num_filter_entries": 7984, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848151, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.601061) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 13671751 bytes
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.622623) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 33.4 rd, 29.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 13.7 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(24.3) write-amplify(11.3) OK, records in: 8510, records dropped: 526 output_compression: NoCompression
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.622686) EVENT_LOG_v1 {"time_micros": 1769848151622664, "job": 66, "event": "compaction_finished", "compaction_time_micros": 467901, "compaction_time_cpu_micros": 45578, "output_level": 6, "num_output_files": 1, "total_output_size": 13671751, "num_input_records": 8510, "num_output_records": 7984, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151623190, "job": 66, "event": "table_file_deletion", "file_number": 107}
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848151625981, "job": 66, "event": "table_file_deletion", "file_number": 105}
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.132424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.626078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.626088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.626092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.626094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:11.626096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:11.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:12 np0005603622 nova_compute[221301]: 2026-01-31 08:29:12.416 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:13.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:13.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e308 e308: 3 total, 3 up, 3 in
Jan 31 03:29:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:29:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2157142077' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:29:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:29:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2157142077' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:29:15 np0005603622 nova_compute[221301]: 2026-01-31 08:29:15.137 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:15.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:15.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:17.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:17 np0005603622 nova_compute[221301]: 2026-01-31 08:29:17.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:17.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:19.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:19.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:20 np0005603622 nova_compute[221301]: 2026-01-31 08:29:20.140 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:21.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:21.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:22 np0005603622 nova_compute[221301]: 2026-01-31 08:29:22.475 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:23.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:23.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:29:24Z|00585|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:29:24 np0005603622 nova_compute[221301]: 2026-01-31 08:29:24.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:25 np0005603622 nova_compute[221301]: 2026-01-31 08:29:25.143 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:25.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:25.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:27.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:27 np0005603622 nova_compute[221301]: 2026-01-31 08:29:27.500 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:27.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:29 np0005603622 podman[271114]: 2026-01-31 08:29:29.163090666 +0000 UTC m=+0.071588345 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:29:29 np0005603622 podman[271113]: 2026-01-31 08:29:29.184734448 +0000 UTC m=+0.096340651 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:29:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:29.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:29.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:29:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:29:30 np0005603622 nova_compute[221301]: 2026-01-31 08:29:30.147 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:29:30.884 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:29:30.884 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:29:30.885 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:29:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:31.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:29:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e309 e309: 3 total, 3 up, 3 in
Jan 31 03:29:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:31.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:32 np0005603622 nova_compute[221301]: 2026-01-31 08:29:32.501 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:33.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:33.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.817131) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848174817268, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 526, "num_deletes": 250, "total_data_size": 746706, "memory_usage": 756896, "flush_reason": "Manual Compaction"}
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848174844427, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 400096, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55444, "largest_seqno": 55964, "table_properties": {"data_size": 397401, "index_size": 731, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7410, "raw_average_key_size": 20, "raw_value_size": 391745, "raw_average_value_size": 1106, "num_data_blocks": 32, "num_entries": 354, "num_filter_entries": 354, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848152, "oldest_key_time": 1769848152, "file_creation_time": 1769848174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 27380 microseconds, and 3624 cpu microseconds.
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.844509) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 400096 bytes OK
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.844544) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.863898) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.863974) EVENT_LOG_v1 {"time_micros": 1769848174863956, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.864008) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 743602, prev total WAL file size 743602, number of live WAL files 2.
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.864861) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373533' seq:72057594037927935, type:22 .. '6D6772737461740032303034' seq:0, type:0; will stop at (end)
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(390KB)], [108(13MB)]
Jan 31 03:29:34 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848174864935, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 14071847, "oldest_snapshot_seqno": -1}
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 7828 keys, 10300128 bytes, temperature: kUnknown
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848175010935, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 10300128, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10249752, "index_size": 29720, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19589, "raw_key_size": 203894, "raw_average_key_size": 26, "raw_value_size": 10112338, "raw_average_value_size": 1291, "num_data_blocks": 1160, "num_entries": 7828, "num_filter_entries": 7828, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.011350) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 10300128 bytes
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.021893) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 96.3 rd, 70.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 13.0 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(60.9) write-amplify(25.7) OK, records in: 8338, records dropped: 510 output_compression: NoCompression
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.021920) EVENT_LOG_v1 {"time_micros": 1769848175021909, "job": 68, "event": "compaction_finished", "compaction_time_micros": 146166, "compaction_time_cpu_micros": 48006, "output_level": 6, "num_output_files": 1, "total_output_size": 10300128, "num_input_records": 8338, "num_output_records": 7828, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848175022100, "job": 68, "event": "table_file_deletion", "file_number": 110}
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848175023421, "job": 68, "event": "table_file_deletion", "file_number": 108}
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:34.864738) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.023615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.023624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.023627) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.023630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:29:35.023633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:29:35 np0005603622 nova_compute[221301]: 2026-01-31 08:29:35.150 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:35.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:29:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:35.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:29:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:37.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:37 np0005603622 nova_compute[221301]: 2026-01-31 08:29:37.504 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:29:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 31 03:29:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:39.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 31 03:29:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:39.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 e310: 3 total, 3 up, 3 in
Jan 31 03:29:40 np0005603622 nova_compute[221301]: 2026-01-31 08:29:40.153 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:41.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:41.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:42 np0005603622 nova_compute[221301]: 2026-01-31 08:29:42.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:43 np0005603622 nova_compute[221301]: 2026-01-31 08:29:43.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:43 np0005603622 nova_compute[221301]: 2026-01-31 08:29:43.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:29:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:43.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:43.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:44 np0005603622 nova_compute[221301]: 2026-01-31 08:29:44.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:45 np0005603622 nova_compute[221301]: 2026-01-31 08:29:45.155 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:45 np0005603622 nova_compute[221301]: 2026-01-31 08:29:45.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:45.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:45 np0005603622 nova_compute[221301]: 2026-01-31 08:29:45.668 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:45 np0005603622 nova_compute[221301]: 2026-01-31 08:29:45.669 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:45 np0005603622 nova_compute[221301]: 2026-01-31 08:29:45.669 221324 INFO nova.compute.manager [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Unshelving#033[00m
Jan 31 03:29:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:45.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:46 np0005603622 nova_compute[221301]: 2026-01-31 08:29:46.056 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:46 np0005603622 nova_compute[221301]: 2026-01-31 08:29:46.056 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:46 np0005603622 nova_compute[221301]: 2026-01-31 08:29:46.061 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'pci_requests' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:46 np0005603622 nova_compute[221301]: 2026-01-31 08:29:46.228 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'numa_topology' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:46 np0005603622 nova_compute[221301]: 2026-01-31 08:29:46.308 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:29:46 np0005603622 nova_compute[221301]: 2026-01-31 08:29:46.308 221324 INFO nova.compute.claims [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:29:47 np0005603622 nova_compute[221301]: 2026-01-31 08:29:47.192 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:47.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:47 np0005603622 nova_compute[221301]: 2026-01-31 08:29:47.538 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:29:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2592823740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:29:47 np0005603622 nova_compute[221301]: 2026-01-31 08:29:47.725 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:47 np0005603622 nova_compute[221301]: 2026-01-31 08:29:47.731 221324 DEBUG nova.compute.provider_tree [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:29:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:47.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:47 np0005603622 nova_compute[221301]: 2026-01-31 08:29:47.796 221324 DEBUG nova.scheduler.client.report [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:29:47 np0005603622 nova_compute[221301]: 2026-01-31 08:29:47.995 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:48 np0005603622 nova_compute[221301]: 2026-01-31 08:29:48.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:48 np0005603622 nova_compute[221301]: 2026-01-31 08:29:48.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:48 np0005603622 nova_compute[221301]: 2026-01-31 08:29:48.937 221324 INFO nova.network.neutron [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:29:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:49.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:49.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:50 np0005603622 nova_compute[221301]: 2026-01-31 08:29:50.159 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:50 np0005603622 nova_compute[221301]: 2026-01-31 08:29:50.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:50 np0005603622 nova_compute[221301]: 2026-01-31 08:29:50.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:29:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:51.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:51.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.025 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.026 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.026 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.238 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.239 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.239 221324 DEBUG nova.network.neutron [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.373 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:52 np0005603622 nova_compute[221301]: 2026-01-31 08:29:52.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:53.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:53.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:53 np0005603622 nova_compute[221301]: 2026-01-31 08:29:53.916 221324 DEBUG nova.compute.manager [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:29:53 np0005603622 nova_compute[221301]: 2026-01-31 08:29:53.917 221324 DEBUG nova.compute.manager [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing instance network info cache due to event network-changed-58bbf6e9-a33b-4f2b-81e8-812adc1221b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:29:53 np0005603622 nova_compute[221301]: 2026-01-31 08:29:53.917 221324 DEBUG oslo_concurrency.lockutils [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:29:55 np0005603622 nova_compute[221301]: 2026-01-31 08:29:55.161 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:55.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:29:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:55.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:29:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.372 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updating instance_info_cache with network_info: [{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.469 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.469 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.470 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.471 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.471 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.551 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.552 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.552 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.553 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:29:56 np0005603622 nova_compute[221301]: 2026-01-31 08:29:56.553 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:29:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2845790041' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.095 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.267 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.268 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.359 221324 DEBUG nova.network.neutron [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.430 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.431 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4314MB free_disk=20.89712142944336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:57.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.467 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.470 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.471 221324 INFO nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Creating image(s)#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.507 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.513 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.515 221324 DEBUG oslo_concurrency.lockutils [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.516 221324 DEBUG nova.network.neutron [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Refreshing network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.542 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.587 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.621 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.628 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "c267d5efa863f03e26a39d617f781b76cbb3cfe3" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.630 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "c267d5efa863f03e26a39d617f781b76cbb3cfe3" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.670 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 884be038-2a08-4b69-a345-5b6ac2661dfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.671 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance b9f38b79-63fc-48a1-a367-6998b8d6a9dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.671 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.671 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:29:57 np0005603622 nova_compute[221301]: 2026-01-31 08:29:57.786 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:29:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:57.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:29:58 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/542080142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.299 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.306 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.350 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.439 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.440 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.509 221324 DEBUG nova.virt.libvirt.imagebackend [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0e9c5d39-8319-43cb-ac29-5664ec7c2a71/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0e9c5d39-8319-43cb-ac29-5664ec7c2a71/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.590 221324 DEBUG nova.virt.libvirt.imagebackend [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0e9c5d39-8319-43cb-ac29-5664ec7c2a71/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:29:58 np0005603622 nova_compute[221301]: 2026-01-31 08:29:58.592 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] cloning images/0e9c5d39-8319-43cb-ac29-5664ec7c2a71@snap to None/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:29:59 np0005603622 nova_compute[221301]: 2026-01-31 08:29:59.017 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "c267d5efa863f03e26a39d617f781b76cbb3cfe3" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:29:59 np0005603622 nova_compute[221301]: 2026-01-31 08:29:59.177 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'migration_context' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:29:59 np0005603622 nova_compute[221301]: 2026-01-31 08:29:59.285 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] flattening vms/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:29:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:29:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:29:59.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:29:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:29:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:29:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:29:59.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:29:59 np0005603622 nova_compute[221301]: 2026-01-31 08:29:59.853 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:00 np0005603622 podman[271490]: 2026-01-31 08:30:00.147595173 +0000 UTC m=+0.071623606 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:30:00 np0005603622 nova_compute[221301]: 2026-01-31 08:30:00.163 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:00 np0005603622 podman[271489]: 2026-01-31 08:30:00.168349721 +0000 UTC m=+0.092945759 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller)
Jan 31 03:30:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:01 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 03:30:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:30:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:01.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.776 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Image rbd:vms/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.777 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.777 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Ensure instance console log exists: /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.777 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.778 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.778 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.781 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Start _get_guest_xml network_info=[{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:28:42Z,direct_url=<?>,disk_format='raw',id=0e9c5d39-8319-43cb-ac29-5664ec7c2a71,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-2092089502-shelved',owner='491937de020742d7b4e847dc3bf57950',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:29:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.785 221324 WARNING nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.788 221324 DEBUG nova.virt.libvirt.host [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.789 221324 DEBUG nova.virt.libvirt.host [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.799 221324 DEBUG nova.virt.libvirt.host [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.800 221324 DEBUG nova.virt.libvirt.host [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.801 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.801 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:28:42Z,direct_url=<?>,disk_format='raw',id=0e9c5d39-8319-43cb-ac29-5664ec7c2a71,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-2092089502-shelved',owner='491937de020742d7b4e847dc3bf57950',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:29:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.802 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.802 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.802 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.803 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.803 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.803 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.804 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.804 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.804 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.804 221324 DEBUG nova.virt.hardware [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.805 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:01.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:01 np0005603622 nova_compute[221301]: 2026-01-31 08:30:01.886 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:02 np0005603622 nova_compute[221301]: 2026-01-31 08:30:02.045 221324 DEBUG nova.network.neutron [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updated VIF entry in instance network info cache for port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:30:02 np0005603622 nova_compute[221301]: 2026-01-31 08:30:02.046 221324 DEBUG nova.network.neutron [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [{"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:02 np0005603622 nova_compute[221301]: 2026-01-31 08:30:02.087 221324 DEBUG oslo_concurrency.lockutils [req-4abfddd4-9b3c-492a-b7b4-c3aa894501b5 req-1073d5ae-f145-49b5-9369-0b42728e8b91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-b9f38b79-63fc-48a1-a367-6998b8d6a9dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:30:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2605049488' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:30:02 np0005603622 nova_compute[221301]: 2026-01-31 08:30:02.369 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:02 np0005603622 nova_compute[221301]: 2026-01-31 08:30:02.399 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:02 np0005603622 nova_compute[221301]: 2026-01-31 08:30:02.405 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:02 np0005603622 nova_compute[221301]: 2026-01-31 08:30:02.545 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:30:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2023718486' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.207 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.803s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.209 221324 DEBUG nova.virt.libvirt.vif [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2092089502',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2092089502',id=123,image_ref='0e9c5d39-8319-43cb-ac29-5664ec7c2a71',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1914515639',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:28:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-zq57k11v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member',shelved_at='2026-01-31T08:29:05.068624',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='0e9c5d39-8319-43cb-ac29-5664ec7c2a71'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:29:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=b9f38b79-63fc-48a1-a367-6998b8d6a9dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.209 221324 DEBUG nova.network.os_vif_util [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.210 221324 DEBUG nova.network.os_vif_util [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.212 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.368 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <uuid>b9f38b79-63fc-48a1-a367-6998b8d6a9dc</uuid>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <name>instance-0000007b</name>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-2092089502</nova:name>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:30:01</nova:creationTime>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:user uuid="432ac8867d8240408db455fc25bb5901">tempest-AttachVolumeShelveTestJSON-60119558-project-member</nova:user>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:project uuid="491937de020742d7b4e847dc3bf57950">tempest-AttachVolumeShelveTestJSON-60119558</nova:project>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="0e9c5d39-8319-43cb-ac29-5664ec7c2a71"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <nova:port uuid="58bbf6e9-a33b-4f2b-81e8-812adc1221b5">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <entry name="serial">b9f38b79-63fc-48a1-a367-6998b8d6a9dc</entry>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <entry name="uuid">b9f38b79-63fc-48a1-a367-6998b8d6a9dc</entry>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:ff:fd:8c"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <target dev="tap58bbf6e9-a3"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/console.log" append="off"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:30:03 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:30:03 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:30:03 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:30:03 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.370 221324 DEBUG nova.compute.manager [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Preparing to wait for external event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.371 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.371 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.372 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.373 221324 DEBUG nova.virt.libvirt.vif [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2092089502',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2092089502',id=123,image_ref='0e9c5d39-8319-43cb-ac29-5664ec7c2a71',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1914515639',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:28:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-zq57k11v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member',shelved_at='2026-01-31T08:29:05.068624',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='0e9c5d39-8319-43cb-ac29-5664ec7c2a71'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:29:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=b9f38b79-63fc-48a1-a367-6998b8d6a9dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.373 221324 DEBUG nova.network.os_vif_util [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.374 221324 DEBUG nova.network.os_vif_util [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.374 221324 DEBUG os_vif [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.375 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.376 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.376 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.381 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.381 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58bbf6e9-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.382 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58bbf6e9-a3, col_values=(('external_ids', {'iface-id': '58bbf6e9-a33b-4f2b-81e8-812adc1221b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:fd:8c', 'vm-uuid': 'b9f38b79-63fc-48a1-a367-6998b8d6a9dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.423 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:03 np0005603622 NetworkManager[49080]: <info>  [1769848203.4264] manager: (tap58bbf6e9-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.430 221324 INFO os_vif [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3')#033[00m
Jan 31 03:30:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:03.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.597 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.598 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.598 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] No VIF found with MAC fa:16:3e:ff:fd:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.598 221324 INFO nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Using config drive#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.622 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:03 np0005603622 nova_compute[221301]: 2026-01-31 08:30:03.697 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'ec2_ids' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:03.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:04 np0005603622 nova_compute[221301]: 2026-01-31 08:30:04.060 221324 DEBUG nova.objects.instance [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'keypairs' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:04 np0005603622 nova_compute[221301]: 2026-01-31 08:30:04.946 221324 INFO nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Creating config drive at /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config#033[00m
Jan 31 03:30:04 np0005603622 nova_compute[221301]: 2026-01-31 08:30:04.951 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgwsay483 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:05 np0005603622 nova_compute[221301]: 2026-01-31 08:30:05.080 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgwsay483" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:05 np0005603622 nova_compute[221301]: 2026-01-31 08:30:05.115 221324 DEBUG nova.storage.rbd_utils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] rbd image b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:05 np0005603622 nova_compute[221301]: 2026-01-31 08:30:05.120 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:05.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:05.781 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:30:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:05.783 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:30:05 np0005603622 nova_compute[221301]: 2026-01-31 08:30:05.790 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:05.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.121 221324 DEBUG oslo_concurrency.processutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config b9f38b79-63fc-48a1-a367-6998b8d6a9dc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.001s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.123 221324 INFO nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Deleting local config drive /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc/disk.config because it was imported into RBD.#033[00m
Jan 31 03:30:07 np0005603622 kernel: tap58bbf6e9-a3: entered promiscuous mode
Jan 31 03:30:07 np0005603622 NetworkManager[49080]: <info>  [1769848207.1716] manager: (tap58bbf6e9-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/274)
Jan 31 03:30:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:07Z|00586|binding|INFO|Claiming lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for this chassis.
Jan 31 03:30:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:07Z|00587|binding|INFO|58bbf6e9-a33b-4f2b-81e8-812adc1221b5: Claiming fa:16:3e:ff:fd:8c 10.100.0.11
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.174 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:07Z|00588|binding|INFO|Setting lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 ovn-installed in OVS
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.189 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 systemd-machined[190406]: New machine qemu-60-instance-0000007b.
Jan 31 03:30:07 np0005603622 systemd-udevd[271670]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:30:07 np0005603622 systemd[1]: Started Virtual Machine qemu-60-instance-0000007b.
Jan 31 03:30:07 np0005603622 NetworkManager[49080]: <info>  [1769848207.2235] device (tap58bbf6e9-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:30:07 np0005603622 NetworkManager[49080]: <info>  [1769848207.2241] device (tap58bbf6e9-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:30:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:07.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:07Z|00589|binding|INFO|Setting lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 up in Southbound
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.478 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:fd:8c 10.100.0.11'], port_security=['fa:16:3e:ff:fd:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b9f38b79-63fc-48a1-a367-6998b8d6a9dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1bba0198-0b6e-463b-bfab-427572262107', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=58bbf6e9-a33b-4f2b-81e8-812adc1221b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.480 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 in datapath 6525247d-48b2-4359-a813-d7276403ba32 bound to our chassis#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.481 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6525247d-48b2-4359-a813-d7276403ba32#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.493 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8731931e-1d23-4b36-9c44-678017268092]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.494 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6525247d-41 in ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.498 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6525247d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.499 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[57554b6a-ae94-498b-ad9d-d6625b5c041c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.499 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[feac0848-5ec1-4d5b-ae31-b21645665f0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.516 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[1920e849-7983-418a-b661-5b4a231abb1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.531 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[81cfcf8f-e3f9-49cd-a902-cb97b68c1545]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.546 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.558 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e20984a1-791a-4c5f-a980-1b8e4940ccb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.563 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8490fc03-3677-47cb-b789-c2ba137b2f57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 NetworkManager[49080]: <info>  [1769848207.5641] manager: (tap6525247d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/275)
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.589 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb4aa7b-8bac-4117-bfab-06020b148da6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.592 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[18aee565-dae9-4caa-a92e-492706adcd5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 NetworkManager[49080]: <info>  [1769848207.6098] device (tap6525247d-40): carrier: link connected
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.616 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d54f1183-984d-4e64-8ca9-d961bcb2ece3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.633 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8f49cefa-373d-4288-93f7-be4b13531985]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 745895, 'reachable_time': 28433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271739, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.644 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[32be855a-b1c4-4259-be3b-c7c8d102dd81]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:c843'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 745894, 'tstamp': 745894}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271740, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.658 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88ffb074-e550-4490-b44d-fc8bbfa37fa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6525247d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:c8:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 745895, 'reachable_time': 28433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271741, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.685 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[54ef42f2-0ae3-4224-a4f1-8319e613f72a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.737 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5eb0a9-e45f-42ab-8885-20e7092cc611]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.739 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.740 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.740 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6525247d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:07 np0005603622 NetworkManager[49080]: <info>  [1769848207.7442] manager: (tap6525247d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.743 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 kernel: tap6525247d-40: entered promiscuous mode
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.746 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.748 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6525247d-40, col_values=(('external_ids', {'iface-id': '044f1919-2550-4bba-9baa-5d3f39f69ec6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.749 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:07Z|00590|binding|INFO|Releasing lport 044f1919-2550-4bba-9baa-5d3f39f69ec6 from this chassis (sb_readonly=0)
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.750 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.750 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.751 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea1f31e-ec34-4398-a5b7-ed5fa03deb1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.752 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/6525247d-48b2-4359-a813-d7276403ba32.pid.haproxy
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 6525247d-48b2-4359-a813-d7276403ba32
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:30:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:07.753 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'env', 'PROCESS_TAG=haproxy-6525247d-48b2-4359-a813-d7276403ba32', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6525247d-48b2-4359-a813-d7276403ba32.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:07.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.855 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848207.8545337, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.855 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Started (Lifecycle Event)#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.900 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.906 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848207.8595295, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:07 np0005603622 nova_compute[221301]: 2026-01-31 08:30:07.906 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:30:08 np0005603622 nova_compute[221301]: 2026-01-31 08:30:08.015 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:08 np0005603622 nova_compute[221301]: 2026-01-31 08:30:08.021 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:30:08 np0005603622 podman[271779]: 2026-01-31 08:30:08.105686249 +0000 UTC m=+0.052265846 container create f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:30:08 np0005603622 systemd[1]: Started libpod-conmon-f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6.scope.
Jan 31 03:30:08 np0005603622 nova_compute[221301]: 2026-01-31 08:30:08.156 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:30:08 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:30:08 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11d3392cf61b624201e5528f9f7910a2e726bcd1462d6a0859c4ef8586b07649/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:30:08 np0005603622 podman[271779]: 2026-01-31 08:30:08.07671704 +0000 UTC m=+0.023296667 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:30:08 np0005603622 podman[271779]: 2026-01-31 08:30:08.177967332 +0000 UTC m=+0.124546979 container init f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:30:08 np0005603622 podman[271779]: 2026-01-31 08:30:08.182522174 +0000 UTC m=+0.129101781 container start f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:30:08 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[271794]: [NOTICE]   (271798) : New worker (271800) forked
Jan 31 03:30:08 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[271794]: [NOTICE]   (271798) : Loading success.
Jan 31 03:30:08 np0005603622 nova_compute[221301]: 2026-01-31 08:30:08.423 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:30:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:09.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:30:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:09.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.700 221324 DEBUG nova.compute.manager [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.700 221324 DEBUG oslo_concurrency.lockutils [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.701 221324 DEBUG oslo_concurrency.lockutils [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.702 221324 DEBUG oslo_concurrency.lockutils [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.702 221324 DEBUG nova.compute.manager [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Processing event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.702 221324 DEBUG nova.compute.manager [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.703 221324 DEBUG oslo_concurrency.lockutils [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.703 221324 DEBUG oslo_concurrency.lockutils [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.704 221324 DEBUG oslo_concurrency.lockutils [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.704 221324 DEBUG nova.compute.manager [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] No waiting events found dispatching network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.705 221324 WARNING nova.compute.manager [req-8ee3c87e-60e0-41dd-89a7-75a60399092a req-0473f5a5-7aba-423d-a45f-b29d1641dc52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received unexpected event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.706 221324 DEBUG nova.compute.manager [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.713 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848210.7128384, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.713 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.716 221324 DEBUG nova.virt.libvirt.driver [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.721 221324 INFO nova.virt.libvirt.driver [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance spawned successfully.#033[00m
Jan 31 03:30:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:10.785 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.927 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:10 np0005603622 nova_compute[221301]: 2026-01-31 08:30:10.933 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:30:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:11 np0005603622 nova_compute[221301]: 2026-01-31 08:30:11.088 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:30:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:11.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:11.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:12 np0005603622 nova_compute[221301]: 2026-01-31 08:30:12.593 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:13 np0005603622 nova_compute[221301]: 2026-01-31 08:30:13.427 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:13.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:13.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e311 e311: 3 total, 3 up, 3 in
Jan 31 03:30:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:15.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:15.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:17 np0005603622 nova_compute[221301]: 2026-01-31 08:30:17.452 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:17.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:17 np0005603622 nova_compute[221301]: 2026-01-31 08:30:17.595 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:17 np0005603622 nova_compute[221301]: 2026-01-31 08:30:17.811 221324 DEBUG nova.compute.manager [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:17.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:18 np0005603622 nova_compute[221301]: 2026-01-31 08:30:18.200 221324 DEBUG oslo_concurrency.lockutils [None req-16c9dc95-d865-46ab-afa2-8a24e866255e 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 32.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:18 np0005603622 nova_compute[221301]: 2026-01-31 08:30:18.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:19.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 e312: 3 total, 3 up, 3 in
Jan 31 03:30:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:19.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:21.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:21.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:22 np0005603622 nova_compute[221301]: 2026-01-31 08:30:22.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:23 np0005603622 nova_compute[221301]: 2026-01-31 08:30:23.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:23.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:24Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:fd:8c 10.100.0.11
Jan 31 03:30:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:25.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:25.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:26 np0005603622 nova_compute[221301]: 2026-01-31 08:30:26.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:27.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:27 np0005603622 nova_compute[221301]: 2026-01-31 08:30:27.605 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:27.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:28 np0005603622 nova_compute[221301]: 2026-01-31 08:30:28.472 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:29.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:29.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:30.885 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:30.886 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:30.888 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:31 np0005603622 podman[271810]: 2026-01-31 08:30:31.136265868 +0000 UTC m=+0.049062909 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 03:30:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:31 np0005603622 podman[271809]: 2026-01-31 08:30:31.187255859 +0000 UTC m=+0.100083501 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:30:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:31.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:31.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:32 np0005603622 nova_compute[221301]: 2026-01-31 08:30:32.608 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:33 np0005603622 nova_compute[221301]: 2026-01-31 08:30:33.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:33.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:33.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:33 np0005603622 nova_compute[221301]: 2026-01-31 08:30:33.893 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:33 np0005603622 nova_compute[221301]: 2026-01-31 08:30:33.894 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:34 np0005603622 nova_compute[221301]: 2026-01-31 08:30:34.502 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:30:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:35.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:35 np0005603622 nova_compute[221301]: 2026-01-31 08:30:35.658 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:35 np0005603622 nova_compute[221301]: 2026-01-31 08:30:35.658 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:35 np0005603622 nova_compute[221301]: 2026-01-31 08:30:35.701 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:30:35 np0005603622 nova_compute[221301]: 2026-01-31 08:30:35.702 221324 INFO nova.compute.claims [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:30:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:35.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:36 np0005603622 nova_compute[221301]: 2026-01-31 08:30:36.766 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/449198042' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.235 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.240 221324 DEBUG nova.compute.provider_tree [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.346 221324 DEBUG nova.scheduler.client.report [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.367 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.367 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.368 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.368 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.369 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.370 221324 INFO nova.compute.manager [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Terminating instance#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.371 221324 DEBUG nova.compute.manager [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.390 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.391 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.464 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.464 221324 DEBUG nova.network.neutron [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:30:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:37.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.542 221324 INFO nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.586 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.609 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:37 np0005603622 kernel: tap58bbf6e9-a3 (unregistering): left promiscuous mode
Jan 31 03:30:37 np0005603622 NetworkManager[49080]: <info>  [1769848237.7245] device (tap58bbf6e9-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:37Z|00591|binding|INFO|Releasing lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 from this chassis (sb_readonly=0)
Jan 31 03:30:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:37Z|00592|binding|INFO|Setting lport 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 down in Southbound
Jan 31 03:30:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:37Z|00593|binding|INFO|Removing iface tap58bbf6e9-a3 ovn-installed in OVS
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.741 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:37.746 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:fd:8c 10.100.0.11'], port_security=['fa:16:3e:ff:fd:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b9f38b79-63fc-48a1-a367-6998b8d6a9dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6525247d-48b2-4359-a813-d7276403ba32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '491937de020742d7b4e847dc3bf57950', 'neutron:revision_number': '9', 'neutron:security_group_ids': '1bba0198-0b6e-463b-bfab-427572262107', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.192', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c7370ba-0307-4b10-bef7-8ff686d828f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=58bbf6e9-a33b-4f2b-81e8-812adc1221b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:30:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:37.748 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 58bbf6e9-a33b-4f2b-81e8-812adc1221b5 in datapath 6525247d-48b2-4359-a813-d7276403ba32 unbound from our chassis#033[00m
Jan 31 03:30:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:37.749 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6525247d-48b2-4359-a813-d7276403ba32, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:30:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:37.751 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0293026b-6ec3-4cd2-ab65-8276ad73497b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:37.752 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 namespace which is not needed anymore#033[00m
Jan 31 03:30:37 np0005603622 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Jan 31 03:30:37 np0005603622 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007b.scope: Consumed 13.974s CPU time.
Jan 31 03:30:37 np0005603622 systemd-machined[190406]: Machine qemu-60-instance-0000007b terminated.
Jan 31 03:30:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:37.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.898 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.900 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.901 221324 INFO nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Creating image(s)#033[00m
Jan 31 03:30:37 np0005603622 nova_compute[221301]: 2026-01-31 08:30:37.955 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:37 np0005603622 kernel: tap58bbf6e9-a3: entered promiscuous mode
Jan 31 03:30:37 np0005603622 kernel: tap58bbf6e9-a3 (unregistering): left promiscuous mode
Jan 31 03:30:38 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[271794]: [NOTICE]   (271798) : haproxy version is 2.8.14-c23fe91
Jan 31 03:30:38 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[271794]: [NOTICE]   (271798) : path to executable is /usr/sbin/haproxy
Jan 31 03:30:38 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[271794]: [WARNING]  (271798) : Exiting Master process...
Jan 31 03:30:38 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[271794]: [ALERT]    (271798) : Current worker (271800) exited with code 143 (Terminated)
Jan 31 03:30:38 np0005603622 neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32[271794]: [WARNING]  (271798) : All workers exited. Exiting... (0)
Jan 31 03:30:38 np0005603622 systemd[1]: libpod-f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6.scope: Deactivated successfully.
Jan 31 03:30:38 np0005603622 podman[271900]: 2026-01-31 08:30:38.169198585 +0000 UTC m=+0.333035173 container died f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.577 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.621 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.627 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.648 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.655 221324 DEBUG nova.policy [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb3f20f0143d465ebfe98f6a13200890', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '40db421b27d84f809f8074c58151327f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.660 221324 DEBUG nova.compute.manager [req-815c11ac-b85f-4c89-8886-85de909abbfe req-60b3fa5c-3b8a-4606-8b1a-7b014a921088 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-unplugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.660 221324 DEBUG oslo_concurrency.lockutils [req-815c11ac-b85f-4c89-8886-85de909abbfe req-60b3fa5c-3b8a-4606-8b1a-7b014a921088 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.661 221324 DEBUG oslo_concurrency.lockutils [req-815c11ac-b85f-4c89-8886-85de909abbfe req-60b3fa5c-3b8a-4606-8b1a-7b014a921088 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.661 221324 DEBUG oslo_concurrency.lockutils [req-815c11ac-b85f-4c89-8886-85de909abbfe req-60b3fa5c-3b8a-4606-8b1a-7b014a921088 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.661 221324 DEBUG nova.compute.manager [req-815c11ac-b85f-4c89-8886-85de909abbfe req-60b3fa5c-3b8a-4606-8b1a-7b014a921088 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] No waiting events found dispatching network-vif-unplugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.661 221324 DEBUG nova.compute.manager [req-815c11ac-b85f-4c89-8886-85de909abbfe req-60b3fa5c-3b8a-4606-8b1a-7b014a921088 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-unplugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.666 221324 INFO nova.virt.libvirt.driver [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Instance destroyed successfully.#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.666 221324 DEBUG nova.objects.instance [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lazy-loading 'resources' on Instance uuid b9f38b79-63fc-48a1-a367-6998b8d6a9dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.693 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.694 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.695 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.695 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.731 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.738 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.766 221324 DEBUG nova.virt.libvirt.vif [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2092089502',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-2092089502',id=123,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDqJ2u98tnCfusFKrUeql0ngSDNf86DLAElp/RNmhRZkam9aFuB8mUdP/dAMmSCZVQ6AaZGjQO8tc+tThhzKBQRodouufnRusHHQiOXeUQ9hnIPnIcTcQ3b1LbRSS3JzxA==',key_name='tempest-keypair-1914515639',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:30:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='491937de020742d7b4e847dc3bf57950',ramdisk_id='',reservation_id='r-zq57k11v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-60119558',owner_user_name='tempest-AttachVolumeShelveTestJSON-60119558-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:30:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='432ac8867d8240408db455fc25bb5901',uuid=b9f38b79-63fc-48a1-a367-6998b8d6a9dc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.767 221324 DEBUG nova.network.os_vif_util [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converting VIF {"id": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "address": "fa:16:3e:ff:fd:8c", "network": {"id": "6525247d-48b2-4359-a813-d7276403ba32", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1996499356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "491937de020742d7b4e847dc3bf57950", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58bbf6e9-a3", "ovs_interfaceid": "58bbf6e9-a33b-4f2b-81e8-812adc1221b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.769 221324 DEBUG nova.network.os_vif_util [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.770 221324 DEBUG os_vif [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.774 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.775 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58bbf6e9-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.781 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:30:38 np0005603622 nova_compute[221301]: 2026-01-31 08:30:38.786 221324 INFO os_vif [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:fd:8c,bridge_name='br-int',has_traffic_filtering=True,id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5,network=Network(6525247d-48b2-4359-a813-d7276403ba32),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58bbf6e9-a3')#033[00m
Jan 31 03:30:39 np0005603622 systemd[1]: var-lib-containers-storage-overlay-11d3392cf61b624201e5528f9f7910a2e726bcd1462d6a0859c4ef8586b07649-merged.mount: Deactivated successfully.
Jan 31 03:30:39 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6-userdata-shm.mount: Deactivated successfully.
Jan 31 03:30:39 np0005603622 podman[271900]: 2026-01-31 08:30:39.179149633 +0000 UTC m=+1.342986211 container cleanup f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:30:39 np0005603622 systemd[1]: libpod-conmon-f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6.scope: Deactivated successfully.
Jan 31 03:30:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:39.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:39 np0005603622 podman[272165]: 2026-01-31 08:30:39.617028733 +0000 UTC m=+0.404010521 container remove f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.621 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2742c133-89f5-457c-a423-17803eaade8e]: (4, ('Sat Jan 31 08:30:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6)\nf577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6\nSat Jan 31 08:30:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 (f577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6)\nf577cd7896eb1af384a099c72d6f88377b6a6e625d18081501022cd404b8bda6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.624 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0597fd23-647e-4e27-a1bd-dbf0eef0f306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.625 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6525247d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:39 np0005603622 nova_compute[221301]: 2026-01-31 08:30:39.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:39 np0005603622 kernel: tap6525247d-40: left promiscuous mode
Jan 31 03:30:39 np0005603622 nova_compute[221301]: 2026-01-31 08:30:39.637 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.900s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.640 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7711a650-9b7b-4ec5-8f4c-8c9b0805f0bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.661 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[059f731f-d659-4475-920a-1363335cd928]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.663 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c68c08-ea75-45d1-aaa2-91db98c4e9cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:39 np0005603622 nova_compute[221301]: 2026-01-31 08:30:39.680 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.684 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bd7ae030-ad86-434e-8f21-b141282921da]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 745889, 'reachable_time': 36210, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272213, 'error': None, 'target': 'ovnmeta-6525247d-48b2-4359-a813-d7276403ba32', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.689 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6525247d-48b2-4359-a813-d7276403ba32 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:30:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:39.689 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[a7862fb6-bdf8-4f88-924c-93d9a7616c60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:39 np0005603622 systemd[1]: run-netns-ovnmeta\x2d6525247d\x2d48b2\x2d4359\x2da813\x2dd7276403ba32.mount: Deactivated successfully.
Jan 31 03:30:39 np0005603622 nova_compute[221301]: 2026-01-31 08:30:39.733 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] resizing rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:30:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:39.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:39 np0005603622 nova_compute[221301]: 2026-01-31 08:30:39.988 221324 DEBUG nova.objects.instance [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'migration_context' on Instance uuid e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.040 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.041 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Ensure instance console log exists: /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.042 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.043 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.043 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:30:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:30:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.309 221324 DEBUG nova.network.neutron [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Successfully created port: ebff0bdd-64a8-478c-a24f-94fc3ea465c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.643 221324 DEBUG nova.compute.manager [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.643 221324 DEBUG oslo_concurrency.lockutils [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.644 221324 DEBUG oslo_concurrency.lockutils [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.644 221324 DEBUG oslo_concurrency.lockutils [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.644 221324 DEBUG nova.compute.manager [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] No waiting events found dispatching network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:30:40 np0005603622 nova_compute[221301]: 2026-01-31 08:30:40.645 221324 WARNING nova.compute.manager [req-7f4192bf-1348-4037-bbca-49736756f513 req-0ec09e65-34a2-4181-b8ec-3b48be829117 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received unexpected event network-vif-plugged-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:30:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:41.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:41.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:42 np0005603622 nova_compute[221301]: 2026-01-31 08:30:42.612 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.214 221324 DEBUG nova.network.neutron [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Successfully updated port: ebff0bdd-64a8-478c-a24f-94fc3ea465c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.262 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.263 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquired lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.263 221324 DEBUG nova.network.neutron [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.463 221324 DEBUG nova.compute.manager [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received event network-changed-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.463 221324 DEBUG nova.compute.manager [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Refreshing instance network info cache due to event network-changed-ebff0bdd-64a8-478c-a24f-94fc3ea465c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.464 221324 DEBUG oslo_concurrency.lockutils [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:30:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:43.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:43 np0005603622 nova_compute[221301]: 2026-01-31 08:30:43.779 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:44 np0005603622 nova_compute[221301]: 2026-01-31 08:30:44.088 221324 INFO nova.virt.libvirt.driver [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Deleting instance files /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_del#033[00m
Jan 31 03:30:44 np0005603622 nova_compute[221301]: 2026-01-31 08:30:44.090 221324 INFO nova.virt.libvirt.driver [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Deletion of /var/lib/nova/instances/b9f38b79-63fc-48a1-a367-6998b8d6a9dc_del complete#033[00m
Jan 31 03:30:44 np0005603622 nova_compute[221301]: 2026-01-31 08:30:44.097 221324 DEBUG nova.network.neutron [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:30:44 np0005603622 nova_compute[221301]: 2026-01-31 08:30:44.266 221324 INFO nova.compute.manager [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Took 6.89 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:30:44 np0005603622 nova_compute[221301]: 2026-01-31 08:30:44.266 221324 DEBUG oslo.service.loopingcall [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:30:44 np0005603622 nova_compute[221301]: 2026-01-31 08:30:44.267 221324 DEBUG nova.compute.manager [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:30:44 np0005603622 nova_compute[221301]: 2026-01-31 08:30:44.267 221324 DEBUG nova.network.neutron [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:30:45 np0005603622 nova_compute[221301]: 2026-01-31 08:30:45.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:45 np0005603622 nova_compute[221301]: 2026-01-31 08:30:45.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:45 np0005603622 nova_compute[221301]: 2026-01-31 08:30:45.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:30:45 np0005603622 nova_compute[221301]: 2026-01-31 08:30:45.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:45.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:45Z|00594|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:30:45 np0005603622 nova_compute[221301]: 2026-01-31 08:30:45.582 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:45.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:46 np0005603622 nova_compute[221301]: 2026-01-31 08:30:46.530 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.010 221324 DEBUG nova.network.neutron [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updating instance_info_cache with network_info: [{"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:30:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:30:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:47.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.590 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Releasing lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.590 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Instance network_info: |[{"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.590 221324 DEBUG oslo_concurrency.lockutils [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.591 221324 DEBUG nova.network.neutron [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Refreshing network info cache for port ebff0bdd-64a8-478c-a24f-94fc3ea465c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.593 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Start _get_guest_xml network_info=[{"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.598 221324 WARNING nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.602 221324 DEBUG nova.virt.libvirt.host [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.602 221324 DEBUG nova.virt.libvirt.host [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.605 221324 DEBUG nova.virt.libvirt.host [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.605 221324 DEBUG nova.virt.libvirt.host [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.607 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.607 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.607 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.608 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.608 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.608 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.608 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.609 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.609 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.609 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.610 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.610 221324 DEBUG nova.virt.hardware [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.613 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:47 np0005603622 nova_compute[221301]: 2026-01-31 08:30:47.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:48 np0005603622 nova_compute[221301]: 2026-01-31 08:30:48.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:30:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2111100228' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:30:48 np0005603622 nova_compute[221301]: 2026-01-31 08:30:48.613 221324 DEBUG nova.network.neutron [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:48 np0005603622 nova_compute[221301]: 2026-01-31 08:30:48.760 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.027 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.031 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.061 221324 DEBUG nova.compute.manager [req-cf13a32c-f671-4301-a025-08754a3ffde4 req-905bf1b9-0563-4e04-a53a-931ea5eacce3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Received event network-vif-deleted-58bbf6e9-a33b-4f2b-81e8-812adc1221b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.061 221324 INFO nova.compute.manager [req-cf13a32c-f671-4301-a025-08754a3ffde4 req-905bf1b9-0563-4e04-a53a-931ea5eacce3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Neutron deleted interface 58bbf6e9-a33b-4f2b-81e8-812adc1221b5; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.061 221324 DEBUG nova.network.neutron [req-cf13a32c-f671-4301-a025-08754a3ffde4 req-905bf1b9-0563-4e04-a53a-931ea5eacce3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.063 221324 INFO nova.compute.manager [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Took 4.80 seconds to deallocate network for instance.#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.273 221324 DEBUG nova.compute.manager [req-cf13a32c-f671-4301-a025-08754a3ffde4 req-905bf1b9-0563-4e04-a53a-931ea5eacce3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Detach interface failed, port_id=58bbf6e9-a33b-4f2b-81e8-812adc1221b5, reason: Instance b9f38b79-63fc-48a1-a367-6998b8d6a9dc could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.428 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.429 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:30:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1302541289' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.494 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.496 221324 DEBUG nova.virt.libvirt.vif [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-170523285',display_name='tempest-₡-170523285',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--170523285',id=129,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-l8obdkvp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_cer
ts=None,updated_at=2026-01-31T08:30:37Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.496 221324 DEBUG nova.network.os_vif_util [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.497 221324 DEBUG nova.network.os_vif_util [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:61,bridge_name='br-int',has_traffic_filtering=True,id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebff0bdd-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.498 221324 DEBUG nova.objects.instance [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'pci_devices' on Instance uuid e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:49.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.620 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <uuid>e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c</uuid>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <name>instance-00000081</name>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <nova:name>tempest-₡-170523285</nova:name>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:30:47</nova:creationTime>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:user uuid="fb3f20f0143d465ebfe98f6a13200890">tempest-ServersTestJSON-1064072764-project-member</nova:user>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:project uuid="40db421b27d84f809f8074c58151327f">tempest-ServersTestJSON-1064072764</nova:project>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <nova:port uuid="ebff0bdd-64a8-478c-a24f-94fc3ea465c4">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <entry name="serial">e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c</entry>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <entry name="uuid">e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c</entry>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk.config">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:d7:63:61"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <target dev="tapebff0bdd-64"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/console.log" append="off"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:30:49 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:30:49 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:30:49 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:30:49 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.621 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Preparing to wait for external event network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.621 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.621 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.621 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.622 221324 DEBUG nova.virt.libvirt.vif [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-170523285',display_name='tempest-₡-170523285',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--170523285',id=129,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-l8obdkvp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:30:37Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.622 221324 DEBUG nova.network.os_vif_util [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.623 221324 DEBUG nova.network.os_vif_util [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:61,bridge_name='br-int',has_traffic_filtering=True,id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebff0bdd-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.623 221324 DEBUG os_vif [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:61,bridge_name='br-int',has_traffic_filtering=True,id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebff0bdd-64') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.624 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.624 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.626 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebff0bdd-64, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.627 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebff0bdd-64, col_values=(('external_ids', {'iface-id': 'ebff0bdd-64a8-478c-a24f-94fc3ea465c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:63:61', 'vm-uuid': 'e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:49 np0005603622 NetworkManager[49080]: <info>  [1769848249.6289] manager: (tapebff0bdd-64): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.635 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:49 np0005603622 nova_compute[221301]: 2026-01-31 08:30:49.636 221324 INFO os_vif [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:61,bridge_name='br-int',has_traffic_filtering=True,id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebff0bdd-64')#033[00m
Jan 31 03:30:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:49.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.031 221324 DEBUG oslo_concurrency.processutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.068 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.069 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.070 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] No VIF found with MAC fa:16:3e:d7:63:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.072 221324 INFO nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Using config drive#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.112 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1402457334' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.481 221324 DEBUG oslo_concurrency.processutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.486 221324 DEBUG nova.compute.provider_tree [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.545 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.741 221324 DEBUG nova.scheduler.client.report [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.820 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.823 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.824 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.824 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.824 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.941 221324 INFO nova.scheduler.client.report [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Deleted allocations for instance b9f38b79-63fc-48a1-a367-6998b8d6a9dc#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.989 221324 DEBUG nova.network.neutron [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updated VIF entry in instance network info cache for port ebff0bdd-64a8-478c-a24f-94fc3ea465c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:30:50 np0005603622 nova_compute[221301]: 2026-01-31 08:30:50.989 221324 DEBUG nova.network.neutron [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updating instance_info_cache with network_info: [{"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.162 221324 DEBUG oslo_concurrency.lockutils [req-f892e884-2d69-4faf-a011-5be7d55f37c6 req-f1bcfd4c-9ab5-46ba-89e9-a90e4e98b2cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.268 221324 INFO nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Creating config drive at /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/disk.config#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.275 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpr0iibjw4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1440532586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.304 221324 DEBUG oslo_concurrency.lockutils [None req-9e50a887-26d6-4aee-bf5f-67b49af71d9c 432ac8867d8240408db455fc25bb5901 491937de020742d7b4e847dc3bf57950 - - default default] Lock "b9f38b79-63fc-48a1-a367-6998b8d6a9dc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 13.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.305 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.408 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpr0iibjw4" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.449 221324 DEBUG nova.storage.rbd_utils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] rbd image e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.455 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/disk.config e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:51.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.650 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.650 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.655 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.656 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.676 221324 DEBUG oslo_concurrency.processutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/disk.config e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.221s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.677 221324 INFO nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Deleting local config drive /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c/disk.config because it was imported into RBD.#033[00m
Jan 31 03:30:51 np0005603622 kernel: tapebff0bdd-64: entered promiscuous mode
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.723 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:51Z|00595|binding|INFO|Claiming lport ebff0bdd-64a8-478c-a24f-94fc3ea465c4 for this chassis.
Jan 31 03:30:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:51Z|00596|binding|INFO|ebff0bdd-64a8-478c-a24f-94fc3ea465c4: Claiming fa:16:3e:d7:63:61 10.100.0.9
Jan 31 03:30:51 np0005603622 NetworkManager[49080]: <info>  [1769848251.7251] manager: (tapebff0bdd-64): new Tun device (/org/freedesktop/NetworkManager/Devices/278)
Jan 31 03:30:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:51Z|00597|binding|INFO|Setting lport ebff0bdd-64a8-478c-a24f-94fc3ea465c4 ovn-installed in OVS
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.731 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.734 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:51 np0005603622 systemd-machined[190406]: New machine qemu-61-instance-00000081.
Jan 31 03:30:51 np0005603622 systemd[1]: Started Virtual Machine qemu-61-instance-00000081.
Jan 31 03:30:51 np0005603622 systemd-udevd[272501]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:30:51 np0005603622 NetworkManager[49080]: <info>  [1769848251.7774] device (tapebff0bdd-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:30:51 np0005603622 NetworkManager[49080]: <info>  [1769848251.7779] device (tapebff0bdd-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.867 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.869 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4303MB free_disk=20.890708923339844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.870 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:51 np0005603622 nova_compute[221301]: 2026-01-31 08:30:51.870 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:51Z|00598|binding|INFO|Setting lport ebff0bdd-64a8-478c-a24f-94fc3ea465c4 up in Southbound
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.874 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:63:61 10.100.0.9'], port_security=['fa:16:3e:d7:63:61 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6071a46-64a6-45aa-97c6-06e6c564195b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40db421b27d84f809f8074c58151327f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '986b09c9-4243-429e-9b6e-93ffcacf8cb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=111856e4-2ce2-4b64-a82d-6a5bd7b8a457, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ebff0bdd-64a8-478c-a24f-94fc3ea465c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.876 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ebff0bdd-64a8-478c-a24f-94fc3ea465c4 in datapath f6071a46-64a6-45aa-97c6-06e6c564195b bound to our chassis#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.877 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6071a46-64a6-45aa-97c6-06e6c564195b#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.884 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2adafd1b-3760-42a3-ba7c-a7f91f5248d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.885 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6071a46-61 in ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.887 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6071a46-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.887 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8b0545-1fc0-405b-badc-c0c94870bf82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.887 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[58533a6f-ec06-4eb6-b7d4-cecc1183b70f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.896 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d04d00-4023-4094-b817-c68f524249f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.904 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b78fd386-fce0-4cac-b89b-ec354ff6a107]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:51.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.934 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0bdb53-3191-48e6-8029-899452753cc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 NetworkManager[49080]: <info>  [1769848251.9448] manager: (tapf6071a46-60): new Veth device (/org/freedesktop/NetworkManager/Devices/279)
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.944 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a458022e-e911-4503-9d23-79cd9400ed0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.974 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[faaf81ef-2a63-4425-9bed-05c851e17662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:51.978 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f00fc147-b284-42fa-a0ab-769246e9cf3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:51 np0005603622 NetworkManager[49080]: <info>  [1769848251.9962] device (tapf6071a46-60): carrier: link connected
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.002 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac316c7-00f1-478a-a311-c9897ca2241d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.014 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9fca98-acea-4cb4-a499-441409b55952]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6071a46-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8c:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 750333, 'reachable_time': 39461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272536, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.024 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[84ea00a7-05d6-40dc-bd31-001937e18842]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:8c48'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 750333, 'tstamp': 750333}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272537, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.034 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[728388c9-6139-4b3a-a3da-c85259bbe1e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6071a46-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:8c:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 750333, 'reachable_time': 39461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 272538, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.040 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 884be038-2a08-4b69-a345-5b6ac2661dfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.040 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.040 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.041 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.060 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd7953a-ef0e-4d34-968e-714ed0adddfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.103 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b63347-34b4-45fd-a07d-a39ae3282f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.104 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6071a46-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.105 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.105 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6071a46-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:52 np0005603622 kernel: tapf6071a46-60: entered promiscuous mode
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.107 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:52 np0005603622 NetworkManager[49080]: <info>  [1769848252.1108] manager: (tapf6071a46-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.114 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6071a46-60, col_values=(('external_ids', {'iface-id': 'e9a7861c-c6ea-4166-9252-dc2aacdf4771'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:30:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:30:52Z|00599|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.116 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.118 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.119 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f3b948-94a0-4405-962e-bf2ec2860756]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.120 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-f6071a46-64a6-45aa-97c6-06e6c564195b
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/f6071a46-64a6-45aa-97c6-06e6c564195b.pid.haproxy
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID f6071a46-64a6-45aa-97c6-06e6c564195b
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:30:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:30:52.121 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'env', 'PROCESS_TAG=haproxy-f6071a46-64a6-45aa-97c6-06e6c564195b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6071a46-64a6-45aa-97c6-06e6c564195b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.125 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.141 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.426 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848252.426344, e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.427 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] VM Started (Lifecycle Event)#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.505 221324 DEBUG nova.compute.manager [req-9907abef-1306-4f1e-944d-8855d130de05 req-3d3372b1-a609-4bdb-bae0-40eaaded2d9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received event network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.505 221324 DEBUG oslo_concurrency.lockutils [req-9907abef-1306-4f1e-944d-8855d130de05 req-3d3372b1-a609-4bdb-bae0-40eaaded2d9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.505 221324 DEBUG oslo_concurrency.lockutils [req-9907abef-1306-4f1e-944d-8855d130de05 req-3d3372b1-a609-4bdb-bae0-40eaaded2d9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.505 221324 DEBUG oslo_concurrency.lockutils [req-9907abef-1306-4f1e-944d-8855d130de05 req-3d3372b1-a609-4bdb-bae0-40eaaded2d9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.505 221324 DEBUG nova.compute.manager [req-9907abef-1306-4f1e-944d-8855d130de05 req-3d3372b1-a609-4bdb-bae0-40eaaded2d9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Processing event network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.506 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.510 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.512 221324 INFO nova.virt.libvirt.driver [-] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Instance spawned successfully.#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.512 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:30:52 np0005603622 podman[272631]: 2026-01-31 08:30:52.414274009 +0000 UTC m=+0.018718264 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:30:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:30:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2302796181' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:30:52 np0005603622 podman[272631]: 2026-01-31 08:30:52.539753192 +0000 UTC m=+0.144197447 container create 74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.550 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.555 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:30:52 np0005603622 systemd[1]: Started libpod-conmon-74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45.scope.
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.590 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.596 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.612 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:30:52 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.618 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:52 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37a20b3e7681bb4f2fe958aa100d348a24800ab6d3a409bf9be3de90e62fae8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.640 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.641 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.642 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.643 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.643 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.644 221324 DEBUG nova.virt.libvirt.driver [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:30:52 np0005603622 podman[272631]: 2026-01-31 08:30:52.721445706 +0000 UTC m=+0.325890031 container init 74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:30:52 np0005603622 podman[272631]: 2026-01-31 08:30:52.727233182 +0000 UTC m=+0.331677457 container start 74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 03:30:52 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [NOTICE]   (272653) : New worker (272655) forked
Jan 31 03:30:52 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [NOTICE]   (272653) : Loading success.
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.859 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.860 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848252.4264948, e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:52 np0005603622 nova_compute[221301]: 2026-01-31 08:30:52.861 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.035 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.041 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848252.5093207, e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.042 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.059 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.060 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.154 221324 INFO nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Took 15.25 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.154 221324 DEBUG nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.217 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.222 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.316 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.347 221324 INFO nova.compute.manager [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Took 18.00 seconds to build instance.#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.392 221324 DEBUG oslo_concurrency.lockutils [None req-9a73c74a-0ac3-443a-8da5-be6b9a3b5301 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:53.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.653 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848238.0144417, b9f38b79-63fc-48a1-a367-6998b8d6a9dc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.654 221324 INFO nova.compute.manager [-] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:30:53 np0005603622 nova_compute[221301]: 2026-01-31 08:30:53.808 221324 DEBUG nova.compute.manager [None req-7f69548d-cbe5-4dc4-9fe0-22b38b16aa6c - - - - - -] [instance: b9f38b79-63fc-48a1-a367-6998b8d6a9dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:30:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.061 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.061 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.062 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.610 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.611 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.611 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.611 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 884be038-2a08-4b69-a345-5b6ac2661dfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.683 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.767 221324 DEBUG nova.compute.manager [req-62e1b40a-85b7-4cd5-957b-edfb9d88d7e5 req-7f999ae9-4235-42e5-92f6-5e366943cb8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received event network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.767 221324 DEBUG oslo_concurrency.lockutils [req-62e1b40a-85b7-4cd5-957b-edfb9d88d7e5 req-7f999ae9-4235-42e5-92f6-5e366943cb8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.768 221324 DEBUG oslo_concurrency.lockutils [req-62e1b40a-85b7-4cd5-957b-edfb9d88d7e5 req-7f999ae9-4235-42e5-92f6-5e366943cb8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.768 221324 DEBUG oslo_concurrency.lockutils [req-62e1b40a-85b7-4cd5-957b-edfb9d88d7e5 req-7f999ae9-4235-42e5-92f6-5e366943cb8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.769 221324 DEBUG nova.compute.manager [req-62e1b40a-85b7-4cd5-957b-edfb9d88d7e5 req-7f999ae9-4235-42e5-92f6-5e366943cb8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] No waiting events found dispatching network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:30:54 np0005603622 nova_compute[221301]: 2026-01-31 08:30:54.769 221324 WARNING nova.compute.manager [req-62e1b40a-85b7-4cd5-957b-edfb9d88d7e5 req-7f999ae9-4235-42e5-92f6-5e366943cb8c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received unexpected event network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:30:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:55.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:30:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:55.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:30:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:57.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:57 np0005603622 nova_compute[221301]: 2026-01-31 08:30:57.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:30:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:57.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:30:58 np0005603622 nova_compute[221301]: 2026-01-31 08:30:58.475 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updating instance_info_cache with network_info: [{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:30:58 np0005603622 nova_compute[221301]: 2026-01-31 08:30:58.571 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:30:58 np0005603622 nova_compute[221301]: 2026-01-31 08:30:58.572 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:30:58 np0005603622 nova_compute[221301]: 2026-01-31 08:30:58.573 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:58 np0005603622 nova_compute[221301]: 2026-01-31 08:30:58.573 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:30:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:30:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:30:59.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:30:59 np0005603622 nova_compute[221301]: 2026-01-31 08:30:59.685 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:30:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:30:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:30:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:30:59.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:01.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:01.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:02 np0005603622 podman[272666]: 2026-01-31 08:31:02.137505421 +0000 UTC m=+0.059128590 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:31:02 np0005603622 podman[272665]: 2026-01-31 08:31:02.174113705 +0000 UTC m=+0.100749749 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:31:02 np0005603622 nova_compute[221301]: 2026-01-31 08:31:02.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:02 np0005603622 nova_compute[221301]: 2026-01-31 08:31:02.661 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:02 np0005603622 nova_compute[221301]: 2026-01-31 08:31:02.705 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:02 np0005603622 nova_compute[221301]: 2026-01-31 08:31:02.705 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:31:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:03.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:03.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:04 np0005603622 nova_compute[221301]: 2026-01-31 08:31:04.688 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:05.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:05.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:06 np0005603622 nova_compute[221301]: 2026-01-31 08:31:06.890 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:07.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:07 np0005603622 nova_compute[221301]: 2026-01-31 08:31:07.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:07.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:31:08Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:63:61 10.100.0.9
Jan 31 03:31:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:31:08Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:63:61 10.100.0.9
Jan 31 03:31:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:09.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:09 np0005603622 nova_compute[221301]: 2026-01-31 08:31:09.691 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:09.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:31:10.692 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:31:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:31:10.693 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:31:10 np0005603622 nova_compute[221301]: 2026-01-31 08:31:10.693 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:31:10.694 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:31:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:11.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:11.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:12 np0005603622 nova_compute[221301]: 2026-01-31 08:31:12.666 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:13.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:31:13Z|00600|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:31:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:31:13Z|00601|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:31:13 np0005603622 nova_compute[221301]: 2026-01-31 08:31:13.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:13.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:14 np0005603622 nova_compute[221301]: 2026-01-31 08:31:14.694 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:15.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:15.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:17.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:17 np0005603622 nova_compute[221301]: 2026-01-31 08:31:17.669 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:17.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.046 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.089 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 884be038-2a08-4b69-a345-5b6ac2661dfc _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.089 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.090 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.091 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.091 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.093 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.164 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:18 np0005603622 nova_compute[221301]: 2026-01-31 08:31:18.171 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:19.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:19 np0005603622 nova_compute[221301]: 2026-01-31 08:31:19.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:19.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:21.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:21.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:22 np0005603622 nova_compute[221301]: 2026-01-31 08:31:22.102 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:22 np0005603622 nova_compute[221301]: 2026-01-31 08:31:22.671 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:23.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:23.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:24 np0005603622 nova_compute[221301]: 2026-01-31 08:31:24.699 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:25 np0005603622 nova_compute[221301]: 2026-01-31 08:31:25.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:25 np0005603622 nova_compute[221301]: 2026-01-31 08:31:25.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:31:25 np0005603622 nova_compute[221301]: 2026-01-31 08:31:25.429 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:31:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:25.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:25.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:31:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:27.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:31:27 np0005603622 nova_compute[221301]: 2026-01-31 08:31:27.706 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:27.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:29.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:29 np0005603622 nova_compute[221301]: 2026-01-31 08:31:29.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:29.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:31:30.886 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:31:30.887 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:31:30.888 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:31.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:31.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:32 np0005603622 nova_compute[221301]: 2026-01-31 08:31:32.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:33 np0005603622 podman[272709]: 2026-01-31 08:31:33.140824057 +0000 UTC m=+0.058489723 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 31 03:31:33 np0005603622 podman[272708]: 2026-01-31 08:31:33.162030687 +0000 UTC m=+0.085201281 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:31:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:33.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:33.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:34 np0005603622 nova_compute[221301]: 2026-01-31 08:31:34.705 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:35.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:35.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:37.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:37 np0005603622 nova_compute[221301]: 2026-01-31 08:31:37.711 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:38.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:39.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:39 np0005603622 nova_compute[221301]: 2026-01-31 08:31:39.661 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:39 np0005603622 nova_compute[221301]: 2026-01-31 08:31:39.706 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:40.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:41.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:42.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:42 np0005603622 nova_compute[221301]: 2026-01-31 08:31:42.741 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:43.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:44.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:44 np0005603622 nova_compute[221301]: 2026-01-31 08:31:44.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:45 np0005603622 nova_compute[221301]: 2026-01-31 08:31:45.430 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:45 np0005603622 nova_compute[221301]: 2026-01-31 08:31:45.431 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:45 np0005603622 nova_compute[221301]: 2026-01-31 08:31:45.431 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:31:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:45.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:31:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:46.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:31:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:46 np0005603622 nova_compute[221301]: 2026-01-31 08:31:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:47.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:47 np0005603622 nova_compute[221301]: 2026-01-31 08:31:47.744 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:48.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:48 np0005603622 nova_compute[221301]: 2026-01-31 08:31:48.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:49.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:49 np0005603622 nova_compute[221301]: 2026-01-31 08:31:49.711 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:50.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.497 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.497 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.498 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.498 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.498 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:31:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/925373866' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:31:50 np0005603622 nova_compute[221301]: 2026-01-31 08:31:50.937 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:51.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:52.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.596 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.597 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.601 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.602 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.746 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.755 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.756 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4070MB free_disk=20.830673217773438GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.757 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:31:52 np0005603622 nova_compute[221301]: 2026-01-31 08:31:52.757 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:31:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:53.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:53 np0005603622 nova_compute[221301]: 2026-01-31 08:31:53.891 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 884be038-2a08-4b69-a345-5b6ac2661dfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:31:53 np0005603622 nova_compute[221301]: 2026-01-31 08:31:53.892 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:31:53 np0005603622 nova_compute[221301]: 2026-01-31 08:31:53.892 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:31:53 np0005603622 nova_compute[221301]: 2026-01-31 08:31:53.893 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:31:53 np0005603622 nova_compute[221301]: 2026-01-31 08:31:53.938 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:31:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:54.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:54 np0005603622 nova_compute[221301]: 2026-01-31 08:31:54.060 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:31:54 np0005603622 nova_compute[221301]: 2026-01-31 08:31:54.060 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:31:54 np0005603622 nova_compute[221301]: 2026-01-31 08:31:54.085 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:31:54 np0005603622 nova_compute[221301]: 2026-01-31 08:31:54.714 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:55 np0005603622 nova_compute[221301]: 2026-01-31 08:31:55.143 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:31:55 np0005603622 nova_compute[221301]: 2026-01-31 08:31:55.274 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:31:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:55.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:31:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1555266944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:31:55 np0005603622 nova_compute[221301]: 2026-01-31 08:31:55.722 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:31:55 np0005603622 nova_compute[221301]: 2026-01-31 08:31:55.727 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:31:55 np0005603622 nova_compute[221301]: 2026-01-31 08:31:55.996 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:31:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:56.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:56 np0005603622 nova_compute[221301]: 2026-01-31 08:31:56.294 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:31:56 np0005603622 nova_compute[221301]: 2026-01-31 08:31:56.295 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:31:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:31:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:31:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:31:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:57.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:57 np0005603622 nova_compute[221301]: 2026-01-31 08:31:57.748 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:31:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:31:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:31:58.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:31:58 np0005603622 nova_compute[221301]: 2026-01-31 08:31:58.295 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:31:58 np0005603622 nova_compute[221301]: 2026-01-31 08:31:58.296 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:31:58 np0005603622 nova_compute[221301]: 2026-01-31 08:31:58.783 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:31:58 np0005603622 nova_compute[221301]: 2026-01-31 08:31:58.783 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:31:58 np0005603622 nova_compute[221301]: 2026-01-31 08:31:58.784 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:31:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:31:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:31:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:31:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:31:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:31:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:31:59.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:31:59 np0005603622 nova_compute[221301]: 2026-01-31 08:31:59.716 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:00.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:01.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:02.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:32:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 41K writes, 161K keys, 41K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.04 MB/s#012Cumulative WAL: 41K writes, 14K syncs, 2.79 writes per sync, written: 0.15 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6806 writes, 26K keys, 6806 commit groups, 1.0 writes per commit group, ingest: 26.61 MB, 0.04 MB/s#012Interval WAL: 6806 writes, 2576 syncs, 2.64 writes per sync, written: 0.03 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:32:02 np0005603622 nova_compute[221301]: 2026-01-31 08:32:02.683 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updating instance_info_cache with network_info: [{"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:32:02 np0005603622 nova_compute[221301]: 2026-01-31 08:32:02.750 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:02 np0005603622 nova_compute[221301]: 2026-01-31 08:32:02.910 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:32:02 np0005603622 nova_compute[221301]: 2026-01-31 08:32:02.910 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:32:02 np0005603622 nova_compute[221301]: 2026-01-31 08:32:02.911 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:32:02 np0005603622 nova_compute[221301]: 2026-01-31 08:32:02.912 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:32:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:03.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:04.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:04 np0005603622 podman[272930]: 2026-01-31 08:32:04.138388285 +0000 UTC m=+0.055535193 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 31 03:32:04 np0005603622 podman[272929]: 2026-01-31 08:32:04.169896676 +0000 UTC m=+0.090789524 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:32:04 np0005603622 nova_compute[221301]: 2026-01-31 08:32:04.719 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e313 e313: 3 total, 3 up, 3 in
Jan 31 03:32:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:05.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:06.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:07.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:07 np0005603622 nova_compute[221301]: 2026-01-31 08:32:07.752 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:08.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:32:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:32:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:09.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:09 np0005603622 nova_compute[221301]: 2026-01-31 08:32:09.722 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:10.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:11 np0005603622 nova_compute[221301]: 2026-01-31 08:32:11.130 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:11.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:12.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:32:12.180 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:32:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:32:12.181 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:32:12 np0005603622 nova_compute[221301]: 2026-01-31 08:32:12.202 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:12 np0005603622 nova_compute[221301]: 2026-01-31 08:32:12.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:13.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:14.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:32:14.184 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:14 np0005603622 nova_compute[221301]: 2026-01-31 08:32:14.723 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:15.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:16.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:17.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:17 np0005603622 nova_compute[221301]: 2026-01-31 08:32:17.757 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:18.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:19.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:19 np0005603622 nova_compute[221301]: 2026-01-31 08:32:19.725 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:19 np0005603622 nova_compute[221301]: 2026-01-31 08:32:19.962 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:20.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:21.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:22.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:22 np0005603622 ceph-mds[84731]: mds.beacon.cephfs.compute-1.bkrghs missed beacon ack from the monitors
Jan 31 03:32:22 np0005603622 nova_compute[221301]: 2026-01-31 08:32:22.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for kv_commit, latency = 6.240270138s
Jan 31 03:32:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency slow operation observed for kv_sync, latency = 6.240270138s
Jan 31 03:32:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 6.240572929s, txc = 0x55bb292aaf00
Jan 31 03:32:22 np0005603622 ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-mon-compute-1[81564]: 2026-01-31T08:32:22.817+0000 7f65ea1de640 -1 mon.compute-1@2(peon).paxos(paxos updating c 4770..5501) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 1.240462065s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 03:32:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).paxos(paxos updating c 4770..5501) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 1.240462065s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 31 03:32:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.933868408s, txc = 0x55bb278ec900
Jan 31 03:32:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.560904503s, txc = 0x55bb27910c00
Jan 31 03:32:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.560413361s, txc = 0x55bb26fd1800
Jan 31 03:32:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:23.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:24.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:24 np0005603622 nova_compute[221301]: 2026-01-31 08:32:24.727 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:25.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:26.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:32:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.0 total, 600.0 interval#012Cumulative writes: 11K writes, 57K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s#012Cumulative WAL: 11K writes, 11K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1576 writes, 7391 keys, 1576 commit groups, 1.0 writes per commit group, ingest: 16.33 MB, 0.03 MB/s#012Interval WAL: 1575 writes, 1575 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     40.5      1.72              0.22        34    0.051       0      0       0.0       0.0#012  L6      1/0    9.82 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.6     87.4     73.6      4.35              1.05        33    0.132    207K    18K       0.0       0.0#012 Sum      1/0    9.82 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.6     62.6     64.2      6.07              1.26        67    0.091    207K    18K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.8     35.5     35.9      1.55              0.22         8    0.194     33K   2114       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0     87.4     73.6      4.35              1.05        33    0.132    207K    18K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     40.6      1.72              0.22        33    0.052       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4200.0 total, 600.0 interval#012Flush(GB): cumulative 0.068, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.38 GB write, 0.09 MB/s write, 0.37 GB read, 0.09 MB/s read, 6.1 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 1.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 41.63 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.00062 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2378,40.07 MB,13.1803%) FilterBlock(67,581.30 KB,0.186734%) IndexBlock(67,1019.58 KB,0.327527%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:32:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:27.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:27 np0005603622 nova_compute[221301]: 2026-01-31 08:32:27.759 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:28.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:29.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:29 np0005603622 nova_compute[221301]: 2026-01-31 08:32:29.730 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:30.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:32:30.887 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:32:30.887 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:32:30.888 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:31.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:32.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:32 np0005603622 nova_compute[221301]: 2026-01-31 08:32:32.813 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:33.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:34.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:34 np0005603622 nova_compute[221301]: 2026-01-31 08:32:34.732 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:35 np0005603622 podman[273026]: 2026-01-31 08:32:35.174709818 +0000 UTC m=+0.078791074 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:32:35 np0005603622 podman[273025]: 2026-01-31 08:32:35.260898308 +0000 UTC m=+0.172918846 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:32:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:35.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:36.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e314 e314: 3 total, 3 up, 3 in
Jan 31 03:32:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 03:32:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:37.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 03:32:37 np0005603622 nova_compute[221301]: 2026-01-31 08:32:37.816 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:38.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.488085) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359488205, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 2149, "num_deletes": 253, "total_data_size": 4938049, "memory_usage": 4996736, "flush_reason": "Manual Compaction"}
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359531784, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 3233228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55969, "largest_seqno": 58113, "table_properties": {"data_size": 3224598, "index_size": 5252, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 18800, "raw_average_key_size": 20, "raw_value_size": 3207014, "raw_average_value_size": 3520, "num_data_blocks": 229, "num_entries": 911, "num_filter_entries": 911, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848176, "oldest_key_time": 1769848176, "file_creation_time": 1769848359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 43774 microseconds, and 11107 cpu microseconds.
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.531874) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 3233228 bytes OK
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.531900) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.544219) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.544251) EVENT_LOG_v1 {"time_micros": 1769848359544241, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.544327) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 4928525, prev total WAL file size 4928525, number of live WAL files 2.
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.545885) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(3157KB)], [111(10058KB)]
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359545970, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 13533356, "oldest_snapshot_seqno": -1}
Jan 31 03:32:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:39.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:39 np0005603622 nova_compute[221301]: 2026-01-31 08:32:39.735 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 8213 keys, 11456480 bytes, temperature: kUnknown
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359847199, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 11456480, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11402655, "index_size": 32209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20549, "raw_key_size": 212781, "raw_average_key_size": 25, "raw_value_size": 11257714, "raw_average_value_size": 1370, "num_data_blocks": 1261, "num_entries": 8213, "num_filter_entries": 8213, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.847619) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 11456480 bytes
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.887653) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 44.9 rd, 38.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 9.8 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 8739, records dropped: 526 output_compression: NoCompression
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.887693) EVENT_LOG_v1 {"time_micros": 1769848359887674, "job": 70, "event": "compaction_finished", "compaction_time_micros": 301405, "compaction_time_cpu_micros": 43579, "output_level": 6, "num_output_files": 1, "total_output_size": 11456480, "num_input_records": 8739, "num_output_records": 8213, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359888420, "job": 70, "event": "table_file_deletion", "file_number": 113}
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848359890147, "job": 70, "event": "table_file_deletion", "file_number": 111}
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.545663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.890186) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.890192) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.890195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.890198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:39 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:32:39.890201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:32:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:40.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:41.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:42.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:42 np0005603622 nova_compute[221301]: 2026-01-31 08:32:42.818 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:43 np0005603622 nova_compute[221301]: 2026-01-31 08:32:43.369 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:32:43 np0005603622 nova_compute[221301]: 2026-01-31 08:32:43.369 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:32:43 np0005603622 nova_compute[221301]: 2026-01-31 08:32:43.622 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:32:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:43.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:44 np0005603622 nova_compute[221301]: 2026-01-31 08:32:44.123 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:32:44 np0005603622 nova_compute[221301]: 2026-01-31 08:32:44.124 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:32:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:44.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:44 np0005603622 nova_compute[221301]: 2026-01-31 08:32:44.149 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:32:44 np0005603622 nova_compute[221301]: 2026-01-31 08:32:44.150 221324 INFO nova.compute.claims [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:32:44 np0005603622 nova_compute[221301]: 2026-01-31 08:32:44.739 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:44 np0005603622 nova_compute[221301]: 2026-01-31 08:32:44.816 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:32:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:32:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2801254445' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.328 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.335 221324 DEBUG nova.compute.provider_tree [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.405 221324 DEBUG nova.scheduler.client.report [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:32:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 e315: 3 total, 3 up, 3 in
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.553 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.554 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.672 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.673 221324 DEBUG nova.network.neutron [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:32:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:45.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.718 221324 INFO nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.778 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.991 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.993 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:32:45 np0005603622 nova_compute[221301]: 2026-01-31 08:32:45.994 221324 INFO nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Creating image(s)
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.040 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.091 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:32:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:46.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.141 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.146 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.177 221324 DEBUG nova.policy [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8efe4dc047d44a0fbd21b243ddc921d8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5fe2a450b6ff4f73a7fe8def47304a29', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.229 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.230 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.231 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.231 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.266 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:32:46 np0005603622 nova_compute[221301]: 2026-01-31 08:32:46.273 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 8e88da8b-f1f3-484b-a632-555780f0de0f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:32:47 np0005603622 nova_compute[221301]: 2026-01-31 08:32:47.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:32:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:47.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:47 np0005603622 nova_compute[221301]: 2026-01-31 08:32:47.820 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:32:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:48.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:32:48 np0005603622 nova_compute[221301]: 2026-01-31 08:32:48.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:32:48 np0005603622 nova_compute[221301]: 2026-01-31 08:32:48.487 221324 DEBUG nova.network.neutron [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Successfully created port: 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:32:49 np0005603622 nova_compute[221301]: 2026-01-31 08:32:49.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:32:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:49.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:49 np0005603622 nova_compute[221301]: 2026-01-31 08:32:49.741 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:32:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:50.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.399 221324 DEBUG nova.network.neutron [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Successfully updated port: 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:32:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:51.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.746 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.747 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.747 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.748 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.749 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.790 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.791 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquired lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.791 221324 DEBUG nova.network.neutron [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.922 221324 DEBUG nova.compute.manager [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-changed-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.923 221324 DEBUG nova.compute.manager [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Refreshing instance network info cache due to event network-changed-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:32:51 np0005603622 nova_compute[221301]: 2026-01-31 08:32:51.923 221324 DEBUG oslo_concurrency.lockutils [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.010 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 8e88da8b-f1f3-484b-a632-555780f0de0f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.737s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.096 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] resizing rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:32:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:52.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.202 221324 DEBUG nova.network.neutron [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:32:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:32:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3469385992' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.342 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.593s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.605 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.605 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.612 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.612 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.873 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.912 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.914 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4077MB free_disk=20.785003662109375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.914 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:52 np0005603622 nova_compute[221301]: 2026-01-31 08:32:52.914 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.136 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 884be038-2a08-4b69-a345-5b6ac2661dfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.137 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.137 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 8e88da8b-f1f3-484b-a632-555780f0de0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.138 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.138 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.253 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.323 221324 DEBUG nova.objects.instance [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lazy-loading 'migration_context' on Instance uuid 8e88da8b-f1f3-484b-a632-555780f0de0f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.391 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.392 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Ensure instance console log exists: /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.392 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.392 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.393 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:32:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1616037620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.663 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.671 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:32:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:53.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.723 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.925 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:32:53 np0005603622 nova_compute[221301]: 2026-01-31 08:32:53.926 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:54.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:54 np0005603622 nova_compute[221301]: 2026-01-31 08:32:54.742 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:55.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:56.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:56 np0005603622 nova_compute[221301]: 2026-01-31 08:32:56.925 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:32:56 np0005603622 nova_compute[221301]: 2026-01-31 08:32:56.926 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:32:56 np0005603622 nova_compute[221301]: 2026-01-31 08:32:56.926 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:32:56 np0005603622 nova_compute[221301]: 2026-01-31 08:32:56.957 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.153 221324 DEBUG nova.network.neutron [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Updating instance_info_cache with network_info: [{"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.199 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.200 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.200 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.201 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 884be038-2a08-4b69-a345-5b6ac2661dfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.228 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Releasing lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.229 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Instance network_info: |[{"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.229 221324 DEBUG oslo_concurrency.lockutils [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.230 221324 DEBUG nova.network.neutron [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Refreshing network info cache for port 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.233 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Start _get_guest_xml network_info=[{"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.239 221324 WARNING nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.244 221324 DEBUG nova.virt.libvirt.host [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.245 221324 DEBUG nova.virt.libvirt.host [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.249 221324 DEBUG nova.virt.libvirt.host [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.250 221324 DEBUG nova.virt.libvirt.host [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.251 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.251 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.252 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.252 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.252 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.253 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.253 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.253 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.254 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.254 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.254 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.255 221324 DEBUG nova.virt.hardware [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.258 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:32:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1889700565' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.698 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.725 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:32:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:57.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.729 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:32:57 np0005603622 nova_compute[221301]: 2026-01-31 08:32:57.875 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:32:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:32:58 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2232649993' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.144 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.145 221324 DEBUG nova.virt.libvirt.vif [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:32:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestEncryptedCinderVolumes-server-1720372678',display_name='tempest-TestEncryptedCinderVolumes-server-1720372678',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testencryptedcindervolumes-server-1720372678',id=133,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2MvKEo4sLXWfCM6mtWon30b60LIvY6npmb7WB8/SULEbVOQsHGQoUKLqtv3v6I7hSgyksehrs6St5s7pKVurRudBAi7S3jWr6UsAYAhGVsYnRjCW5G1w/9wOmmV7+7nw==',key_name='tempest-keypair-805177201',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5fe2a450b6ff4f73a7fe8def47304a29',ramdisk_id='',reservation_id='r-ymbdr87m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestEncryptedCinderVolumes-214828144',owner_user_name='tempest-TestEncryptedCinderVolumes-214828144-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:32:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8efe4dc047d44a0fbd21b243ddc921d8',uuid=8e88da8b-f1f3-484b-a632-555780f0de0f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.145 221324 DEBUG nova.network.os_vif_util [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Converting VIF {"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.146 221324 DEBUG nova.network.os_vif_util [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:b5:e9,bridge_name='br-int',has_traffic_filtering=True,id=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd,network=Network(fbbedb67-9b83-45ea-9c22-0cf9a0008776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bf485a9-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.147 221324 DEBUG nova.objects.instance [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e88da8b-f1f3-484b-a632-555780f0de0f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:32:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:32:58.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.342 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <uuid>8e88da8b-f1f3-484b-a632-555780f0de0f</uuid>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <name>instance-00000085</name>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestEncryptedCinderVolumes-server-1720372678</nova:name>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:32:57</nova:creationTime>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:user uuid="8efe4dc047d44a0fbd21b243ddc921d8">tempest-TestEncryptedCinderVolumes-214828144-project-member</nova:user>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:project uuid="5fe2a450b6ff4f73a7fe8def47304a29">tempest-TestEncryptedCinderVolumes-214828144</nova:project>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <nova:port uuid="7bf485a9-4fd1-4c17-b23d-01cc90b31dfd">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <entry name="serial">8e88da8b-f1f3-484b-a632-555780f0de0f</entry>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <entry name="uuid">8e88da8b-f1f3-484b-a632-555780f0de0f</entry>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8e88da8b-f1f3-484b-a632-555780f0de0f_disk">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8e88da8b-f1f3-484b-a632-555780f0de0f_disk.config">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:21:b5:e9"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <target dev="tap7bf485a9-4f"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/console.log" append="off"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:32:58 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:32:58 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:32:58 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:32:58 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.344 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Preparing to wait for external event network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.346 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.346 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.347 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.349 221324 DEBUG nova.virt.libvirt.vif [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:32:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestEncryptedCinderVolumes-server-1720372678',display_name='tempest-TestEncryptedCinderVolumes-server-1720372678',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testencryptedcindervolumes-server-1720372678',id=133,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2MvKEo4sLXWfCM6mtWon30b60LIvY6npmb7WB8/SULEbVOQsHGQoUKLqtv3v6I7hSgyksehrs6St5s7pKVurRudBAi7S3jWr6UsAYAhGVsYnRjCW5G1w/9wOmmV7+7nw==',key_name='tempest-keypair-805177201',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5fe2a450b6ff4f73a7fe8def47304a29',ramdisk_id='',reservation_id='r-ymbdr87m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestEncryptedCinderVolumes-214828144',owner_user_name='tempest-TestEncryptedCinderVolumes-214828144-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:32:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8efe4dc047d44a0fbd21b243ddc921d8',uuid=8e88da8b-f1f3-484b-a632-555780f0de0f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.350 221324 DEBUG nova.network.os_vif_util [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Converting VIF {"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.351 221324 DEBUG nova.network.os_vif_util [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:b5:e9,bridge_name='br-int',has_traffic_filtering=True,id=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd,network=Network(fbbedb67-9b83-45ea-9c22-0cf9a0008776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bf485a9-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.352 221324 DEBUG os_vif [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:b5:e9,bridge_name='br-int',has_traffic_filtering=True,id=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd,network=Network(fbbedb67-9b83-45ea-9c22-0cf9a0008776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bf485a9-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.353 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.354 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.354 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.359 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.360 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7bf485a9-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.361 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7bf485a9-4f, col_values=(('external_ids', {'iface-id': '7bf485a9-4fd1-4c17-b23d-01cc90b31dfd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:b5:e9', 'vm-uuid': '8e88da8b-f1f3-484b-a632-555780f0de0f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.363 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:58 np0005603622 NetworkManager[49080]: <info>  [1769848378.3666] manager: (tap7bf485a9-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.367 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.370 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.372 221324 INFO os_vif [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:b5:e9,bridge_name='br-int',has_traffic_filtering=True,id=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd,network=Network(fbbedb67-9b83-45ea-9c22-0cf9a0008776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bf485a9-4f')#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.924 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.925 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.925 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] No VIF found with MAC fa:16:3e:21:b5:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.926 221324 INFO nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Using config drive#033[00m
Jan 31 03:32:58 np0005603622 nova_compute[221301]: 2026-01-31 08:32:58.954 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:32:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:32:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:32:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:32:59.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.148 221324 INFO nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Creating config drive at /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/disk.config#033[00m
Jan 31 03:33:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:00.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.157 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpo0u2abkz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.296 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpo0u2abkz" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.348 221324 DEBUG nova.storage.rbd_utils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] rbd image 8e88da8b-f1f3-484b-a632-555780f0de0f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.353 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/disk.config 8e88da8b-f1f3-484b-a632-555780f0de0f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.593 221324 DEBUG oslo_concurrency.processutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/disk.config 8e88da8b-f1f3-484b-a632-555780f0de0f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.593 221324 INFO nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Deleting local config drive /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f/disk.config because it was imported into RBD.#033[00m
Jan 31 03:33:00 np0005603622 kernel: tap7bf485a9-4f: entered promiscuous mode
Jan 31 03:33:00 np0005603622 NetworkManager[49080]: <info>  [1769848380.6488] manager: (tap7bf485a9-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/282)
Jan 31 03:33:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:00Z|00602|binding|INFO|Claiming lport 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd for this chassis.
Jan 31 03:33:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:00Z|00603|binding|INFO|7bf485a9-4fd1-4c17-b23d-01cc90b31dfd: Claiming fa:16:3e:21:b5:e9 10.100.0.10
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.650 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:00Z|00604|binding|INFO|Setting lport 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd ovn-installed in OVS
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.659 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:00Z|00605|binding|INFO|Setting lport 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd up in Southbound
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.666 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:b5:e9 10.100.0.10'], port_security=['fa:16:3e:21:b5:e9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8e88da8b-f1f3-484b-a632-555780f0de0f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5fe2a450b6ff4f73a7fe8def47304a29', 'neutron:revision_number': '2', 'neutron:security_group_ids': '48bb54bf-6681-4763-8441-eabc4ce0587b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=539eef10-ad4b-48da-b723-4d7b7671b1c8, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.667 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd in datapath fbbedb67-9b83-45ea-9c22-0cf9a0008776 bound to our chassis#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.669 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbbedb67-9b83-45ea-9c22-0cf9a0008776#033[00m
Jan 31 03:33:00 np0005603622 systemd-udevd[273438]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.683 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a3e4f1-a03c-4072-bd80-a34cb3fbdbe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.684 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbbedb67-91 in ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.686 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbbedb67-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.686 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0c44e6-b5cd-4ff8-9324-3d3cdf592f64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 systemd-machined[190406]: New machine qemu-62-instance-00000085.
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.686 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5e97f5f5-ea71-4207-842f-9bc0edc14007]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 NetworkManager[49080]: <info>  [1769848380.7004] device (tap7bf485a9-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.699 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c5977568-e087-43ca-9976-dda3e1f6416c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 NetworkManager[49080]: <info>  [1769848380.7011] device (tap7bf485a9-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:33:00 np0005603622 systemd[1]: Started Virtual Machine qemu-62-instance-00000085.
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.709 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f5990222-865d-40a0-8f26-a83ebfc6077a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.734 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1cec0ec5-5a7e-479a-852e-d0fe73c9b24c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.740 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf2da0e-f39b-4b80-955d-812190cc030b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 NetworkManager[49080]: <info>  [1769848380.7427] manager: (tapfbbedb67-90): new Veth device (/org/freedesktop/NetworkManager/Devices/283)
Jan 31 03:33:00 np0005603622 systemd-udevd[273441]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.776 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0d7308-6247-487c-b91f-65add9016a80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.780 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d2426860-94e4-4440-9738-78c29e8a2242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 NetworkManager[49080]: <info>  [1769848380.8034] device (tapfbbedb67-90): carrier: link connected
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.809 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0e727633-9642-45b2-805c-677e8075c35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.823 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[237062d6-44ad-4f8a-b3fc-44bc5e42ce6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbbedb67-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:51:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 763214, 'reachable_time': 15890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273472, 'error': None, 'target': 'ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.837 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[276110e9-70e4-427e-967b-8fb93c8bb43a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:51e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 763214, 'tstamp': 763214}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273473, 'error': None, 'target': 'ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.855 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15226dd8-fc00-4f8d-b813-057ad1b2e7e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbbedb67-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:51:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 763214, 'reachable_time': 15890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 273474, 'error': None, 'target': 'ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.885 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f170dc33-cc0b-47e7-b855-3e9dd4166c4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.939 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8ccf42ba-868b-4ead-a701-d53b07d1d631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.941 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbbedb67-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.941 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.942 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbbedb67-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:00 np0005603622 NetworkManager[49080]: <info>  [1769848380.9454] manager: (tapfbbedb67-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 31 03:33:00 np0005603622 kernel: tapfbbedb67-90: entered promiscuous mode
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.947 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.951 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbbedb67-90, col_values=(('external_ids', {'iface-id': 'bd94f99c-43fc-4631-ae17-2d7418cfac15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:00Z|00606|binding|INFO|Releasing lport bd94f99c-43fc-4631-ae17-2d7418cfac15 from this chassis (sb_readonly=0)
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.952 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.953 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbbedb67-9b83-45ea-9c22-0cf9a0008776.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbbedb67-9b83-45ea-9c22-0cf9a0008776.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.954 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[46a34cd9-39c1-4046-917c-1ec15180e636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.956 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-fbbedb67-9b83-45ea-9c22-0cf9a0008776
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/fbbedb67-9b83-45ea-9c22-0cf9a0008776.pid.haproxy
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID fbbedb67-9b83-45ea-9c22-0cf9a0008776
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:33:00 np0005603622 nova_compute[221301]: 2026-01-31 08:33:00.956 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:00.957 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'env', 'PROCESS_TAG=haproxy-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbbedb67-9b83-45ea-9c22-0cf9a0008776.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.177 221324 DEBUG nova.network.neutron [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Updated VIF entry in instance network info cache for port 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.177 221324 DEBUG nova.network.neutron [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Updating instance_info_cache with network_info: [{"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.211 221324 DEBUG oslo_concurrency.lockutils [req-da1ed6a0-966c-4cb0-b6e9-73920969a0fc req-b80c9fc8-7940-4ba0-9315-6c6caa79bd70 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.253 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updating instance_info_cache with network_info: [{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.280 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.280 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.280 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.281 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.489 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848381.488967, 8e88da8b-f1f3-484b-a632-555780f0de0f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.490 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] VM Started (Lifecycle Event)#033[00m
Jan 31 03:33:01 np0005603622 podman[273547]: 2026-01-31 08:33:01.493712385 +0000 UTC m=+0.060505215 container create c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:33:01 np0005603622 systemd[1]: Started libpod-conmon-c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2.scope.
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.546 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.551 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848381.4924343, 8e88da8b-f1f3-484b-a632-555780f0de0f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.551 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:33:01 np0005603622 podman[273547]: 2026-01-31 08:33:01.461597078 +0000 UTC m=+0.028389908 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:33:01 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:33:01 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5b3cd021b054dc8647a8ade8782cbb9b0f9e86da9a7266893b726bb5e30ab3b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.582 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:01 np0005603622 podman[273547]: 2026-01-31 08:33:01.583872571 +0000 UTC m=+0.150665451 container init c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.586 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:33:01 np0005603622 podman[273547]: 2026-01-31 08:33:01.592707347 +0000 UTC m=+0.159500187 container start c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:33:01 np0005603622 neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776[273563]: [NOTICE]   (273567) : New worker (273569) forked
Jan 31 03:33:01 np0005603622 neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776[273563]: [NOTICE]   (273567) : Loading success.
Jan 31 03:33:01 np0005603622 nova_compute[221301]: 2026-01-31 08:33:01.634 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:33:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:01.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:02.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.363 221324 DEBUG nova.compute.manager [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.363 221324 DEBUG oslo_concurrency.lockutils [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.364 221324 DEBUG oslo_concurrency.lockutils [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.364 221324 DEBUG oslo_concurrency.lockutils [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.365 221324 DEBUG nova.compute.manager [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Processing event network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.365 221324 DEBUG nova.compute.manager [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.366 221324 DEBUG oslo_concurrency.lockutils [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.366 221324 DEBUG oslo_concurrency.lockutils [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.366 221324 DEBUG oslo_concurrency.lockutils [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.366 221324 DEBUG nova.compute.manager [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] No waiting events found dispatching network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.367 221324 WARNING nova.compute.manager [req-413f7170-f65b-498b-9658-6cc1760fb663 req-d0f11dbd-364b-40da-82b3-e6af04dd6047 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received unexpected event network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.368 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.372 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848382.3724616, 8e88da8b-f1f3-484b-a632-555780f0de0f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.373 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.375 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.379 221324 INFO nova.virt.libvirt.driver [-] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Instance spawned successfully.#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.380 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.459 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.468 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.474 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.475 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.476 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.477 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.477 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.478 221324 DEBUG nova.virt.libvirt.driver [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.524 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.586 221324 INFO nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Took 16.59 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.586 221324 DEBUG nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.684 221324 INFO nova.compute.manager [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Took 18.60 seconds to build instance.#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.709 221324 DEBUG oslo_concurrency.lockutils [None req-19aa276b-3b9d-4ce3-9b93-18f85a359a20 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:02 np0005603622 nova_compute[221301]: 2026-01-31 08:33:02.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:03 np0005603622 nova_compute[221301]: 2026-01-31 08:33:03.363 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:03.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:04.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:05.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:06 np0005603622 podman[273579]: 2026-01-31 08:33:06.152193363 +0000 UTC m=+0.065203821 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:33:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:06.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:06 np0005603622 podman[273578]: 2026-01-31 08:33:06.186127479 +0000 UTC m=+0.099239349 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 31 03:33:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:07.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:07 np0005603622 nova_compute[221301]: 2026-01-31 08:33:07.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:07 np0005603622 nova_compute[221301]: 2026-01-31 08:33:07.899 221324 DEBUG nova.compute.manager [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-changed-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:07 np0005603622 nova_compute[221301]: 2026-01-31 08:33:07.900 221324 DEBUG nova.compute.manager [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Refreshing instance network info cache due to event network-changed-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:33:07 np0005603622 nova_compute[221301]: 2026-01-31 08:33:07.900 221324 DEBUG oslo_concurrency.lockutils [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:07 np0005603622 nova_compute[221301]: 2026-01-31 08:33:07.901 221324 DEBUG oslo_concurrency.lockutils [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:07 np0005603622 nova_compute[221301]: 2026-01-31 08:33:07.901 221324 DEBUG nova.network.neutron [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Refreshing network info cache for port 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:33:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:08.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:08 np0005603622 nova_compute[221301]: 2026-01-31 08:33:08.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:08 np0005603622 nova_compute[221301]: 2026-01-31 08:33:08.746 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:33:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:33:09 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:33:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:09.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:33:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:10.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:33:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:11.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:12.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:12 np0005603622 nova_compute[221301]: 2026-01-31 08:33:12.283 221324 DEBUG nova.network.neutron [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Updated VIF entry in instance network info cache for port 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:33:12 np0005603622 nova_compute[221301]: 2026-01-31 08:33:12.283 221324 DEBUG nova.network.neutron [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Updating instance_info_cache with network_info: [{"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:12 np0005603622 nova_compute[221301]: 2026-01-31 08:33:12.318 221324 DEBUG oslo_concurrency.lockutils [req-f350efbd-baf0-4188-8895-464611121e38 req-6ff22a42-ca04-4922-be9e-5301bee833b4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8e88da8b-f1f3-484b-a632-555780f0de0f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:12 np0005603622 nova_compute[221301]: 2026-01-31 08:33:12.883 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:13 np0005603622 nova_compute[221301]: 2026-01-31 08:33:13.367 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:13.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:14.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:14.758 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:33:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:14.759 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:33:14 np0005603622 nova_compute[221301]: 2026-01-31 08:33:14.760 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:33:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:33:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:15.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:33:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:16.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:16 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:16Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:21:b5:e9 10.100.0.10
Jan 31 03:33:16 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:16Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:21:b5:e9 10.100.0.10
Jan 31 03:33:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:33:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:17.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:17 np0005603622 nova_compute[221301]: 2026-01-31 08:33:17.884 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:18.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:18 np0005603622 nova_compute[221301]: 2026-01-31 08:33:18.369 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:19.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:20.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:21.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:22.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:22 np0005603622 nova_compute[221301]: 2026-01-31 08:33:22.887 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:23 np0005603622 nova_compute[221301]: 2026-01-31 08:33:23.427 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:23.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:24.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:24 np0005603622 nova_compute[221301]: 2026-01-31 08:33:24.660 221324 DEBUG oslo_concurrency.lockutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:24 np0005603622 nova_compute[221301]: 2026-01-31 08:33:24.661 221324 DEBUG oslo_concurrency.lockutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:24 np0005603622 nova_compute[221301]: 2026-01-31 08:33:24.755 221324 DEBUG nova.objects.instance [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lazy-loading 'flavor' on Instance uuid 8e88da8b-f1f3-484b-a632-555780f0de0f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:24.762 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:24 np0005603622 nova_compute[221301]: 2026-01-31 08:33:24.985 221324 DEBUG oslo_concurrency.lockutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.623 221324 DEBUG oslo_concurrency.lockutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.624 221324 DEBUG oslo_concurrency.lockutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.625 221324 INFO nova.compute.manager [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Attaching volume a91659a9-a045-40d5-a67a-84e5963be227 to /dev/vdb#033[00m
Jan 31 03:33:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:25.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.815 221324 DEBUG os_brick.utils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.818 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.834 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.834 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[0139fea4-3358-4f48-9f44-c651687c5bea]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.836 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.845 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.845 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[b641f4cb-5504-41ca-b728-7ddac6f48cc0]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.848 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.861 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.862 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[60b573fd-441e-499b-a39e-c5c9a6fed53a]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.864 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[d2465f19-10b6-4412-848e-9f306e3d7832]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.864 221324 DEBUG oslo_concurrency.processutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.895 221324 DEBUG oslo_concurrency.processutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "nvme version" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.898 221324 DEBUG os_brick.initiator.connectors.lightos [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.898 221324 DEBUG os_brick.initiator.connectors.lightos [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.899 221324 DEBUG os_brick.initiator.connectors.lightos [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.899 221324 DEBUG os_brick.utils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] <== get_connector_properties: return (82ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:33:25 np0005603622 nova_compute[221301]: 2026-01-31 08:33:25.900 221324 DEBUG nova.virt.block_device [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Updating existing volume attachment record: 05691143-289f-4876-9a9b-4720264fddbd _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:33:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:26.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:33:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1708699595' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:33:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:27.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:27 np0005603622 nova_compute[221301]: 2026-01-31 08:33:27.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:27 np0005603622 nova_compute[221301]: 2026-01-31 08:33:27.993 221324 DEBUG os_brick.encryptors [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Using volume encryption metadata '{'encryption_key_id': '554dc8d2-d437-4e70-a380-9cd5e356d258', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-a91659a9-a045-40d5-a67a-84e5963be227', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'a91659a9-a045-40d5-a67a-84e5963be227', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '8e88da8b-f1f3-484b-a632-555780f0de0f', 'attached_at': '', 'detached_at': '', 'volume_id': 'a91659a9-a045-40d5-a67a-84e5963be227', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.000 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Creating Client object Client /usr/lib/python3.9/site-packages/barbicanclient/client.py:163#033[00m
Jan 31 03:33:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.032 221324 DEBUG barbicanclient.v1.secrets [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Getting secret - Secret href: https://barbican-internal.openstack.svc:9311/secrets/554dc8d2-d437-4e70-a380-9cd5e356d258 get /usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py:514#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.032 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.072 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.073 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.095 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.095 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.162 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.163 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:28.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.192 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.193 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.250 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.250 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.289 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.289 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.316 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.317 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.376 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.377 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.410 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.410 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.467 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.468 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.493 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.493 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.515 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.516 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.547 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.548 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.579 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.580 221324 INFO barbicanclient.base [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Calculated Secrets uuid ref: secrets/554dc8d2-d437-4e70-a380-9cd5e356d258#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.615 221324 DEBUG barbicanclient.client [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.615 221324 DEBUG nova.virt.libvirt.host [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Secret XML: <secret ephemeral="no" private="no">
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  <usage type="volume">
Jan 31 03:33:28 np0005603622 nova_compute[221301]:    <volume>a91659a9-a045-40d5-a67a-84e5963be227</volume>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  </usage>
Jan 31 03:33:28 np0005603622 nova_compute[221301]: </secret>
Jan 31 03:33:28 np0005603622 nova_compute[221301]: create_secret /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1131#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.632 221324 DEBUG nova.objects.instance [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lazy-loading 'flavor' on Instance uuid 8e88da8b-f1f3-484b-a632-555780f0de0f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.806 221324 DEBUG nova.virt.libvirt.driver [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Attempting to attach volume a91659a9-a045-40d5-a67a-84e5963be227 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:33:28 np0005603622 nova_compute[221301]: 2026-01-31 08:33:28.810 221324 DEBUG nova.virt.libvirt.guest [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-a91659a9-a045-40d5-a67a-84e5963be227">
Jan 31 03:33:28 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:33:28 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  <serial>a91659a9-a045-40d5-a67a-84e5963be227</serial>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  <encryption format="luks">
Jan 31 03:33:28 np0005603622 nova_compute[221301]:    <secret type="passphrase" uuid="660e424d-d94d-49d9-b246-0f02aa78694f"/>
Jan 31 03:33:28 np0005603622 nova_compute[221301]:  </encryption>
Jan 31 03:33:28 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:33:28 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:33:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:29.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:30.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:30.888 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:30.888 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:30.890 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:31 np0005603622 nova_compute[221301]: 2026-01-31 08:33:31.416 221324 DEBUG nova.virt.libvirt.driver [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:33:31 np0005603622 nova_compute[221301]: 2026-01-31 08:33:31.417 221324 DEBUG nova.virt.libvirt.driver [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:33:31 np0005603622 nova_compute[221301]: 2026-01-31 08:33:31.417 221324 DEBUG nova.virt.libvirt.driver [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:33:31 np0005603622 nova_compute[221301]: 2026-01-31 08:33:31.417 221324 DEBUG nova.virt.libvirt.driver [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] No VIF found with MAC fa:16:3e:21:b5:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:33:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:31Z|00607|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:33:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:31Z|00608|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:33:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:31Z|00609|binding|INFO|Releasing lport bd94f99c-43fc-4631-ae17-2d7418cfac15 from this chassis (sb_readonly=0)
Jan 31 03:33:31 np0005603622 nova_compute[221301]: 2026-01-31 08:33:31.583 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:31 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 31 03:33:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:31.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:32 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 31 03:33:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:32.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:32 np0005603622 nova_compute[221301]: 2026-01-31 08:33:32.242 221324 DEBUG oslo_concurrency.lockutils [None req-55976a87-4d15-4eb6-ae31-750a13081deb 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 6.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:32 np0005603622 nova_compute[221301]: 2026-01-31 08:33:32.929 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:33 np0005603622 nova_compute[221301]: 2026-01-31 08:33:33.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:33 np0005603622 nova_compute[221301]: 2026-01-31 08:33:33.484 221324 DEBUG oslo_concurrency.lockutils [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:33 np0005603622 nova_compute[221301]: 2026-01-31 08:33:33.484 221324 DEBUG oslo_concurrency.lockutils [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:33 np0005603622 nova_compute[221301]: 2026-01-31 08:33:33.540 221324 INFO nova.compute.manager [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Detaching volume a91659a9-a045-40d5-a67a-84e5963be227#033[00m
Jan 31 03:33:33 np0005603622 nova_compute[221301]: 2026-01-31 08:33:33.717 221324 INFO nova.virt.block_device [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Attempting to driver detach volume a91659a9-a045-40d5-a67a-84e5963be227 from mountpoint /dev/vdb#033[00m
Jan 31 03:33:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:33:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:33.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:33:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:34.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.202 221324 DEBUG os_brick.encryptors [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Using volume encryption metadata '{'encryption_key_id': '554dc8d2-d437-4e70-a380-9cd5e356d258', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-a91659a9-a045-40d5-a67a-84e5963be227', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'a91659a9-a045-40d5-a67a-84e5963be227', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '8e88da8b-f1f3-484b-a632-555780f0de0f', 'attached_at': '', 'detached_at': '', 'volume_id': 'a91659a9-a045-40d5-a67a-84e5963be227', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.212 221324 DEBUG nova.virt.libvirt.driver [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Attempting to detach device vdb from instance 8e88da8b-f1f3-484b-a632-555780f0de0f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.213 221324 DEBUG nova.virt.libvirt.guest [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-a91659a9-a045-40d5-a67a-84e5963be227">
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <serial>a91659a9-a045-40d5-a67a-84e5963be227</serial>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <encryption format="luks">
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <secret type="passphrase" uuid="660e424d-d94d-49d9-b246-0f02aa78694f"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  </encryption>
Jan 31 03:33:34 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:33:34 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.229 221324 INFO nova.virt.libvirt.driver [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Successfully detached device vdb from instance 8e88da8b-f1f3-484b-a632-555780f0de0f from the persistent domain config.#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.231 221324 DEBUG nova.virt.libvirt.driver [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 8e88da8b-f1f3-484b-a632-555780f0de0f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.231 221324 DEBUG nova.virt.libvirt.guest [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-a91659a9-a045-40d5-a67a-84e5963be227">
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <serial>a91659a9-a045-40d5-a67a-84e5963be227</serial>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  <encryption format="luks">
Jan 31 03:33:34 np0005603622 nova_compute[221301]:    <secret type="passphrase" uuid="660e424d-d94d-49d9-b246-0f02aa78694f"/>
Jan 31 03:33:34 np0005603622 nova_compute[221301]:  </encryption>
Jan 31 03:33:34 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:33:34 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.354 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769848414.3542447, 8e88da8b-f1f3-484b-a632-555780f0de0f => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.356 221324 DEBUG nova.virt.libvirt.driver [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 8e88da8b-f1f3-484b-a632-555780f0de0f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.359 221324 INFO nova.virt.libvirt.driver [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Successfully detached device vdb from instance 8e88da8b-f1f3-484b-a632-555780f0de0f from the live domain config.#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.820 221324 DEBUG nova.objects.instance [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lazy-loading 'flavor' on Instance uuid 8e88da8b-f1f3-484b-a632-555780f0de0f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:34 np0005603622 nova_compute[221301]: 2026-01-31 08:33:34.954 221324 DEBUG oslo_concurrency.lockutils [None req-97b2c561-ced0-4246-b46b-538763ff57ab 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:35.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:36.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.777 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.777 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.778 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.778 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.779 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.780 221324 INFO nova.compute.manager [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Terminating instance#033[00m
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.782 221324 DEBUG nova.compute.manager [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:33:36 np0005603622 kernel: tap7bf485a9-4f (unregistering): left promiscuous mode
Jan 31 03:33:36 np0005603622 NetworkManager[49080]: <info>  [1769848416.8947] device (tap7bf485a9-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:33:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:36Z|00610|binding|INFO|Releasing lport 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd from this chassis (sb_readonly=0)
Jan 31 03:33:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:36Z|00611|binding|INFO|Setting lport 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd down in Southbound
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.899 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:36Z|00612|binding|INFO|Removing iface tap7bf485a9-4f ovn-installed in OVS
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:36 np0005603622 nova_compute[221301]: 2026-01-31 08:33:36.906 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:36.923 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:b5:e9 10.100.0.10'], port_security=['fa:16:3e:21:b5:e9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8e88da8b-f1f3-484b-a632-555780f0de0f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5fe2a450b6ff4f73a7fe8def47304a29', 'neutron:revision_number': '4', 'neutron:security_group_ids': '48bb54bf-6681-4763-8441-eabc4ce0587b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=539eef10-ad4b-48da-b723-4d7b7671b1c8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:33:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:36.924 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 7bf485a9-4fd1-4c17-b23d-01cc90b31dfd in datapath fbbedb67-9b83-45ea-9c22-0cf9a0008776 unbound from our chassis#033[00m
Jan 31 03:33:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:36.925 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbbedb67-9b83-45ea-9c22-0cf9a0008776, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:33:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:36.926 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7957924b-d520-402b-acfe-0a633c4eab00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:36.927 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776 namespace which is not needed anymore#033[00m
Jan 31 03:33:36 np0005603622 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000085.scope: Deactivated successfully.
Jan 31 03:33:36 np0005603622 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000085.scope: Consumed 16.224s CPU time.
Jan 31 03:33:36 np0005603622 systemd-machined[190406]: Machine qemu-62-instance-00000085 terminated.
Jan 31 03:33:36 np0005603622 podman[273839]: 2026-01-31 08:33:36.975909371 +0000 UTC m=+0.064233585 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:33:36 np0005603622 podman[273836]: 2026-01-31 08:33:36.995965827 +0000 UTC m=+0.084014163 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.005 221324 INFO nova.virt.libvirt.driver [-] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Instance destroyed successfully.#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.006 221324 DEBUG nova.objects.instance [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lazy-loading 'resources' on Instance uuid 8e88da8b-f1f3-484b-a632-555780f0de0f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:37 np0005603622 neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776[273563]: [NOTICE]   (273567) : haproxy version is 2.8.14-c23fe91
Jan 31 03:33:37 np0005603622 neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776[273563]: [NOTICE]   (273567) : path to executable is /usr/sbin/haproxy
Jan 31 03:33:37 np0005603622 neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776[273563]: [WARNING]  (273567) : Exiting Master process...
Jan 31 03:33:37 np0005603622 neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776[273563]: [ALERT]    (273567) : Current worker (273569) exited with code 143 (Terminated)
Jan 31 03:33:37 np0005603622 neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776[273563]: [WARNING]  (273567) : All workers exited. Exiting... (0)
Jan 31 03:33:37 np0005603622 systemd[1]: libpod-c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2.scope: Deactivated successfully.
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.028 221324 DEBUG nova.virt.libvirt.vif [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:32:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestEncryptedCinderVolumes-server-1720372678',display_name='tempest-TestEncryptedCinderVolumes-server-1720372678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testencryptedcindervolumes-server-1720372678',id=133,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2MvKEo4sLXWfCM6mtWon30b60LIvY6npmb7WB8/SULEbVOQsHGQoUKLqtv3v6I7hSgyksehrs6St5s7pKVurRudBAi7S3jWr6UsAYAhGVsYnRjCW5G1w/9wOmmV7+7nw==',key_name='tempest-keypair-805177201',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:33:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5fe2a450b6ff4f73a7fe8def47304a29',ramdisk_id='',reservation_id='r-ymbdr87m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestEncryptedCinderVolumes-214828144',owner_user_name='tempest-TestEncryptedCinderVolumes-214828144-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:33:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8efe4dc047d44a0fbd21b243ddc921d8',uuid=8e88da8b-f1f3-484b-a632-555780f0de0f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.029 221324 DEBUG nova.network.os_vif_util [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Converting VIF {"id": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "address": "fa:16:3e:21:b5:e9", "network": {"id": "fbbedb67-9b83-45ea-9c22-0cf9a0008776", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-870203836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5fe2a450b6ff4f73a7fe8def47304a29", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bf485a9-4f", "ovs_interfaceid": "7bf485a9-4fd1-4c17-b23d-01cc90b31dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:33:37 np0005603622 podman[273899]: 2026-01-31 08:33:37.031033682 +0000 UTC m=+0.041386285 container died c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.030 221324 DEBUG nova.network.os_vif_util [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:b5:e9,bridge_name='br-int',has_traffic_filtering=True,id=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd,network=Network(fbbedb67-9b83-45ea-9c22-0cf9a0008776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bf485a9-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.031 221324 DEBUG os_vif [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:b5:e9,bridge_name='br-int',has_traffic_filtering=True,id=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd,network=Network(fbbedb67-9b83-45ea-9c22-0cf9a0008776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bf485a9-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.034 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.034 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7bf485a9-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.035 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.038 221324 INFO os_vif [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:b5:e9,bridge_name='br-int',has_traffic_filtering=True,id=7bf485a9-4fd1-4c17-b23d-01cc90b31dfd,network=Network(fbbedb67-9b83-45ea-9c22-0cf9a0008776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bf485a9-4f')#033[00m
Jan 31 03:33:37 np0005603622 systemd[1]: var-lib-containers-storage-overlay-c5b3cd021b054dc8647a8ade8782cbb9b0f9e86da9a7266893b726bb5e30ab3b-merged.mount: Deactivated successfully.
Jan 31 03:33:37 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2-userdata-shm.mount: Deactivated successfully.
Jan 31 03:33:37 np0005603622 podman[273899]: 2026-01-31 08:33:37.066695914 +0000 UTC m=+0.077048497 container cleanup c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:33:37 np0005603622 systemd[1]: libpod-conmon-c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2.scope: Deactivated successfully.
Jan 31 03:33:37 np0005603622 podman[273957]: 2026-01-31 08:33:37.131334329 +0000 UTC m=+0.047532020 container remove c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.135 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e12a4d-3584-483e-81aa-a82724bac2c1]: (4, ('Sat Jan 31 08:33:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776 (c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2)\nc392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2\nSat Jan 31 08:33:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776 (c392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2)\nc392ffff2a135a0aceca1879ff2a080314e42a91dc945cc03eeebd275d2e9ed2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.137 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[00fd9cd5-24c9-41c3-8100-ad1325b021d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.138 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbbedb67-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.139 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:37 np0005603622 kernel: tapfbbedb67-90: left promiscuous mode
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.144 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.148 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[92943d14-e5c5-4ce4-b92e-678dd67505a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.165 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e72a25a1-6c91-4512-906b-6b063f97b2a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.167 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0f6444b5-ec8a-4d36-91f4-b088ee112373]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.179 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4740e8ce-3c9f-44bf-960e-b592bab4f46b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 763207, 'reachable_time': 34960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273975, 'error': None, 'target': 'ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.183 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbbedb67-9b83-45ea-9c22-0cf9a0008776 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:33:37 np0005603622 systemd[1]: run-netns-ovnmeta\x2dfbbedb67\x2d9b83\x2d45ea\x2d9c22\x2d0cf9a0008776.mount: Deactivated successfully.
Jan 31 03:33:37 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:33:37.183 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[73e5554b-d590-4e82-8567-4a1a80d85065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.355 221324 DEBUG nova.compute.manager [req-f344ec58-297b-41f6-ba6b-b934a6a6663f req-b5a64a3f-ed91-4cdc-afdb-06437af55531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-vif-unplugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.356 221324 DEBUG oslo_concurrency.lockutils [req-f344ec58-297b-41f6-ba6b-b934a6a6663f req-b5a64a3f-ed91-4cdc-afdb-06437af55531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.356 221324 DEBUG oslo_concurrency.lockutils [req-f344ec58-297b-41f6-ba6b-b934a6a6663f req-b5a64a3f-ed91-4cdc-afdb-06437af55531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.356 221324 DEBUG oslo_concurrency.lockutils [req-f344ec58-297b-41f6-ba6b-b934a6a6663f req-b5a64a3f-ed91-4cdc-afdb-06437af55531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.357 221324 DEBUG nova.compute.manager [req-f344ec58-297b-41f6-ba6b-b934a6a6663f req-b5a64a3f-ed91-4cdc-afdb-06437af55531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] No waiting events found dispatching network-vif-unplugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.357 221324 DEBUG nova.compute.manager [req-f344ec58-297b-41f6-ba6b-b934a6a6663f req-b5a64a3f-ed91-4cdc-afdb-06437af55531 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-vif-unplugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:33:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:37.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:37 np0005603622 nova_compute[221301]: 2026-01-31 08:33:37.933 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:38.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:38 np0005603622 nova_compute[221301]: 2026-01-31 08:33:38.652 221324 INFO nova.virt.libvirt.driver [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Deleting instance files /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f_del#033[00m
Jan 31 03:33:38 np0005603622 nova_compute[221301]: 2026-01-31 08:33:38.653 221324 INFO nova.virt.libvirt.driver [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Deletion of /var/lib/nova/instances/8e88da8b-f1f3-484b-a632-555780f0de0f_del complete#033[00m
Jan 31 03:33:38 np0005603622 nova_compute[221301]: 2026-01-31 08:33:38.761 221324 INFO nova.compute.manager [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Took 1.98 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:33:38 np0005603622 nova_compute[221301]: 2026-01-31 08:33:38.762 221324 DEBUG oslo.service.loopingcall [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:33:38 np0005603622 nova_compute[221301]: 2026-01-31 08:33:38.764 221324 DEBUG nova.compute.manager [-] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:33:38 np0005603622 nova_compute[221301]: 2026-01-31 08:33:38.764 221324 DEBUG nova.network.neutron [-] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:33:39 np0005603622 nova_compute[221301]: 2026-01-31 08:33:39.566 221324 DEBUG nova.compute.manager [req-705dacb3-7c8c-4879-a40d-e51e6a2fc596 req-5711b9d9-7d45-4ada-a1a1-237c7d382d07 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:39 np0005603622 nova_compute[221301]: 2026-01-31 08:33:39.567 221324 DEBUG oslo_concurrency.lockutils [req-705dacb3-7c8c-4879-a40d-e51e6a2fc596 req-5711b9d9-7d45-4ada-a1a1-237c7d382d07 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:39 np0005603622 nova_compute[221301]: 2026-01-31 08:33:39.568 221324 DEBUG oslo_concurrency.lockutils [req-705dacb3-7c8c-4879-a40d-e51e6a2fc596 req-5711b9d9-7d45-4ada-a1a1-237c7d382d07 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:39 np0005603622 nova_compute[221301]: 2026-01-31 08:33:39.568 221324 DEBUG oslo_concurrency.lockutils [req-705dacb3-7c8c-4879-a40d-e51e6a2fc596 req-5711b9d9-7d45-4ada-a1a1-237c7d382d07 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:39 np0005603622 nova_compute[221301]: 2026-01-31 08:33:39.568 221324 DEBUG nova.compute.manager [req-705dacb3-7c8c-4879-a40d-e51e6a2fc596 req-5711b9d9-7d45-4ada-a1a1-237c7d382d07 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] No waiting events found dispatching network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:33:39 np0005603622 nova_compute[221301]: 2026-01-31 08:33:39.569 221324 WARNING nova.compute.manager [req-705dacb3-7c8c-4879-a40d-e51e6a2fc596 req-5711b9d9-7d45-4ada-a1a1-237c7d382d07 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received unexpected event network-vif-plugged-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:33:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:39.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:40.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:40 np0005603622 nova_compute[221301]: 2026-01-31 08:33:40.329 221324 DEBUG nova.network.neutron [-] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:40 np0005603622 nova_compute[221301]: 2026-01-31 08:33:40.393 221324 INFO nova.compute.manager [-] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Took 1.63 seconds to deallocate network for instance.#033[00m
Jan 31 03:33:40 np0005603622 nova_compute[221301]: 2026-01-31 08:33:40.722 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:40 np0005603622 nova_compute[221301]: 2026-01-31 08:33:40.722 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:41 np0005603622 nova_compute[221301]: 2026-01-31 08:33:41.218 221324 DEBUG oslo_concurrency.processutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3501450154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:41 np0005603622 nova_compute[221301]: 2026-01-31 08:33:41.686 221324 DEBUG oslo_concurrency.processutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:41 np0005603622 nova_compute[221301]: 2026-01-31 08:33:41.692 221324 DEBUG nova.compute.provider_tree [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:41 np0005603622 nova_compute[221301]: 2026-01-31 08:33:41.756 221324 DEBUG nova.scheduler.client.report [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:41 np0005603622 nova_compute[221301]: 2026-01-31 08:33:41.772 221324 DEBUG nova.compute.manager [req-f6f0ad3e-0249-4073-82b9-cafafdb8ff6f req-9e0ee98f-dd5d-4708-a505-beb67b83aae8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Received event network-vif-deleted-7bf485a9-4fd1-4c17-b23d-01cc90b31dfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:33:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:41.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:41 np0005603622 nova_compute[221301]: 2026-01-31 08:33:41.897 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:41 np0005603622 nova_compute[221301]: 2026-01-31 08:33:41.991 221324 INFO nova.scheduler.client.report [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Deleted allocations for instance 8e88da8b-f1f3-484b-a632-555780f0de0f#033[00m
Jan 31 03:33:42 np0005603622 nova_compute[221301]: 2026-01-31 08:33:42.035 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:42 np0005603622 nova_compute[221301]: 2026-01-31 08:33:42.128 221324 DEBUG oslo_concurrency.lockutils [None req-cf5001e0-5b9e-4bbe-b72a-fa043e624720 8efe4dc047d44a0fbd21b243ddc921d8 5fe2a450b6ff4f73a7fe8def47304a29 - - default default] Lock "8e88da8b-f1f3-484b-a632-555780f0de0f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:42.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:42 np0005603622 nova_compute[221301]: 2026-01-31 08:33:42.934 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:43.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:33:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:44.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:33:44 np0005603622 nova_compute[221301]: 2026-01-31 08:33:44.548 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:45.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:46.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:46 np0005603622 nova_compute[221301]: 2026-01-31 08:33:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:46 np0005603622 nova_compute[221301]: 2026-01-31 08:33:46.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:33:47 np0005603622 nova_compute[221301]: 2026-01-31 08:33:47.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:47 np0005603622 nova_compute[221301]: 2026-01-31 08:33:47.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:47.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:47 np0005603622 nova_compute[221301]: 2026-01-31 08:33:47.935 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:48.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:48 np0005603622 nova_compute[221301]: 2026-01-31 08:33:48.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:49 np0005603622 nova_compute[221301]: 2026-01-31 08:33:49.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:50.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.354 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.437 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.438 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.438 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.438 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.439 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:33:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:51.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:33:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/612937587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:51 np0005603622 nova_compute[221301]: 2026-01-31 08:33:51.889 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.005 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848417.003983, 8e88da8b-f1f3-484b-a632-555780f0de0f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.006 221324 INFO nova.compute.manager [-] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.040 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.135 221324 DEBUG nova.compute.manager [None req-f8dcac99-6bfd-42ab-a8e0-dc48f2b7a876 - - - - - -] [instance: 8e88da8b-f1f3-484b-a632-555780f0de0f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.143 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.144 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.147 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.147 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:33:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:52.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.300 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.301 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4051MB free_disk=20.85165786743164GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.302 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.302 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.489 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 884be038-2a08-4b69-a345-5b6ac2661dfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.489 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.490 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.490 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.561 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/413884936' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.970 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:52 np0005603622 nova_compute[221301]: 2026-01-31 08:33:52.974 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:53 np0005603622 nova_compute[221301]: 2026-01-31 08:33:53.086 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:53 np0005603622 nova_compute[221301]: 2026-01-31 08:33:53.145 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:33:53 np0005603622 nova_compute[221301]: 2026-01-31 08:33:53.146 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:33:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:53.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:33:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:54.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:55 np0005603622 nova_compute[221301]: 2026-01-31 08:33:55.146 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:55 np0005603622 nova_compute[221301]: 2026-01-31 08:33:55.146 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:33:55 np0005603622 nova_compute[221301]: 2026-01-31 08:33:55.626 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:33:55 np0005603622 nova_compute[221301]: 2026-01-31 08:33:55.626 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:33:55 np0005603622 nova_compute[221301]: 2026-01-31 08:33:55.626 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:33:55 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:55Z|00613|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:33:55 np0005603622 ovn_controller[130109]: 2026-01-31T08:33:55Z|00614|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:33:55 np0005603622 nova_compute[221301]: 2026-01-31 08:33:55.734 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:55.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:33:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:56.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.420 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.421 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.509 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.688 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.689 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.697 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.697 221324 INFO nova.compute.claims [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:33:56 np0005603622 nova_compute[221301]: 2026-01-31 08:33:56.904 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.041 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:33:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2256261312' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.361 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.368 221324 DEBUG nova.compute.provider_tree [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.503 221324 DEBUG nova.scheduler.client.report [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.595 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.596 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.724 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updating instance_info_cache with network_info: [{"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.746 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.747 221324 DEBUG nova.network.neutron [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.751 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.751 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.751 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.751 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.752 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:33:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:57.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.849 221324 INFO nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.882 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:33:57 np0005603622 nova_compute[221301]: 2026-01-31 08:33:57.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:33:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.199 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.200 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.201 221324 INFO nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Creating image(s)#033[00m
Jan 31 03:33:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:33:58.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.230 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.259 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.292 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.298 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.344 221324 DEBUG nova.policy [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6733330b634472ca8c21316f1ee5057', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e29363ca464487b931af54fe14166b1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.368 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.369 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.370 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.370 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.399 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.403 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.834 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:33:58 np0005603622 nova_compute[221301]: 2026-01-31 08:33:58.913 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] resizing rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:33:59 np0005603622 nova_compute[221301]: 2026-01-31 08:33:59.088 221324 DEBUG nova.objects.instance [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'migration_context' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:33:59 np0005603622 nova_compute[221301]: 2026-01-31 08:33:59.159 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:33:59 np0005603622 nova_compute[221301]: 2026-01-31 08:33:59.161 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Ensure instance console log exists: /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:33:59 np0005603622 nova_compute[221301]: 2026-01-31 08:33:59.162 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:33:59 np0005603622 nova_compute[221301]: 2026-01-31 08:33:59.162 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:33:59 np0005603622 nova_compute[221301]: 2026-01-31 08:33:59.163 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:33:59 np0005603622 nova_compute[221301]: 2026-01-31 08:33:59.348 221324 DEBUG nova.network.neutron [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Successfully created port: 664dc5ff-72ac-4e4a-8581-6071db25143c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:33:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:33:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:33:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:33:59.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:00.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.501 221324 DEBUG nova.network.neutron [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Successfully updated port: 664dc5ff-72ac-4e4a-8581-6071db25143c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.625 221324 DEBUG nova.compute.manager [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-changed-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.625 221324 DEBUG nova.compute.manager [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Refreshing instance network info cache due to event network-changed-664dc5ff-72ac-4e4a-8581-6071db25143c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.626 221324 DEBUG oslo_concurrency.lockutils [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.627 221324 DEBUG oslo_concurrency.lockutils [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.627 221324 DEBUG nova.network.neutron [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Refreshing network info cache for port 664dc5ff-72ac-4e4a-8581-6071db25143c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.654 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:00 np0005603622 nova_compute[221301]: 2026-01-31 08:34:00.875 221324 DEBUG nova.network.neutron [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:34:01 np0005603622 nova_compute[221301]: 2026-01-31 08:34:01.304 221324 DEBUG nova.network.neutron [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:01 np0005603622 nova_compute[221301]: 2026-01-31 08:34:01.544 221324 DEBUG oslo_concurrency.lockutils [req-929bab8c-6784-4369-b7d9-4c42fe1d1fd1 req-d6956a26-eb63-4af6-b480-05a41252569c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:01 np0005603622 nova_compute[221301]: 2026-01-31 08:34:01.545 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquired lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:01 np0005603622 nova_compute[221301]: 2026-01-31 08:34:01.545 221324 DEBUG nova.network.neutron [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:34:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:01.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:01 np0005603622 nova_compute[221301]: 2026-01-31 08:34:01.853 221324 DEBUG nova.network.neutron [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:34:02 np0005603622 nova_compute[221301]: 2026-01-31 08:34:02.043 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:02.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:02 np0005603622 nova_compute[221301]: 2026-01-31 08:34:02.942 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:03Z|00615|binding|INFO|Releasing lport e9a7861c-c6ea-4166-9252-dc2aacdf4771 from this chassis (sb_readonly=0)
Jan 31 03:34:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:03Z|00616|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.525 221324 DEBUG nova.network.neutron [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.912 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Releasing lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.913 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance network_info: |[{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.919 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Start _get_guest_xml network_info=[{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.927 221324 WARNING nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.936 221324 DEBUG nova.virt.libvirt.host [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.937 221324 DEBUG nova.virt.libvirt.host [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.942 221324 DEBUG nova.virt.libvirt.host [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.943 221324 DEBUG nova.virt.libvirt.host [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.946 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.947 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.948 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.948 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.949 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.950 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.950 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.951 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.951 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.952 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.952 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.953 221324 DEBUG nova.virt.hardware [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:34:03 np0005603622 nova_compute[221301]: 2026-01-31 08:34:03.959 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:04.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3431533195' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.493 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.540 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.544 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2577704909' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.976 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.977 221324 DEBUG nova.virt.libvirt.vif [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:33:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-232987388',display_name='tempest-ServerStableDeviceRescueTest-server-232987388',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-232987388',id=139,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-fnf190ma',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_user_name='te
mpest-ServerStableDeviceRescueTest-319343227-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:33:57Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9aac65c2-63f8-4383-80ed-6600852c6fb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.978 221324 DEBUG nova.network.os_vif_util [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.979 221324 DEBUG nova.network.os_vif_util [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:04 np0005603622 nova_compute[221301]: 2026-01-31 08:34:04.980 221324 DEBUG nova.objects.instance [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.024 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <uuid>9aac65c2-63f8-4383-80ed-6600852c6fb5</uuid>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <name>instance-0000008b</name>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-232987388</nova:name>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:34:03</nova:creationTime>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:user uuid="b6733330b634472ca8c21316f1ee5057">tempest-ServerStableDeviceRescueTest-319343227-project-member</nova:user>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:project uuid="1e29363ca464487b931af54fe14166b1">tempest-ServerStableDeviceRescueTest-319343227</nova:project>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <nova:port uuid="664dc5ff-72ac-4e4a-8581-6071db25143c">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <entry name="serial">9aac65c2-63f8-4383-80ed-6600852c6fb5</entry>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <entry name="uuid">9aac65c2-63f8-4383-80ed-6600852c6fb5</entry>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9aac65c2-63f8-4383-80ed-6600852c6fb5_disk">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:c8:fe:3d"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <target dev="tap664dc5ff-72"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/console.log" append="off"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:34:05 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:34:05 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:34:05 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:34:05 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.026 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Preparing to wait for external event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.027 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.027 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.027 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.028 221324 DEBUG nova.virt.libvirt.vif [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:33:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-232987388',display_name='tempest-ServerStableDeviceRescueTest-server-232987388',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-232987388',id=139,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-fnf190ma',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_use
r_name='tempest-ServerStableDeviceRescueTest-319343227-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:33:57Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9aac65c2-63f8-4383-80ed-6600852c6fb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.029 221324 DEBUG nova.network.os_vif_util [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.030 221324 DEBUG nova.network.os_vif_util [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.030 221324 DEBUG os_vif [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.031 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.031 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.032 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.037 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap664dc5ff-72, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.037 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap664dc5ff-72, col_values=(('external_ids', {'iface-id': '664dc5ff-72ac-4e4a-8581-6071db25143c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:fe:3d', 'vm-uuid': '9aac65c2-63f8-4383-80ed-6600852c6fb5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.040 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:05 np0005603622 NetworkManager[49080]: <info>  [1769848445.0418] manager: (tap664dc5ff-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.049 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.050 221324 INFO os_vif [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72')#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.229 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.230 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.230 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No VIF found with MAC fa:16:3e:c8:fe:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.232 221324 INFO nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Using config drive#033[00m
Jan 31 03:34:05 np0005603622 nova_compute[221301]: 2026-01-31 08:34:05.281 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:05.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:06.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:07 np0005603622 podman[274318]: 2026-01-31 08:34:07.145387352 +0000 UTC m=+0.064706428 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:34:07 np0005603622 podman[274317]: 2026-01-31 08:34:07.173921433 +0000 UTC m=+0.093297190 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.186 221324 INFO nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Creating config drive at /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.192 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnk14tcrh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.322 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnk14tcrh" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.362 221324 DEBUG nova.storage.rbd_utils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.368 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.558 221324 DEBUG oslo_concurrency.processutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.190s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.560 221324 INFO nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Deleting local config drive /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config because it was imported into RBD.#033[00m
Jan 31 03:34:07 np0005603622 kernel: tap664dc5ff-72: entered promiscuous mode
Jan 31 03:34:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:07Z|00617|binding|INFO|Claiming lport 664dc5ff-72ac-4e4a-8581-6071db25143c for this chassis.
Jan 31 03:34:07 np0005603622 NetworkManager[49080]: <info>  [1769848447.6140] manager: (tap664dc5ff-72): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.614 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:07Z|00618|binding|INFO|664dc5ff-72ac-4e4a-8581-6071db25143c: Claiming fa:16:3e:c8:fe:3d 10.100.0.13
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.621 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:07Z|00619|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c ovn-installed in OVS
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.624 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:07Z|00620|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c up in Southbound
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.634 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:fe:3d 10.100.0.13'], port_security=['fa:16:3e:c8:fe:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9aac65c2-63f8-4383-80ed-6600852c6fb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=664dc5ff-72ac-4e4a-8581-6071db25143c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.636 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 664dc5ff-72ac-4e4a-8581-6071db25143c in datapath 31da00d3-077b-4620-a7d3-68186467ab47 bound to our chassis#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.639 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:34:07 np0005603622 systemd-machined[190406]: New machine qemu-63-instance-0000008b.
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.650 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7556609f-90ef-4108-bc4f-3e1a4cf51c58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.651 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap31da00d3-01 in ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.655 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap31da00d3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.656 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[38241795-39d4-4f56-9c9b-c02dd68acda6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.657 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[99708c4c-fa24-4763-b953-d79ce63327a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 systemd[1]: Started Virtual Machine qemu-63-instance-0000008b.
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.671 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[5456b280-ed92-43d3-9157-85af8e46db46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 systemd-udevd[274417]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:07 np0005603622 NetworkManager[49080]: <info>  [1769848447.6960] device (tap664dc5ff-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.694 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[410b1123-ab22-48ea-9d19-5f3e9afa03e9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 NetworkManager[49080]: <info>  [1769848447.6977] device (tap664dc5ff-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.721 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[58394645-e2ac-451e-8df5-3307f280d6d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 NetworkManager[49080]: <info>  [1769848447.7279] manager: (tap31da00d3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/287)
Jan 31 03:34:07 np0005603622 systemd-udevd[274420]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.727 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbafcda-4856-4682-a65e-fe041c234bac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.750 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ed15cad9-ea98-4b0c-8829-ab919d6ba5f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.753 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ef1bf103-e51e-463b-9f4a-04c7309500ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 NetworkManager[49080]: <info>  [1769848447.7701] device (tap31da00d3-00): carrier: link connected
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.775 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8512de04-20b7-4006-af23-46ca49bac024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.792 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f88baf34-58d5-42d5-bace-ec20435f8613]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769911, 'reachable_time': 27113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274448, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.808 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2567dacd-1576-482a-b2ca-cbaeff7b9140]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:4f2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 769911, 'tstamp': 769911}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274449, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.825 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a7696ffc-9d78-4977-8bb6-20190c52ac7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769911, 'reachable_time': 27113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274450, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:07.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.851 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7441c9ab-e103-4c37-b6af-30cac491462b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.894 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[51204c9d-bdaf-4e01-8c45-24fb6f66a71a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.895 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.896 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.896 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.898 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603622 kernel: tap31da00d3-00: entered promiscuous mode
Jan 31 03:34:07 np0005603622 NetworkManager[49080]: <info>  [1769848447.8996] manager: (tap31da00d3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.902 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.903 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.904 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:07Z|00621|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.912 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.913 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.914 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[07ee15fb-7a02-4199-b5c7-cbbc5b340589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.915 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-31da00d3-077b-4620-a7d3-68186467ab47
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 31da00d3-077b-4620-a7d3-68186467ab47
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 03:34:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:07.916 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'env', 'PROCESS_TAG=haproxy-31da00d3-077b-4620-a7d3-68186467ab47', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/31da00d3-077b-4620-a7d3-68186467ab47.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 03:34:07 np0005603622 nova_compute[221301]: 2026-01-31 08:34:07.943 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:08.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.328 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848448.3272169, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.330 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Started (Lifecycle Event)
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.354 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.360 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848448.328082, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.360 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Paused (Lifecycle Event)
Jan 31 03:34:08 np0005603622 podman[274518]: 2026-01-31 08:34:08.278649432 +0000 UTC m=+0.042249768 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:34:08 np0005603622 podman[274518]: 2026-01-31 08:34:08.375749154 +0000 UTC m=+0.139349420 container create e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.384 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.390 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:34:08 np0005603622 nova_compute[221301]: 2026-01-31 08:34:08.414 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:34:08 np0005603622 systemd[1]: Started libpod-conmon-e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16.scope.
Jan 31 03:34:08 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:34:08 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73732417322e643760502f1d042bf0673fd51424dce8645b93196cd84dcd9bf6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:34:08 np0005603622 podman[274518]: 2026-01-31 08:34:08.481744122 +0000 UTC m=+0.245344448 container init e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:34:08 np0005603622 podman[274518]: 2026-01-31 08:34:08.48840456 +0000 UTC m=+0.252004846 container start e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:34:08 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[274539]: [NOTICE]   (274543) : New worker (274545) forked
Jan 31 03:34:08 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[274539]: [NOTICE]   (274543) : Loading success.
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.436 221324 DEBUG nova.compute.manager [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.437 221324 DEBUG oslo_concurrency.lockutils [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.437 221324 DEBUG oslo_concurrency.lockutils [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.438 221324 DEBUG oslo_concurrency.lockutils [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.439 221324 DEBUG nova.compute.manager [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Processing event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.439 221324 DEBUG nova.compute.manager [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.440 221324 DEBUG oslo_concurrency.lockutils [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.440 221324 DEBUG oslo_concurrency.lockutils [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.441 221324 DEBUG oslo_concurrency.lockutils [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.441 221324 DEBUG nova.compute.manager [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.442 221324 WARNING nova.compute.manager [req-0f278b9a-80d8-4d13-a605-2ffd3d688539 req-6c9b1567-6b4a-4052-a50a-9a1aab1d3423 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state building and task_state spawning.
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.443 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.449 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848449.4495225, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.450 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Resumed (Lifecycle Event)
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.453 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.458 221324 INFO nova.virt.libvirt.driver [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance spawned successfully.
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.458 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.499 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.504 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.505 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.506 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.506 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.507 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.508 221324 DEBUG nova.virt.libvirt.driver [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.516 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.556 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.612 221324 INFO nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Took 11.41 seconds to spawn the instance on the hypervisor.
Jan 31 03:34:09 np0005603622 nova_compute[221301]: 2026-01-31 08:34:09.613 221324 DEBUG nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e316 e316: 3 total, 3 up, 3 in
Jan 31 03:34:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:09.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:10 np0005603622 nova_compute[221301]: 2026-01-31 08:34:10.001 221324 INFO nova.compute.manager [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Took 13.34 seconds to build instance.
Jan 31 03:34:10 np0005603622 nova_compute[221301]: 2026-01-31 08:34:10.041 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:10 np0005603622 nova_compute[221301]: 2026-01-31 08:34:10.062 221324 DEBUG oslo_concurrency.lockutils [None req-cba69ddf-e1f4-4fad-83f1-b6f85d7fd7fc b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:34:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:10.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e317 e317: 3 total, 3 up, 3 in
Jan 31 03:34:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e318 e318: 3 total, 3 up, 3 in
Jan 31 03:34:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:11.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:12.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:12 np0005603622 nova_compute[221301]: 2026-01-31 08:34:12.345 221324 DEBUG nova.compute.manager [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:12 np0005603622 nova_compute[221301]: 2026-01-31 08:34:12.425 221324 INFO nova.compute.manager [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] instance snapshotting
Jan 31 03:34:12 np0005603622 nova_compute[221301]: 2026-01-31 08:34:12.946 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:13 np0005603622 nova_compute[221301]: 2026-01-31 08:34:13.214 221324 INFO nova.virt.libvirt.driver [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Beginning live snapshot process
Jan 31 03:34:13 np0005603622 nova_compute[221301]: 2026-01-31 08:34:13.409 221324 DEBUG nova.virt.libvirt.imagebackend [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Jan 31 03:34:13 np0005603622 nova_compute[221301]: 2026-01-31 08:34:13.675 221324 DEBUG nova.storage.rbd_utils [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] creating snapshot(896e155c1bb64d779ae18035693dcbf6) on rbd image(9aac65c2-63f8-4383-80ed-6600852c6fb5_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 31 03:34:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e319 e319: 3 total, 3 up, 3 in
Jan 31 03:34:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:13.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:13 np0005603622 nova_compute[221301]: 2026-01-31 08:34:13.850 221324 DEBUG nova.storage.rbd_utils [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] cloning vms/9aac65c2-63f8-4383-80ed-6600852c6fb5_disk@896e155c1bb64d779ae18035693dcbf6 to images/15979fb2-ffd4-4e76-9dfb-7c7ce052fa99 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 31 03:34:13 np0005603622 nova_compute[221301]: 2026-01-31 08:34:13.979 221324 DEBUG nova.storage.rbd_utils [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] flattening images/15979fb2-ffd4-4e76-9dfb-7c7ce052fa99 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 31 03:34:14 np0005603622 nova_compute[221301]: 2026-01-31 08:34:14.246 221324 DEBUG nova.storage.rbd_utils [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] removing snapshot(896e155c1bb64d779ae18035693dcbf6) on rbd image(9aac65c2-63f8-4383-80ed-6600852c6fb5_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 31 03:34:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:14.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e320 e320: 3 total, 3 up, 3 in
Jan 31 03:34:14 np0005603622 nova_compute[221301]: 2026-01-31 08:34:14.859 221324 DEBUG nova.storage.rbd_utils [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] creating snapshot(snap) on rbd image(15979fb2-ffd4-4e76-9dfb-7c7ce052fa99) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 31 03:34:15 np0005603622 nova_compute[221301]: 2026-01-31 08:34:15.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:15.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e321 e321: 3 total, 3 up, 3 in
Jan 31 03:34:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:16.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:34:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:34:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:34:17 np0005603622 nova_compute[221301]: 2026-01-31 08:34:17.618 221324 INFO nova.virt.libvirt.driver [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Snapshot image upload complete
Jan 31 03:34:17 np0005603622 nova_compute[221301]: 2026-01-31 08:34:17.620 221324 INFO nova.compute.manager [None req-a9cd8e53-7d71-4520-812b-358187f470b6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Took 5.19 seconds to snapshot the instance on the hypervisor.
Jan 31 03:34:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:17.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:17 np0005603622 nova_compute[221301]: 2026-01-31 08:34:17.963 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:18.182 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:18 np0005603622 nova_compute[221301]: 2026-01-31 08:34:18.182 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:18.183 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:34:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:18.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:19.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:20 np0005603622 nova_compute[221301]: 2026-01-31 08:34:20.046 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:20.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e322 e322: 3 total, 3 up, 3 in
Jan 31 03:34:20 np0005603622 nova_compute[221301]: 2026-01-31 08:34:20.529 221324 INFO nova.compute.manager [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Rescuing#033[00m
Jan 31 03:34:20 np0005603622 nova_compute[221301]: 2026-01-31 08:34:20.530 221324 DEBUG oslo_concurrency.lockutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:20 np0005603622 nova_compute[221301]: 2026-01-31 08:34:20.530 221324 DEBUG oslo_concurrency.lockutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquired lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:20 np0005603622 nova_compute[221301]: 2026-01-31 08:34:20.530 221324 DEBUG nova.network.neutron [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:34:21 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #50. Immutable memtables: 7.
Jan 31 03:34:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:21.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:22.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:22 np0005603622 nova_compute[221301]: 2026-01-31 08:34:22.546 221324 DEBUG nova.network.neutron [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:22 np0005603622 nova_compute[221301]: 2026-01-31 08:34:22.593 221324 DEBUG oslo_concurrency.lockutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Releasing lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:22 np0005603622 nova_compute[221301]: 2026-01-31 08:34:22.894 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:34:22 np0005603622 nova_compute[221301]: 2026-01-31 08:34:22.987 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:34:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:34:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:23 np0005603622 nova_compute[221301]: 2026-01-31 08:34:23.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:23.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:24.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:25 np0005603622 nova_compute[221301]: 2026-01-31 08:34:25.050 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:25 np0005603622 kernel: tap664dc5ff-72 (unregistering): left promiscuous mode
Jan 31 03:34:25 np0005603622 NetworkManager[49080]: <info>  [1769848465.7905] device (tap664dc5ff-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:34:25 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:25Z|00622|binding|INFO|Releasing lport 664dc5ff-72ac-4e4a-8581-6071db25143c from this chassis (sb_readonly=0)
Jan 31 03:34:25 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:25Z|00623|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c down in Southbound
Jan 31 03:34:25 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:25Z|00624|binding|INFO|Removing iface tap664dc5ff-72 ovn-installed in OVS
Jan 31 03:34:25 np0005603622 nova_compute[221301]: 2026-01-31 08:34:25.797 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:25 np0005603622 nova_compute[221301]: 2026-01-31 08:34:25.800 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:25 np0005603622 nova_compute[221301]: 2026-01-31 08:34:25.808 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:25 np0005603622 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 31 03:34:25 np0005603622 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000008b.scope: Consumed 13.715s CPU time.
Jan 31 03:34:25 np0005603622 systemd-machined[190406]: Machine qemu-63-instance-0000008b terminated.
Jan 31 03:34:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:25.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:25.868 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:fe:3d 10.100.0.13'], port_security=['fa:16:3e:c8:fe:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9aac65c2-63f8-4383-80ed-6600852c6fb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=664dc5ff-72ac-4e4a-8581-6071db25143c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:25.870 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 664dc5ff-72ac-4e4a-8581-6071db25143c in datapath 31da00d3-077b-4620-a7d3-68186467ab47 unbound from our chassis#033[00m
Jan 31 03:34:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:25.872 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31da00d3-077b-4620-a7d3-68186467ab47, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:34:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:25.873 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2d1a6b-7bc0-4ec5-963c-8be9fee7c6cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:25.874 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 namespace which is not needed anymore#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.014 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.019 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.033 221324 INFO nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.038 221324 INFO nova.virt.libvirt.driver [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance destroyed successfully.#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.039 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.076 221324 INFO nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Attempting a stable device rescue#033[00m
Jan 31 03:34:26 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[274539]: [NOTICE]   (274543) : haproxy version is 2.8.14-c23fe91
Jan 31 03:34:26 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[274539]: [NOTICE]   (274543) : path to executable is /usr/sbin/haproxy
Jan 31 03:34:26 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[274539]: [WARNING]  (274543) : Exiting Master process...
Jan 31 03:34:26 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[274539]: [ALERT]    (274543) : Current worker (274545) exited with code 143 (Terminated)
Jan 31 03:34:26 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[274539]: [WARNING]  (274543) : All workers exited. Exiting... (0)
Jan 31 03:34:26 np0005603622 systemd[1]: libpod-e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16.scope: Deactivated successfully.
Jan 31 03:34:26 np0005603622 podman[274902]: 2026-01-31 08:34:26.126886263 +0000 UTC m=+0.154800142 container died e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.281 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.286 221324 DEBUG nova.compute.manager [req-1252216c-62f6-4b51-9422-6a34f7d30e65 req-d8ea22ec-862a-4319-86fc-88f7ac9b7cd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.287 221324 DEBUG oslo_concurrency.lockutils [req-1252216c-62f6-4b51-9422-6a34f7d30e65 req-d8ea22ec-862a-4319-86fc-88f7ac9b7cd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:26.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.287 221324 DEBUG oslo_concurrency.lockutils [req-1252216c-62f6-4b51-9422-6a34f7d30e65 req-d8ea22ec-862a-4319-86fc-88f7ac9b7cd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.288 221324 DEBUG oslo_concurrency.lockutils [req-1252216c-62f6-4b51-9422-6a34f7d30e65 req-d8ea22ec-862a-4319-86fc-88f7ac9b7cd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.288 221324 DEBUG nova.compute.manager [req-1252216c-62f6-4b51-9422-6a34f7d30e65 req-d8ea22ec-862a-4319-86fc-88f7ac9b7cd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.289 221324 WARNING nova.compute.manager [req-1252216c-62f6-4b51-9422-6a34f7d30e65 req-d8ea22ec-862a-4319-86fc-88f7ac9b7cd6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.297 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.298 221324 INFO nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Creating image(s)#033[00m
Jan 31 03:34:26 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16-userdata-shm.mount: Deactivated successfully.
Jan 31 03:34:26 np0005603622 systemd[1]: var-lib-containers-storage-overlay-73732417322e643760502f1d042bf0673fd51424dce8645b93196cd84dcd9bf6-merged.mount: Deactivated successfully.
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.335 221324 DEBUG nova.storage.rbd_utils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.340 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.524 221324 DEBUG nova.storage.rbd_utils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:26 np0005603622 podman[274902]: 2026-01-31 08:34:26.53676505 +0000 UTC m=+0.564678929 container cleanup e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:34:26 np0005603622 systemd[1]: libpod-conmon-e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16.scope: Deactivated successfully.
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.559 221324 DEBUG nova.storage.rbd_utils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.563 221324 DEBUG oslo_concurrency.lockutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "ba78cdb8a9eae648c56a72e794bc53414ccc06d4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.564 221324 DEBUG oslo_concurrency.lockutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "ba78cdb8a9eae648c56a72e794bc53414ccc06d4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.754 221324 DEBUG nova.virt.libvirt.imagebackend [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/15979fb2-ffd4-4e76-9dfb-7c7ce052fa99/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/15979fb2-ffd4-4e76-9dfb-7c7ce052fa99/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:34:26 np0005603622 podman[274994]: 2026-01-31 08:34:26.772774688 +0000 UTC m=+0.216207460 container remove e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.777 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2466690e-e763-4183-b0c7-a170c9536721]: (4, ('Sat Jan 31 08:34:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 (e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16)\ne875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16\nSat Jan 31 08:34:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 (e875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16)\ne875368f95a0b684b1a09f6d5f41ec21336580c00d2e37fdec8aca60de33aa16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.779 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a66ea175-2efd-49d5-8a71-36d3559f0847]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.780 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:26 np0005603622 kernel: tap31da00d3-00: left promiscuous mode
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.795 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[313f9854-ebfe-4d01-997d-15e23e81e1cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.816 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b7b24d49-c7a0-4573-85de-de08ea2487ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.818 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3cea32bb-763b-4267-9072-80e1d4e62695]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.822 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.829 221324 DEBUG nova.virt.libvirt.imagebackend [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/15979fb2-ffd4-4e76-9dfb-7c7ce052fa99/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.830 221324 DEBUG nova.storage.rbd_utils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] cloning images/15979fb2-ffd4-4e76-9dfb-7c7ce052fa99@snap to None/9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.831 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[df1dcefd-5900-4df6-a40a-22599221878d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769905, 'reachable_time': 27752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275049, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.833 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:34:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:26.834 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[4c836d95-3bb9-41fe-a8f7-3935397b2e3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:26 np0005603622 systemd[1]: run-netns-ovnmeta\x2d31da00d3\x2d077b\x2d4620\x2da7d3\x2d68186467ab47.mount: Deactivated successfully.
Jan 31 03:34:26 np0005603622 nova_compute[221301]: 2026-01-31 08:34:26.985 221324 DEBUG oslo_concurrency.lockutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "ba78cdb8a9eae648c56a72e794bc53414ccc06d4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.077 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'migration_context' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:27.184 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.440 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.443 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Start _get_guest_xml network_info=[{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "vif_mac": "fa:16:3e:c8:fe:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '15979fb2-ffd4-4e76-9dfb-7c7ce052fa99', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.443 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'resources' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.708 221324 WARNING nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.714 221324 DEBUG nova.virt.libvirt.host [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.715 221324 DEBUG nova.virt.libvirt.host [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.717 221324 DEBUG nova.virt.libvirt.host [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.718 221324 DEBUG nova.virt.libvirt.host [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.719 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.719 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.720 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.720 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.720 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.721 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.721 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.721 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.721 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.722 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.722 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.722 221324 DEBUG nova.virt.hardware [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.722 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:27.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:27 np0005603622 nova_compute[221301]: 2026-01-31 08:34:27.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.154 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:28.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.475 221324 DEBUG nova.compute.manager [req-229e7430-f295-4f66-9b71-5cd791b46a42 req-7047494a-5613-414f-8197-c7b300acc71f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.475 221324 DEBUG oslo_concurrency.lockutils [req-229e7430-f295-4f66-9b71-5cd791b46a42 req-7047494a-5613-414f-8197-c7b300acc71f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.476 221324 DEBUG oslo_concurrency.lockutils [req-229e7430-f295-4f66-9b71-5cd791b46a42 req-7047494a-5613-414f-8197-c7b300acc71f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.476 221324 DEBUG oslo_concurrency.lockutils [req-229e7430-f295-4f66-9b71-5cd791b46a42 req-7047494a-5613-414f-8197-c7b300acc71f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.477 221324 DEBUG nova.compute.manager [req-229e7430-f295-4f66-9b71-5cd791b46a42 req-7047494a-5613-414f-8197-c7b300acc71f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.477 221324 WARNING nova.compute.manager [req-229e7430-f295-4f66-9b71-5cd791b46a42 req-7047494a-5613-414f-8197-c7b300acc71f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:34:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2460703301' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.632 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:28 np0005603622 nova_compute[221301]: 2026-01-31 08:34:28.680 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/824317352' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:29 np0005603622 nova_compute[221301]: 2026-01-31 08:34:29.126 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:29 np0005603622 nova_compute[221301]: 2026-01-31 08:34:29.128 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e323 e323: 3 total, 3 up, 3 in
Jan 31 03:34:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:34:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4143710692' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:34:29 np0005603622 nova_compute[221301]: 2026-01-31 08:34:29.594 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:29 np0005603622 nova_compute[221301]: 2026-01-31 08:34:29.596 221324 DEBUG nova.virt.libvirt.vif [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:33:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-232987388',display_name='tempest-ServerStableDeviceRescueTest-server-232987388',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-232987388',id=139,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:34:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-fnf190ma',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_user_name='tempest-ServerStableDeviceRescueTest-319343227-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:34:17Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9aac65c2-63f8-4383-80ed-6600852c6fb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "vif_mac": "fa:16:3e:c8:fe:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:34:29 np0005603622 nova_compute[221301]: 2026-01-31 08:34:29.597 221324 DEBUG nova.network.os_vif_util [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "vif_mac": "fa:16:3e:c8:fe:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:34:29 np0005603622 nova_compute[221301]: 2026-01-31 08:34:29.598 221324 DEBUG nova.network.os_vif_util [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:34:29 np0005603622 nova_compute[221301]: 2026-01-31 08:34:29.600 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:29.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.052 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.060 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <uuid>9aac65c2-63f8-4383-80ed-6600852c6fb5</uuid>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <name>instance-0000008b</name>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-232987388</nova:name>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:34:27</nova:creationTime>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:user uuid="b6733330b634472ca8c21316f1ee5057">tempest-ServerStableDeviceRescueTest-319343227-project-member</nova:user>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:project uuid="1e29363ca464487b931af54fe14166b1">tempest-ServerStableDeviceRescueTest-319343227</nova:project>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <nova:port uuid="664dc5ff-72ac-4e4a-8581-6071db25143c">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <entry name="serial">9aac65c2-63f8-4383-80ed-6600852c6fb5</entry>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <entry name="uuid">9aac65c2-63f8-4383-80ed-6600852c6fb5</entry>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9aac65c2-63f8-4383-80ed-6600852c6fb5_disk">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.rescue">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <target dev="sdb" bus="scsi"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <boot order="1"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:c8:fe:3d"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <target dev="tap664dc5ff-72"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/console.log" append="off"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:34:30 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:34:30 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:34:30 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:34:30 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.070 221324 INFO nova.virt.libvirt.driver [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance destroyed successfully.#033[00m
Jan 31 03:34:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:30.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.351249) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470351339, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 1571, "num_deletes": 259, "total_data_size": 3174496, "memory_usage": 3218480, "flush_reason": "Manual Compaction"}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470368511, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 2079860, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 58118, "largest_seqno": 59684, "table_properties": {"data_size": 2073364, "index_size": 3571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14618, "raw_average_key_size": 20, "raw_value_size": 2059856, "raw_average_value_size": 2837, "num_data_blocks": 157, "num_entries": 726, "num_filter_entries": 726, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848360, "oldest_key_time": 1769848360, "file_creation_time": 1769848470, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 17319 microseconds, and 6739 cpu microseconds.
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.368565) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 2079860 bytes OK
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.368588) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.390538) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.390598) EVENT_LOG_v1 {"time_micros": 1769848470390585, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.390630) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 3167176, prev total WAL file size 3167176, number of live WAL files 2.
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.391801) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303132' seq:72057594037927935, type:22 .. '6C6F676D0032323634' seq:0, type:0; will stop at (end)
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(2031KB)], [114(10MB)]
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470391870, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 13536340, "oldest_snapshot_seqno": -1}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 8402 keys, 13400678 bytes, temperature: kUnknown
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470513226, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 13400678, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13343292, "index_size": 35258, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21061, "raw_key_size": 217804, "raw_average_key_size": 25, "raw_value_size": 13192799, "raw_average_value_size": 1570, "num_data_blocks": 1387, "num_entries": 8402, "num_filter_entries": 8402, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848470, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.513559) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 13400678 bytes
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.518964) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.4 rd, 110.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 10.9 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(13.0) write-amplify(6.4) OK, records in: 8939, records dropped: 537 output_compression: NoCompression
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.518981) EVENT_LOG_v1 {"time_micros": 1769848470518972, "job": 72, "event": "compaction_finished", "compaction_time_micros": 121471, "compaction_time_cpu_micros": 42181, "output_level": 6, "num_output_files": 1, "total_output_size": 13400678, "num_input_records": 8939, "num_output_records": 8402, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470519298, "job": 72, "event": "table_file_deletion", "file_number": 116}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848470520379, "job": 72, "event": "table_file_deletion", "file_number": 114}
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.391718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.520483) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.520493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.520498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.520502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:34:30.520507) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:34:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e324 e324: 3 total, 3 up, 3 in
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.884 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.885 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.885 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.885 221324 DEBUG nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No VIF found with MAC fa:16:3e:c8:fe:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.887 221324 INFO nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Using config drive#033[00m
Jan 31 03:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:30.888 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:30.889 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:30.890 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:30 np0005603622 nova_compute[221301]: 2026-01-31 08:34:30.927 221324 DEBUG nova.storage.rbd_utils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:31 np0005603622 nova_compute[221301]: 2026-01-31 08:34:31.124 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:31 np0005603622 nova_compute[221301]: 2026-01-31 08:34:31.335 221324 DEBUG nova.objects.instance [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'keypairs' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e325 e325: 3 total, 3 up, 3 in
Jan 31 03:34:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:31.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:32.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:32 np0005603622 nova_compute[221301]: 2026-01-31 08:34:32.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:33 np0005603622 nova_compute[221301]: 2026-01-31 08:34:33.446 221324 INFO nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Creating config drive at /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config.rescue#033[00m
Jan 31 03:34:33 np0005603622 nova_compute[221301]: 2026-01-31 08:34:33.450 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9m6ii0v1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:33 np0005603622 nova_compute[221301]: 2026-01-31 08:34:33.573 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9m6ii0v1" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:33 np0005603622 nova_compute[221301]: 2026-01-31 08:34:33.597 221324 DEBUG nova.storage.rbd_utils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:34:33 np0005603622 nova_compute[221301]: 2026-01-31 08:34:33.600 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config.rescue 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:33.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.119 221324 DEBUG oslo_concurrency.processutils [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config.rescue 9aac65c2-63f8-4383-80ed-6600852c6fb5_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.119 221324 INFO nova.virt.libvirt.driver [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Deleting local config drive /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5/disk.config.rescue because it was imported into RBD.
Jan 31 03:34:34 np0005603622 kernel: tap664dc5ff-72: entered promiscuous mode
Jan 31 03:34:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:34Z|00625|binding|INFO|Claiming lport 664dc5ff-72ac-4e4a-8581-6071db25143c for this chassis.
Jan 31 03:34:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:34Z|00626|binding|INFO|664dc5ff-72ac-4e4a-8581-6071db25143c: Claiming fa:16:3e:c8:fe:3d 10.100.0.13
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.156 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:34 np0005603622 NetworkManager[49080]: <info>  [1769848474.1590] manager: (tap664dc5ff-72): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Jan 31 03:34:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:34Z|00627|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c ovn-installed in OVS
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.165 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.168 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:34 np0005603622 systemd-udevd[275259]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:34 np0005603622 NetworkManager[49080]: <info>  [1769848474.1884] device (tap664dc5ff-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:34:34 np0005603622 NetworkManager[49080]: <info>  [1769848474.1891] device (tap664dc5ff-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:34:34 np0005603622 systemd-machined[190406]: New machine qemu-64-instance-0000008b.
Jan 31 03:34:34 np0005603622 systemd[1]: Started Virtual Machine qemu-64-instance-0000008b.
Jan 31 03:34:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:34Z|00628|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c up in Southbound
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.219 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:fe:3d 10.100.0.13'], port_security=['fa:16:3e:c8:fe:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9aac65c2-63f8-4383-80ed-6600852c6fb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=664dc5ff-72ac-4e4a-8581-6071db25143c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.220 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 664dc5ff-72ac-4e4a-8581-6071db25143c in datapath 31da00d3-077b-4620-a7d3-68186467ab47 bound to our chassis
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.222 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.230 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3ca55a-b732-4bb6-99c3-e37718d0d9c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.231 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap31da00d3-01 in ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.233 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap31da00d3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.234 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[944b9008-c413-4d60-b3ec-b410b83e5891]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.234 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4151eb81-03e9-4ddd-b44a-be2711c0e2d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.244 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d7d727d4-e658-4c08-855e-5df67d503082]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.265 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[675bda63-eb29-49c3-a7e2-062d8725a4da]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.285 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[66a9a486-e634-40dc-8014-bdfcca247ba3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 systemd-udevd[275262]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:34 np0005603622 NetworkManager[49080]: <info>  [1769848474.2922] manager: (tap31da00d3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.293 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe99581-bcff-4669-bf6e-4f51214a22a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:34.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.315 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f71e9182-da76-4be8-9be9-36086dc2865c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.317 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9396fbe2-9723-4f9f-8456-5348ccbf8610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 NetworkManager[49080]: <info>  [1769848474.3300] device (tap31da00d3-00): carrier: link connected
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.334 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a2e7d4-7a3d-4f51-824f-3b23fcf0ea46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.346 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2eabd301-7e22-46f3-8142-e90bb610595f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772567, 'reachable_time': 32606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275293, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.356 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[29a72999-bc2f-4a1b-941c-e68d2255160e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:4f2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 772567, 'tstamp': 772567}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275294, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.371 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a375d6a6-7eb2-405f-bebd-ef9470c9f478]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772567, 'reachable_time': 32606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275295, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.405 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed174f1-c86a-4e11-9d56-38d1de848b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.451 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a5180f4b-9360-4063-abc0-b829e0bbe8a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.452 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.452 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.453 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:34:34 np0005603622 kernel: tap31da00d3-00: entered promiscuous mode
Jan 31 03:34:34 np0005603622 NetworkManager[49080]: <info>  [1769848474.4555] manager: (tap31da00d3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.457 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:34:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:34Z|00629|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.465 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.468 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ac6cdc-053a-423e-97d9-9a20380ac15a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.469 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-31da00d3-077b-4620-a7d3-68186467ab47
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 31da00d3-077b-4620-a7d3-68186467ab47
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 03:34:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:34.470 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'env', 'PROCESS_TAG=haproxy-31da00d3-077b-4620-a7d3-68186467ab47', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/31da00d3-077b-4620-a7d3-68186467ab47.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 03:34:34 np0005603622 podman[275383]: 2026-01-31 08:34:34.879175202 +0000 UTC m=+0.059229781 container create d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.899 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 9aac65c2-63f8-4383-80ed-6600852c6fb5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.900 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848474.8992658, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.901 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Resumed (Lifecycle Event)
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.909 221324 DEBUG nova.compute.manager [None req-2baf3147-cc3d-4ccc-bf73-5e5a5b5ea7af b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:34 np0005603622 systemd[1]: Started libpod-conmon-d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51.scope.
Jan 31 03:34:34 np0005603622 podman[275383]: 2026-01-31 08:34:34.845658058 +0000 UTC m=+0.025712667 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:34:34 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:34:34 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bcaf0bef0c817b0fc1b3e2aad0ca1a474e60e2eea5e7cac253d40ad91f1f289/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:34:34 np0005603622 podman[275383]: 2026-01-31 08:34:34.961551591 +0000 UTC m=+0.141606180 container init d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:34:34 np0005603622 podman[275383]: 2026-01-31 08:34:34.965310951 +0000 UTC m=+0.145365520 container start d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.969 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:34:34 np0005603622 nova_compute[221301]: 2026-01-31 08:34:34.972 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:34:34 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275403]: [NOTICE]   (275411) : New worker (275413) forked
Jan 31 03:34:34 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275403]: [NOTICE]   (275411) : Loading success.
Jan 31 03:34:35 np0005603622 nova_compute[221301]: 2026-01-31 08:34:35.033 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 31 03:34:35 np0005603622 nova_compute[221301]: 2026-01-31 08:34:35.034 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848474.9007454, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:35 np0005603622 nova_compute[221301]: 2026-01-31 08:34:35.034 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Started (Lifecycle Event)#033[00m
Jan 31 03:34:35 np0005603622 nova_compute[221301]: 2026-01-31 08:34:35.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:35 np0005603622 nova_compute[221301]: 2026-01-31 08:34:35.133 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:35 np0005603622 nova_compute[221301]: 2026-01-31 08:34:35.138 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:34:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:35.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:36.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:37 np0005603622 nova_compute[221301]: 2026-01-31 08:34:37.508 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:37.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:38 np0005603622 nova_compute[221301]: 2026-01-31 08:34:38.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e325 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:38 np0005603622 podman[275422]: 2026-01-31 08:34:38.170031527 +0000 UTC m=+0.079541113 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 31 03:34:38 np0005603622 podman[275423]: 2026-01-31 08:34:38.170559401 +0000 UTC m=+0.081488665 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 31 03:34:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:38.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:39.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:40 np0005603622 nova_compute[221301]: 2026-01-31 08:34:40.056 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:40.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e326 e326: 3 total, 3 up, 3 in
Jan 31 03:34:40 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [P] New memtable created with log file: #51. Immutable memtables: 0.
Jan 31 03:34:41 np0005603622 nova_compute[221301]: 2026-01-31 08:34:41.296 221324 INFO nova.compute.manager [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Unrescuing#033[00m
Jan 31 03:34:41 np0005603622 nova_compute[221301]: 2026-01-31 08:34:41.298 221324 DEBUG oslo_concurrency.lockutils [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:41 np0005603622 nova_compute[221301]: 2026-01-31 08:34:41.298 221324 DEBUG oslo_concurrency.lockutils [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquired lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:41 np0005603622 nova_compute[221301]: 2026-01-31 08:34:41.299 221324 DEBUG nova.network.neutron [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:34:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:41.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:34:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:34:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.086 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.597 221324 DEBUG nova.network.neutron [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.642 221324 DEBUG oslo_concurrency.lockutils [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Releasing lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.644 221324 DEBUG nova.objects.instance [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'flavor' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.721 221324 DEBUG nova.compute.manager [req-02f2cca7-2231-4f4d-850d-2626fac19e20 req-71637134-c351-402a-9962-036ee3af3401 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.721 221324 DEBUG oslo_concurrency.lockutils [req-02f2cca7-2231-4f4d-850d-2626fac19e20 req-71637134-c351-402a-9962-036ee3af3401 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.722 221324 DEBUG oslo_concurrency.lockutils [req-02f2cca7-2231-4f4d-850d-2626fac19e20 req-71637134-c351-402a-9962-036ee3af3401 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.722 221324 DEBUG oslo_concurrency.lockutils [req-02f2cca7-2231-4f4d-850d-2626fac19e20 req-71637134-c351-402a-9962-036ee3af3401 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.722 221324 DEBUG nova.compute.manager [req-02f2cca7-2231-4f4d-850d-2626fac19e20 req-71637134-c351-402a-9962-036ee3af3401 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.722 221324 WARNING nova.compute.manager [req-02f2cca7-2231-4f4d-850d-2626fac19e20 req-71637134-c351-402a-9962-036ee3af3401 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:34:43 np0005603622 kernel: tap664dc5ff-72 (unregistering): left promiscuous mode
Jan 31 03:34:43 np0005603622 NetworkManager[49080]: <info>  [1769848483.7366] device (tap664dc5ff-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.746 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:43Z|00630|binding|INFO|Releasing lport 664dc5ff-72ac-4e4a-8581-6071db25143c from this chassis (sb_readonly=0)
Jan 31 03:34:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:43Z|00631|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c down in Southbound
Jan 31 03:34:43 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:43Z|00632|binding|INFO|Removing iface tap664dc5ff-72 ovn-installed in OVS
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.749 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.753 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:43.782 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:fe:3d 10.100.0.13'], port_security=['fa:16:3e:c8:fe:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9aac65c2-63f8-4383-80ed-6600852c6fb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=664dc5ff-72ac-4e4a-8581-6071db25143c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:43.784 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 664dc5ff-72ac-4e4a-8581-6071db25143c in datapath 31da00d3-077b-4620-a7d3-68186467ab47 unbound from our chassis#033[00m
Jan 31 03:34:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:43.786 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31da00d3-077b-4620-a7d3-68186467ab47, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:34:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:43.787 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ec233fa3-8e1e-4f85-b682-2c67f02634d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:43 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:43.788 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 namespace which is not needed anymore#033[00m
Jan 31 03:34:43 np0005603622 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 31 03:34:43 np0005603622 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008b.scope: Consumed 9.844s CPU time.
Jan 31 03:34:43 np0005603622 systemd-machined[190406]: Machine qemu-64-instance-0000008b terminated.
Jan 31 03:34:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:43.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.912 221324 INFO nova.virt.libvirt.driver [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance destroyed successfully.#033[00m
Jan 31 03:34:43 np0005603622 nova_compute[221301]: 2026-01-31 08:34:43.913 221324 DEBUG nova.objects.instance [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:44 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275403]: [NOTICE]   (275411) : haproxy version is 2.8.14-c23fe91
Jan 31 03:34:44 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275403]: [NOTICE]   (275411) : path to executable is /usr/sbin/haproxy
Jan 31 03:34:44 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275403]: [WARNING]  (275411) : Exiting Master process...
Jan 31 03:34:44 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275403]: [ALERT]    (275411) : Current worker (275413) exited with code 143 (Terminated)
Jan 31 03:34:44 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275403]: [WARNING]  (275411) : All workers exited. Exiting... (0)
Jan 31 03:34:44 np0005603622 systemd[1]: libpod-d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51.scope: Deactivated successfully.
Jan 31 03:34:44 np0005603622 podman[275490]: 2026-01-31 08:34:44.055319732 +0000 UTC m=+0.175591117 container died d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 03:34:44 np0005603622 kernel: tap664dc5ff-72: entered promiscuous mode
Jan 31 03:34:44 np0005603622 systemd-udevd[275469]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:34:44 np0005603622 NetworkManager[49080]: <info>  [1769848484.1007] manager: (tap664dc5ff-72): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Jan 31 03:34:44 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:44Z|00633|binding|INFO|Claiming lport 664dc5ff-72ac-4e4a-8581-6071db25143c for this chassis.
Jan 31 03:34:44 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:44Z|00634|binding|INFO|664dc5ff-72ac-4e4a-8581-6071db25143c: Claiming fa:16:3e:c8:fe:3d 10.100.0.13
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.099 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:44 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:44Z|00635|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c ovn-installed in OVS
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.106 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.110 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:44 np0005603622 NetworkManager[49080]: <info>  [1769848484.1111] device (tap664dc5ff-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:34:44 np0005603622 NetworkManager[49080]: <info>  [1769848484.1118] device (tap664dc5ff-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:34:44 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:44Z|00636|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c up in Southbound
Jan 31 03:34:44 np0005603622 systemd-machined[190406]: New machine qemu-65-instance-0000008b.
Jan 31 03:34:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:44.120 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:fe:3d 10.100.0.13'], port_security=['fa:16:3e:c8:fe:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9aac65c2-63f8-4383-80ed-6600852c6fb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=664dc5ff-72ac-4e4a-8581-6071db25143c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:34:44 np0005603622 systemd[1]: Started Virtual Machine qemu-65-instance-0000008b.
Jan 31 03:34:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:44 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51-userdata-shm.mount: Deactivated successfully.
Jan 31 03:34:44 np0005603622 systemd[1]: var-lib-containers-storage-overlay-3bcaf0bef0c817b0fc1b3e2aad0ca1a474e60e2eea5e7cac253d40ad91f1f289-merged.mount: Deactivated successfully.
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.576 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 9aac65c2-63f8-4383-80ed-6600852c6fb5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.577 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848484.576225, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.577 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.647 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.652 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.693 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.694 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848484.5797036, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.694 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Started (Lifecycle Event)#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.763 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.770 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:34:44 np0005603622 nova_compute[221301]: 2026-01-31 08:34:44.818 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:34:44 np0005603622 podman[275490]: 2026-01-31 08:34:44.840755981 +0000 UTC m=+0.961027406 container cleanup d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:34:44 np0005603622 systemd[1]: libpod-conmon-d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51.scope: Deactivated successfully.
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:45 np0005603622 podman[275612]: 2026-01-31 08:34:45.689024037 +0000 UTC m=+0.820920407 container remove d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.695 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6ace0456-5212-4673-b966-206956820ed5]: (4, ('Sat Jan 31 08:34:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 (d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51)\nd7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51\nSat Jan 31 08:34:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 (d7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51)\nd7e1a9d7c2288345ca27a3a716c8b1d223b976dea0e49d603ff44c5d3ec35e51\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.697 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a534a903-1448-4b76-82a7-e19cb22afcad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.699 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.701 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:45 np0005603622 kernel: tap31da00d3-00: left promiscuous mode
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.707 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.711 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6c6bcd76-384b-4f8c-a23e-4c32ac04aa40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.727 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[397fac71-4072-4d64-a0ca-c596fbd1088e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.729 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[050ed53b-8654-4210-a22c-ee961024f6bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.746 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd88e64-60b1-400d-9fcc-7ee9b6b70eea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772562, 'reachable_time': 41645, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275628, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.749 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.749 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[37575466-4dfc-4c61-bd7b-6f0dfa520827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.750 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 664dc5ff-72ac-4e4a-8581-6071db25143c in datapath 31da00d3-077b-4620-a7d3-68186467ab47 unbound from our chassis#033[00m
Jan 31 03:34:45 np0005603622 systemd[1]: run-netns-ovnmeta\x2d31da00d3\x2d077b\x2d4620\x2da7d3\x2d68186467ab47.mount: Deactivated successfully.
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.753 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.761 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2acd27dd-742b-468f-9211-2f3658886cc8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.764 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap31da00d3-01 in ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.766 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap31da00d3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.766 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bcfccb30-31b5-4d79-a0d1-b464b965c174]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.768 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f028f8a3-b780-419a-b3f5-7d5061bff3d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.776 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[60674e15-1770-4d97-8221-012e5cd6f39e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.789 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[71be9bf3-2efb-4966-8534-45dcc806d892]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.813 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6b626af1-46ae-46ab-b3ae-a57544807e21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 NetworkManager[49080]: <info>  [1769848485.8207] manager: (tap31da00d3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/293)
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.821 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[40c1223b-543f-4743-b8de-f73bfc636581]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.844 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[39cd4546-e7bd-4ee8-84eb-6fcc3cca61f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.846 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2749f22e-b00a-456d-a806-0f236a24e975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 NetworkManager[49080]: <info>  [1769848485.8650] device (tap31da00d3-00): carrier: link connected
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.870 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[06f708ff-885a-4566-8130-585bbf32cb2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.876 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.877 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.877 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.877 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.878 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.878 221324 WARNING nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.878 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.879 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.879 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.879 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.880 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.880 221324 WARNING nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.880 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.881 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.881 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.881 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.881 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.882 221324 WARNING nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.882 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.882 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.883 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.883 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.883 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.883 221324 WARNING nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.884 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.884 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.884 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.884 221324 DEBUG oslo_concurrency.lockutils [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.885 221324 DEBUG nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:34:45 np0005603622 nova_compute[221301]: 2026-01-31 08:34:45.885 221324 WARNING nova.compute.manager [req-2389fbb2-9949-4f3f-8b69-70bcb3d8bdbd req-f15f8982-428c-40fe-9641-781ad81d7c88 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.884 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1cbaf5-0536-49ff-929b-e86b89a025d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 38788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275652, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.896 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4f034753-086b-45a3-94a9-72a1f11b362d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:4f2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773720, 'tstamp': 773720}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275653, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:45.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.915 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1d356f4c-6c64-4d2b-916c-958089ac05ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 38788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275654, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:45.945 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[70e54219-164f-4318-acce-e965a922a721]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.001 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[56d2e99a-b418-4569-8ff2-f0fc5e1b710b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.002 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.003 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.003 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:46 np0005603622 nova_compute[221301]: 2026-01-31 08:34:46.005 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:46 np0005603622 kernel: tap31da00d3-00: entered promiscuous mode
Jan 31 03:34:46 np0005603622 NetworkManager[49080]: <info>  [1769848486.0065] manager: (tap31da00d3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 31 03:34:46 np0005603622 nova_compute[221301]: 2026-01-31 08:34:46.007 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.010 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:34:46 np0005603622 nova_compute[221301]: 2026-01-31 08:34:46.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:46Z|00637|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:34:46 np0005603622 nova_compute[221301]: 2026-01-31 08:34:46.013 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.014 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.015 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1f64a473-1791-4b45-8779-b778ab98641f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.015 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-31da00d3-077b-4620-a7d3-68186467ab47
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/31da00d3-077b-4620-a7d3-68186467ab47.pid.haproxy
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 31da00d3-077b-4620-a7d3-68186467ab47
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:34:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:34:46.016 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'env', 'PROCESS_TAG=haproxy-31da00d3-077b-4620-a7d3-68186467ab47', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/31da00d3-077b-4620-a7d3-68186467ab47.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:34:46 np0005603622 nova_compute[221301]: 2026-01-31 08:34:46.020 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:46.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:46 np0005603622 podman[275686]: 2026-01-31 08:34:46.339624797 +0000 UTC m=+0.022398068 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:34:46 np0005603622 podman[275686]: 2026-01-31 08:34:46.668208495 +0000 UTC m=+0.350981766 container create f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:34:46 np0005603622 systemd[1]: Started libpod-conmon-f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f.scope.
Jan 31 03:34:46 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:34:46 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cf9f253d37229f93dbb4ca4c6b22fee2e4f4c926b350d91366ed4f0973e39f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:34:46 np0005603622 podman[275686]: 2026-01-31 08:34:46.940991935 +0000 UTC m=+0.623765266 container init f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:34:46 np0005603622 podman[275686]: 2026-01-31 08:34:46.94868884 +0000 UTC m=+0.631462131 container start f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:34:46 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [NOTICE]   (275705) : New worker (275707) forked
Jan 31 03:34:46 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [NOTICE]   (275705) : Loading success.
Jan 31 03:34:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:47.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:48 np0005603622 nova_compute[221301]: 2026-01-31 08:34:48.131 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:48.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:48 np0005603622 nova_compute[221301]: 2026-01-31 08:34:48.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:48 np0005603622 nova_compute[221301]: 2026-01-31 08:34:48.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:48 np0005603622 nova_compute[221301]: 2026-01-31 08:34:48.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:48 np0005603622 nova_compute[221301]: 2026-01-31 08:34:48.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:34:49 np0005603622 nova_compute[221301]: 2026-01-31 08:34:49.034 221324 DEBUG nova.compute.manager [None req-6dbf0bb7-f012-41df-87c2-cca18108966f b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:34:49 np0005603622 nova_compute[221301]: 2026-01-31 08:34:49.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:49.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:50 np0005603622 nova_compute[221301]: 2026-01-31 08:34:50.062 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:51 np0005603622 nova_compute[221301]: 2026-01-31 08:34:51.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:51 np0005603622 nova_compute[221301]: 2026-01-31 08:34:51.532 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:51 np0005603622 nova_compute[221301]: 2026-01-31 08:34:51.532 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:51 np0005603622 nova_compute[221301]: 2026-01-31 08:34:51.534 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:51 np0005603622 nova_compute[221301]: 2026-01-31 08:34:51.534 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:34:51 np0005603622 nova_compute[221301]: 2026-01-31 08:34:51.535 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:51.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:52.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:34:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/941961036' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:34:52 np0005603622 nova_compute[221301]: 2026-01-31 08:34:52.470 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.935s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.130 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.130 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.134 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.137 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.137 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.143 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.143 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000007e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.328 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.330 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3848MB free_disk=20.78925323486328GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.331 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.331 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.621 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 884be038-2a08-4b69-a345-5b6ac2661dfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.623 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.623 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.624 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.624 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:34:53 np0005603622 nova_compute[221301]: 2026-01-31 08:34:53.694 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:34:54 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1976335928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:34:54 np0005603622 nova_compute[221301]: 2026-01-31 08:34:54.105 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:54 np0005603622 nova_compute[221301]: 2026-01-31 08:34:54.110 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:34:54 np0005603622 nova_compute[221301]: 2026-01-31 08:34:54.236 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:34:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:54.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:54 np0005603622 nova_compute[221301]: 2026-01-31 08:34:54.451 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:34:54 np0005603622 nova_compute[221301]: 2026-01-31 08:34:54.452 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:55 np0005603622 nova_compute[221301]: 2026-01-31 08:34:55.065 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:55.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:56.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:34:57 np0005603622 nova_compute[221301]: 2026-01-31 08:34:57.650 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:57 np0005603622 nova_compute[221301]: 2026-01-31 08:34:57.652 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:57.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:57 np0005603622 nova_compute[221301]: 2026-01-31 08:34:57.928 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:34:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.138 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.245 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.246 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.253 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.253 221324 INFO nova.compute.claims [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:34:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:34:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:34:58.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.453 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.454 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.454 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.551 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.647 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:34:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:58Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:fe:3d 10.100.0.13
Jan 31 03:34:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:34:58Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:fe:3d 10.100.0.13
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.792 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.793 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.793 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:34:58 np0005603622 nova_compute[221301]: 2026-01-31 08:34:58.794 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 884be038-2a08-4b69-a345-5b6ac2661dfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:34:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:34:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1051462439' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:34:59 np0005603622 nova_compute[221301]: 2026-01-31 08:34:59.120 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:34:59 np0005603622 nova_compute[221301]: 2026-01-31 08:34:59.127 221324 DEBUG nova.compute.provider_tree [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:34:59 np0005603622 nova_compute[221301]: 2026-01-31 08:34:59.240 221324 DEBUG nova.scheduler.client.report [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:34:59 np0005603622 nova_compute[221301]: 2026-01-31 08:34:59.791 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:34:59 np0005603622 nova_compute[221301]: 2026-01-31 08:34:59.792 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:34:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:34:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:34:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:34:59.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:00 np0005603622 nova_compute[221301]: 2026-01-31 08:35:00.103 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:00 np0005603622 nova_compute[221301]: 2026-01-31 08:35:00.128 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:35:00 np0005603622 nova_compute[221301]: 2026-01-31 08:35:00.129 221324 DEBUG nova.network.neutron [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:35:00 np0005603622 nova_compute[221301]: 2026-01-31 08:35:00.213 221324 INFO nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:35:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:00.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:00 np0005603622 nova_compute[221301]: 2026-01-31 08:35:00.397 221324 DEBUG nova.policy [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6733330b634472ca8c21316f1ee5057', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e29363ca464487b931af54fe14166b1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:35:00 np0005603622 nova_compute[221301]: 2026-01-31 08:35:00.458 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:35:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:00.458 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:00.462 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:35:00 np0005603622 nova_compute[221301]: 2026-01-31 08:35:00.463 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.052 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.055 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.056 221324 INFO nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Creating image(s)#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.096 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.146 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.183 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.187 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.245 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.246 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.246 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.247 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.276 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.280 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.355 221324 DEBUG nova.network.neutron [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Successfully created port: 2ed53bda-fdee-4925-8492-3be158eb8d21 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.531 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updating instance_info_cache with network_info: [{"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.574 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-884be038-2a08-4b69-a345-5b6ac2661dfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.575 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.576 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.576 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:01 np0005603622 nova_compute[221301]: 2026-01-31 08:35:01.577 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:01.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:02.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:02 np0005603622 nova_compute[221301]: 2026-01-31 08:35:02.876 221324 DEBUG nova.network.neutron [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Successfully updated port: 2ed53bda-fdee-4925-8492-3be158eb8d21 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:35:02 np0005603622 nova_compute[221301]: 2026-01-31 08:35:02.980 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:02 np0005603622 nova_compute[221301]: 2026-01-31 08:35:02.981 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquired lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:02 np0005603622 nova_compute[221301]: 2026-01-31 08:35:02.981 221324 DEBUG nova.network.neutron [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:35:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:03 np0005603622 nova_compute[221301]: 2026-01-31 08:35:03.141 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:03 np0005603622 nova_compute[221301]: 2026-01-31 08:35:03.312 221324 DEBUG nova.network.neutron [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:35:03 np0005603622 nova_compute[221301]: 2026-01-31 08:35:03.354 221324 DEBUG nova.compute.manager [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-changed-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:03 np0005603622 nova_compute[221301]: 2026-01-31 08:35:03.354 221324 DEBUG nova.compute.manager [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Refreshing instance network info cache due to event network-changed-2ed53bda-fdee-4925-8492-3be158eb8d21. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:35:03 np0005603622 nova_compute[221301]: 2026-01-31 08:35:03.355 221324 DEBUG oslo_concurrency.lockutils [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:03 np0005603622 nova_compute[221301]: 2026-01-31 08:35:03.732 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:03 np0005603622 nova_compute[221301]: 2026-01-31 08:35:03.805 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] resizing rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:35:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:03.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:04.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.452 221324 DEBUG nova.objects.instance [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'migration_context' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.508 221324 DEBUG nova.network.neutron [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.592 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.592 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Ensure instance console log exists: /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.593 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.593 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.593 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.778 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Releasing lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.779 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance network_info: |[{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.780 221324 DEBUG oslo_concurrency.lockutils [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.780 221324 DEBUG nova.network.neutron [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Refreshing network info cache for port 2ed53bda-fdee-4925-8492-3be158eb8d21 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.784 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Start _get_guest_xml network_info=[{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.788 221324 WARNING nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.793 221324 DEBUG nova.virt.libvirt.host [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.794 221324 DEBUG nova.virt.libvirt.host [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.797 221324 DEBUG nova.virt.libvirt.host [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.797 221324 DEBUG nova.virt.libvirt.host [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.799 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.799 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.800 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.800 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.801 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.801 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.801 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.802 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.802 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.802 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.803 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.803 221324 DEBUG nova.virt.hardware [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:35:04 np0005603622 nova_compute[221301]: 2026-01-31 08:35:04.807 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.105 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/311454942' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.486 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.680s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.536 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.546 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:05.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/420211199' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.988 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.991 221324 DEBUG nova.virt.libvirt.vif [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:34:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1321512917',display_name='tempest-ServerStableDeviceRescueTest-server-1321512917',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1321512917',id=142,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-6ftdla88',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_user_name=
'tempest-ServerStableDeviceRescueTest-319343227-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:35:00Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9707f58b-2c76-40df-b8a8-5df66bab50bb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.992 221324 DEBUG nova.network.os_vif_util [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.994 221324 DEBUG nova.network.os_vif_util [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:05 np0005603622 nova_compute[221301]: 2026-01-31 08:35:05.996 221324 DEBUG nova.objects.instance [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.065 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <uuid>9707f58b-2c76-40df-b8a8-5df66bab50bb</uuid>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <name>instance-0000008e</name>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1321512917</nova:name>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:35:04</nova:creationTime>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:user uuid="b6733330b634472ca8c21316f1ee5057">tempest-ServerStableDeviceRescueTest-319343227-project-member</nova:user>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:project uuid="1e29363ca464487b931af54fe14166b1">tempest-ServerStableDeviceRescueTest-319343227</nova:project>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <nova:port uuid="2ed53bda-fdee-4925-8492-3be158eb8d21">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <entry name="serial">9707f58b-2c76-40df-b8a8-5df66bab50bb</entry>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <entry name="uuid">9707f58b-2c76-40df-b8a8-5df66bab50bb</entry>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9707f58b-2c76-40df-b8a8-5df66bab50bb_disk">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:e3:11:72"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <target dev="tap2ed53bda-fd"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/console.log" append="off"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:35:06 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:35:06 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:35:06 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:35:06 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.066 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Preparing to wait for external event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.066 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.066 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.067 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.068 221324 DEBUG nova.virt.libvirt.vif [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:34:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1321512917',display_name='tempest-ServerStableDeviceRescueTest-server-1321512917',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1321512917',id=142,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-6ftdla88',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_user_name='tempest-ServerStableDeviceRescueTest-319343227-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:35:00Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9707f58b-2c76-40df-b8a8-5df66bab50bb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.068 221324 DEBUG nova.network.os_vif_util [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.069 221324 DEBUG nova.network.os_vif_util [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.070 221324 DEBUG os_vif [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.071 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.072 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.076 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.076 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ed53bda-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.077 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2ed53bda-fd, col_values=(('external_ids', {'iface-id': '2ed53bda-fdee-4925-8492-3be158eb8d21', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:11:72', 'vm-uuid': '9707f58b-2c76-40df-b8a8-5df66bab50bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.079 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:06 np0005603622 NetworkManager[49080]: <info>  [1769848506.0810] manager: (tap2ed53bda-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.082 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.089 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.090 221324 INFO os_vif [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd')#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.206 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.206 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.207 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No VIF found with MAC fa:16:3e:e3:11:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.208 221324 INFO nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Using config drive#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.245 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:06.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:06.465 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.756 221324 INFO nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Creating config drive at /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.764 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpunpf7p3z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.896 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpunpf7p3z" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.939 221324 DEBUG nova.storage.rbd_utils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:06 np0005603622 nova_compute[221301]: 2026-01-31 08:35:06.944 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.247 221324 DEBUG oslo_concurrency.processutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.302s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.248 221324 INFO nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Deleting local config drive /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config because it was imported into RBD.#033[00m
Jan 31 03:35:07 np0005603622 kernel: tap2ed53bda-fd: entered promiscuous mode
Jan 31 03:35:07 np0005603622 NetworkManager[49080]: <info>  [1769848507.2999] manager: (tap2ed53bda-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Jan 31 03:35:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:07Z|00638|binding|INFO|Claiming lport 2ed53bda-fdee-4925-8492-3be158eb8d21 for this chassis.
Jan 31 03:35:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:07Z|00639|binding|INFO|2ed53bda-fdee-4925-8492-3be158eb8d21: Claiming fa:16:3e:e3:11:72 10.100.0.9
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.301 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:07Z|00640|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 ovn-installed in OVS
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.316 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:07 np0005603622 systemd-machined[190406]: New machine qemu-66-instance-0000008e.
Jan 31 03:35:07 np0005603622 systemd[1]: Started Virtual Machine qemu-66-instance-0000008e.
Jan 31 03:35:07 np0005603622 systemd-udevd[276086]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:35:07 np0005603622 NetworkManager[49080]: <info>  [1769848507.3715] device (tap2ed53bda-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:35:07 np0005603622 NetworkManager[49080]: <info>  [1769848507.3723] device (tap2ed53bda-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:35:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:07Z|00641|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 up in Southbound
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.385 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:11:72 10.100.0.9'], port_security=['fa:16:3e:e3:11:72 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9707f58b-2c76-40df-b8a8-5df66bab50bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2ed53bda-fdee-4925-8492-3be158eb8d21) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.387 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2ed53bda-fdee-4925-8492-3be158eb8d21 in datapath 31da00d3-077b-4620-a7d3-68186467ab47 bound to our chassis#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.389 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.394 221324 DEBUG nova.network.neutron [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updated VIF entry in instance network info cache for port 2ed53bda-fdee-4925-8492-3be158eb8d21. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.394 221324 DEBUG nova.network.neutron [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.404 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fd90c737-b62a-4f34-8ab3-599f7af8e90a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.434 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa501c4-003a-49d8-bd1a-61ad83cb6bc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.437 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2142fe1a-6045-49f7-a89b-e8683bda362f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.463 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc54cc8-7444-4663-8525-c34ad16727c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.479 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5d93ca-ffde-4f7f-9f4c-0538480cd55e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 38788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276100, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.496 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3b32cf-d543-4880-ac06-8f79db112990]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773731, 'tstamp': 773731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276101, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773733, 'tstamp': 773733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276101, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.499 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.507 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.508 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.508 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:07.509 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.518 221324 DEBUG oslo_concurrency.lockutils [req-cafd6bdd-3ac2-49b4-83e5-cabc91d24e2b req-f38a8431-1367-440e-825c-7adabc52fe6b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.705 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848507.7052264, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:07 np0005603622 nova_compute[221301]: 2026-01-31 08:35:07.706 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Started (Lifecycle Event)#033[00m
Jan 31 03:35:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e327 e327: 3 total, 3 up, 3 in
Jan 31 03:35:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:07.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.144 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:08.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.515 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.885 221324 DEBUG nova.compute.manager [req-0b7cfadb-0e39-41fd-a279-cdb8fce67cae req-9d098e16-21ec-45f3-ae0e-a0c3cae0cded fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.886 221324 DEBUG oslo_concurrency.lockutils [req-0b7cfadb-0e39-41fd-a279-cdb8fce67cae req-9d098e16-21ec-45f3-ae0e-a0c3cae0cded fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.886 221324 DEBUG oslo_concurrency.lockutils [req-0b7cfadb-0e39-41fd-a279-cdb8fce67cae req-9d098e16-21ec-45f3-ae0e-a0c3cae0cded fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.887 221324 DEBUG oslo_concurrency.lockutils [req-0b7cfadb-0e39-41fd-a279-cdb8fce67cae req-9d098e16-21ec-45f3-ae0e-a0c3cae0cded fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.887 221324 DEBUG nova.compute.manager [req-0b7cfadb-0e39-41fd-a279-cdb8fce67cae req-9d098e16-21ec-45f3-ae0e-a0c3cae0cded fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Processing event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.888 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.894 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.897 221324 INFO nova.virt.libvirt.driver [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance spawned successfully.#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.897 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.914 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.917 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.966 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.966 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.967 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.968 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.968 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.969 221324 DEBUG nova.virt.libvirt.driver [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.980 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.981 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848507.7054925, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:08 np0005603622 nova_compute[221301]: 2026-01-31 08:35:08.981 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.128 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.132 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848508.8922327, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.132 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:35:09 np0005603622 podman[276145]: 2026-01-31 08:35:09.150320606 +0000 UTC m=+0.066588328 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 03:35:09 np0005603622 podman[276144]: 2026-01-31 08:35:09.204863082 +0000 UTC m=+0.121251147 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.520 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.523 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.576 221324 INFO nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Took 8.52 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.576 221324 DEBUG nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.830 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:35:09 np0005603622 nova_compute[221301]: 2026-01-31 08:35:09.935 221324 INFO nova.compute.manager [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Took 11.71 seconds to build instance.#033[00m
Jan 31 03:35:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:09.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:10 np0005603622 nova_compute[221301]: 2026-01-31 08:35:10.233 221324 DEBUG oslo_concurrency.lockutils [None req-bbbe8234-944a-47fd-b476-d2f6c38b4283 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:10.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:11 np0005603622 nova_compute[221301]: 2026-01-31 08:35:11.080 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:11 np0005603622 nova_compute[221301]: 2026-01-31 08:35:11.489 221324 DEBUG nova.compute.manager [req-82171f81-5623-4943-96b0-1ab19ff79bb5 req-22cca0e4-62aa-4cbb-b0d7-2b355e7757b5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:11 np0005603622 nova_compute[221301]: 2026-01-31 08:35:11.489 221324 DEBUG oslo_concurrency.lockutils [req-82171f81-5623-4943-96b0-1ab19ff79bb5 req-22cca0e4-62aa-4cbb-b0d7-2b355e7757b5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:11 np0005603622 nova_compute[221301]: 2026-01-31 08:35:11.490 221324 DEBUG oslo_concurrency.lockutils [req-82171f81-5623-4943-96b0-1ab19ff79bb5 req-22cca0e4-62aa-4cbb-b0d7-2b355e7757b5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:11 np0005603622 nova_compute[221301]: 2026-01-31 08:35:11.490 221324 DEBUG oslo_concurrency.lockutils [req-82171f81-5623-4943-96b0-1ab19ff79bb5 req-22cca0e4-62aa-4cbb-b0d7-2b355e7757b5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:11 np0005603622 nova_compute[221301]: 2026-01-31 08:35:11.490 221324 DEBUG nova.compute.manager [req-82171f81-5623-4943-96b0-1ab19ff79bb5 req-22cca0e4-62aa-4cbb-b0d7-2b355e7757b5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:11 np0005603622 nova_compute[221301]: 2026-01-31 08:35:11.491 221324 WARNING nova.compute.manager [req-82171f81-5623-4943-96b0-1ab19ff79bb5 req-22cca0e4-62aa-4cbb-b0d7-2b355e7757b5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:35:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:11.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:12.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:12 np0005603622 nova_compute[221301]: 2026-01-31 08:35:12.429 221324 DEBUG nova.compute.manager [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:12 np0005603622 nova_compute[221301]: 2026-01-31 08:35:12.501 221324 INFO nova.compute.manager [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] instance snapshotting#033[00m
Jan 31 03:35:12 np0005603622 nova_compute[221301]: 2026-01-31 08:35:12.770 221324 INFO nova.virt.libvirt.driver [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Beginning live snapshot process#033[00m
Jan 31 03:35:12 np0005603622 nova_compute[221301]: 2026-01-31 08:35:12.996 221324 DEBUG nova.virt.libvirt.imagebackend [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 03:35:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.224 221324 DEBUG nova.storage.rbd_utils [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] creating snapshot(1894e324a90344328fd2cff5e57508ed) on rbd image(9707f58b-2c76-40df-b8a8-5df66bab50bb_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.751 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.752 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.752 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.753 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.753 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.754 221324 INFO nova.compute.manager [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Terminating instance#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.755 221324 DEBUG nova.compute.manager [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:35:13 np0005603622 kernel: tapebff0bdd-64 (unregistering): left promiscuous mode
Jan 31 03:35:13 np0005603622 NetworkManager[49080]: <info>  [1769848513.8331] device (tapebff0bdd-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:35:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:13Z|00642|binding|INFO|Releasing lport ebff0bdd-64a8-478c-a24f-94fc3ea465c4 from this chassis (sb_readonly=0)
Jan 31 03:35:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:13Z|00643|binding|INFO|Setting lport ebff0bdd-64a8-478c-a24f-94fc3ea465c4 down in Southbound
Jan 31 03:35:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e328 e328: 3 total, 3 up, 3 in
Jan 31 03:35:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:13Z|00644|binding|INFO|Removing iface tapebff0bdd-64 ovn-installed in OVS
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.846 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.861 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:13.867 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:63:61 10.100.0.9'], port_security=['fa:16:3e:d7:63:61 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6071a46-64a6-45aa-97c6-06e6c564195b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40db421b27d84f809f8074c58151327f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '986b09c9-4243-429e-9b6e-93ffcacf8cb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=111856e4-2ce2-4b64-a82d-6a5bd7b8a457, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=ebff0bdd-64a8-478c-a24f-94fc3ea465c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:13.869 139793 INFO neutron.agent.ovn.metadata.agent [-] Port ebff0bdd-64a8-478c-a24f-94fc3ea465c4 in datapath f6071a46-64a6-45aa-97c6-06e6c564195b unbound from our chassis#033[00m
Jan 31 03:35:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:13.871 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6071a46-64a6-45aa-97c6-06e6c564195b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:35:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:13.872 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[35427d3a-f2f7-4123-b811-64c626cdb248]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:13.873 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b namespace which is not needed anymore#033[00m
Jan 31 03:35:13 np0005603622 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000081.scope: Deactivated successfully.
Jan 31 03:35:13 np0005603622 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000081.scope: Consumed 24.306s CPU time.
Jan 31 03:35:13 np0005603622 systemd-machined[190406]: Machine qemu-61-instance-00000081 terminated.
Jan 31 03:35:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:13.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.978 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.983 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.990 221324 INFO nova.virt.libvirt.driver [-] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Instance destroyed successfully.#033[00m
Jan 31 03:35:13 np0005603622 nova_compute[221301]: 2026-01-31 08:35:13.991 221324 DEBUG nova.objects.instance [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lazy-loading 'resources' on Instance uuid e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.027 221324 DEBUG nova.virt.libvirt.vif [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-170523285',display_name='tempest-₡-170523285',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--170523285',id=129,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:30:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='40db421b27d84f809f8074c58151327f',ramdisk_id='',reservation_id='r-l8obdkvp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1064072764',owner_user_name='tempest-ServersTestJSON-1064072764-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:30:53Z,user_data=None,user_id='fb3f20f0143d465ebfe98f6a13200890',uuid=e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.028 221324 DEBUG nova.network.os_vif_util [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converting VIF {"id": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "address": "fa:16:3e:d7:63:61", "network": {"id": "f6071a46-64a6-45aa-97c6-06e6c564195b", "bridge": "br-int", "label": "tempest-ServersTestJSON-1491278061-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "40db421b27d84f809f8074c58151327f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebff0bdd-64", "ovs_interfaceid": "ebff0bdd-64a8-478c-a24f-94fc3ea465c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.029 221324 DEBUG nova.network.os_vif_util [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:61,bridge_name='br-int',has_traffic_filtering=True,id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebff0bdd-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.029 221324 DEBUG os_vif [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:61,bridge_name='br-int',has_traffic_filtering=True,id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebff0bdd-64') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.030 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.031 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebff0bdd-64, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.032 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.033 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.035 221324 INFO os_vif [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:61,bridge_name='br-int',has_traffic_filtering=True,id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4,network=Network(f6071a46-64a6-45aa-97c6-06e6c564195b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebff0bdd-64')#033[00m
Jan 31 03:35:14 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [NOTICE]   (272653) : haproxy version is 2.8.14-c23fe91
Jan 31 03:35:14 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [NOTICE]   (272653) : path to executable is /usr/sbin/haproxy
Jan 31 03:35:14 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [WARNING]  (272653) : Exiting Master process...
Jan 31 03:35:14 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [WARNING]  (272653) : Exiting Master process...
Jan 31 03:35:14 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [ALERT]    (272653) : Current worker (272655) exited with code 143 (Terminated)
Jan 31 03:35:14 np0005603622 neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b[272649]: [WARNING]  (272653) : All workers exited. Exiting... (0)
Jan 31 03:35:14 np0005603622 systemd[1]: libpod-74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45.scope: Deactivated successfully.
Jan 31 03:35:14 np0005603622 podman[276262]: 2026-01-31 08:35:14.192638778 +0000 UTC m=+0.232378382 container died 74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:35:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:14.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:14 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45-userdata-shm.mount: Deactivated successfully.
Jan 31 03:35:14 np0005603622 systemd[1]: var-lib-containers-storage-overlay-37a20b3e7681bb4f2fe958aa100d348a24800ab6d3a409bf9be3de90e62fae8a-merged.mount: Deactivated successfully.
Jan 31 03:35:14 np0005603622 podman[276262]: 2026-01-31 08:35:14.488697188 +0000 UTC m=+0.528436752 container cleanup 74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:35:14 np0005603622 systemd[1]: libpod-conmon-74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45.scope: Deactivated successfully.
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.534 221324 DEBUG nova.compute.manager [req-f7ea8d8f-c0a3-45ad-8fe5-c69ab88321cd req-63d5d038-0b05-48a5-b517-00e3489a2095 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received event network-vif-unplugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.535 221324 DEBUG oslo_concurrency.lockutils [req-f7ea8d8f-c0a3-45ad-8fe5-c69ab88321cd req-63d5d038-0b05-48a5-b517-00e3489a2095 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.535 221324 DEBUG oslo_concurrency.lockutils [req-f7ea8d8f-c0a3-45ad-8fe5-c69ab88321cd req-63d5d038-0b05-48a5-b517-00e3489a2095 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.536 221324 DEBUG oslo_concurrency.lockutils [req-f7ea8d8f-c0a3-45ad-8fe5-c69ab88321cd req-63d5d038-0b05-48a5-b517-00e3489a2095 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.536 221324 DEBUG nova.compute.manager [req-f7ea8d8f-c0a3-45ad-8fe5-c69ab88321cd req-63d5d038-0b05-48a5-b517-00e3489a2095 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] No waiting events found dispatching network-vif-unplugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.536 221324 DEBUG nova.compute.manager [req-f7ea8d8f-c0a3-45ad-8fe5-c69ab88321cd req-63d5d038-0b05-48a5-b517-00e3489a2095 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received event network-vif-unplugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:35:14 np0005603622 nova_compute[221301]: 2026-01-31 08:35:14.910 221324 DEBUG nova.storage.rbd_utils [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] cloning vms/9707f58b-2c76-40df-b8a8-5df66bab50bb_disk@1894e324a90344328fd2cff5e57508ed to images/357acf21-e111-4e02-8a13-c11baeb4f6ec clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:35:14 np0005603622 podman[276321]: 2026-01-31 08:35:14.940071682 +0000 UTC m=+0.423536453 container remove 74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:35:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:14.947 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c617d21d-0010-4287-b6a8-9d25ce4ee7b4]: (4, ('Sat Jan 31 08:35:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b (74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45)\n74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45\nSat Jan 31 08:35:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b (74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45)\n74e207a9200d667b135f7c76e77793cad205e5d47c5a3379c8a956a757e8fc45\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:14.949 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5e623cfa-ff5f-4d1c-a615-e1185ff89b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:14.950 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6071a46-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:14 np0005603622 kernel: tapf6071a46-60: left promiscuous mode
Jan 31 03:35:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:14.966 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[864dd2ba-fb86-48e3-93dd-f845866576ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:14.982 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[148f4415-85f8-4bc2-9745-0ff15230e18a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:14.984 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[56581ec5-2ef9-4374-9076-7e629cc5979f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:15.000 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[81045389-3984-460c-a9c9-09afa301026d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 750326, 'reachable_time': 29021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276359, 'error': None, 'target': 'ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:15 np0005603622 systemd[1]: run-netns-ovnmeta\x2df6071a46\x2d64a6\x2d45aa\x2d97c6\x2d06e6c564195b.mount: Deactivated successfully.
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.004 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:15.005 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6071a46-64a6-45aa-97c6-06e6c564195b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:35:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:15.005 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[908e1ccd-6277-449f-8bdf-af18e897215b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.152 221324 DEBUG nova.storage.rbd_utils [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] flattening images/357acf21-e111-4e02-8a13-c11baeb4f6ec flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.346 221324 INFO nova.virt.libvirt.driver [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Deleting instance files /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_del#033[00m
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.347 221324 INFO nova.virt.libvirt.driver [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Deletion of /var/lib/nova/instances/e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c_del complete#033[00m
Jan 31 03:35:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e329 e329: 3 total, 3 up, 3 in
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.572 221324 DEBUG nova.storage.rbd_utils [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] removing snapshot(1894e324a90344328fd2cff5e57508ed) on rbd image(9707f58b-2c76-40df-b8a8-5df66bab50bb_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.689 221324 INFO nova.compute.manager [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Took 1.93 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.690 221324 DEBUG oslo.service.loopingcall [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.690 221324 DEBUG nova.compute.manager [-] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:35:15 np0005603622 nova_compute[221301]: 2026-01-31 08:35:15.691 221324 DEBUG nova.network.neutron [-] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:35:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:15.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:16.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e330 e330: 3 total, 3 up, 3 in
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.652 221324 DEBUG nova.compute.manager [req-88fa0664-81d9-4241-9f9c-7b5390ffd0c0 req-6da6c4f6-6b25-4d70-b9ad-435e87d1d2e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received event network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.652 221324 DEBUG oslo_concurrency.lockutils [req-88fa0664-81d9-4241-9f9c-7b5390ffd0c0 req-6da6c4f6-6b25-4d70-b9ad-435e87d1d2e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.652 221324 DEBUG oslo_concurrency.lockutils [req-88fa0664-81d9-4241-9f9c-7b5390ffd0c0 req-6da6c4f6-6b25-4d70-b9ad-435e87d1d2e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.653 221324 DEBUG oslo_concurrency.lockutils [req-88fa0664-81d9-4241-9f9c-7b5390ffd0c0 req-6da6c4f6-6b25-4d70-b9ad-435e87d1d2e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.653 221324 DEBUG nova.compute.manager [req-88fa0664-81d9-4241-9f9c-7b5390ffd0c0 req-6da6c4f6-6b25-4d70-b9ad-435e87d1d2e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] No waiting events found dispatching network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.653 221324 WARNING nova.compute.manager [req-88fa0664-81d9-4241-9f9c-7b5390ffd0c0 req-6da6c4f6-6b25-4d70-b9ad-435e87d1d2e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received unexpected event network-vif-plugged-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.717 221324 DEBUG nova.storage.rbd_utils [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] creating snapshot(snap) on rbd image(357acf21-e111-4e02-8a13-c11baeb4f6ec) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.913 221324 DEBUG nova.compute.manager [req-4a25f324-465e-4f3e-9514-fcee0141aeab req-d3ef7f61-7d7f-44cb-8750-e5d86e248334 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Received event network-vif-deleted-ebff0bdd-64a8-478c-a24f-94fc3ea465c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.915 221324 INFO nova.compute.manager [req-4a25f324-465e-4f3e-9514-fcee0141aeab req-d3ef7f61-7d7f-44cb-8750-e5d86e248334 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Neutron deleted interface ebff0bdd-64a8-478c-a24f-94fc3ea465c4; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:35:16 np0005603622 nova_compute[221301]: 2026-01-31 08:35:16.915 221324 DEBUG nova.network.neutron [req-4a25f324-465e-4f3e-9514-fcee0141aeab req-d3ef7f61-7d7f-44cb-8750-e5d86e248334 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:17 np0005603622 nova_compute[221301]: 2026-01-31 08:35:17.213 221324 DEBUG nova.network.neutron [-] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:17 np0005603622 nova_compute[221301]: 2026-01-31 08:35:17.371 221324 DEBUG nova.compute.manager [req-4a25f324-465e-4f3e-9514-fcee0141aeab req-d3ef7f61-7d7f-44cb-8750-e5d86e248334 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Detach interface failed, port_id=ebff0bdd-64a8-478c-a24f-94fc3ea465c4, reason: Instance e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:35:17 np0005603622 nova_compute[221301]: 2026-01-31 08:35:17.397 221324 INFO nova.compute.manager [-] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Took 1.71 seconds to deallocate network for instance.#033[00m
Jan 31 03:35:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e331 e331: 3 total, 3 up, 3 in
Jan 31 03:35:17 np0005603622 nova_compute[221301]: 2026-01-31 08:35:17.893 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:17 np0005603622 nova_compute[221301]: 2026-01-31 08:35:17.896 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:17.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:18 np0005603622 nova_compute[221301]: 2026-01-31 08:35:18.007 221324 DEBUG oslo_concurrency.processutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:18 np0005603622 nova_compute[221301]: 2026-01-31 08:35:18.203 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:18.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:35:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/86913855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:35:18 np0005603622 nova_compute[221301]: 2026-01-31 08:35:18.479 221324 DEBUG oslo_concurrency.processutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:18 np0005603622 nova_compute[221301]: 2026-01-31 08:35:18.484 221324 DEBUG nova.compute.provider_tree [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:35:18 np0005603622 nova_compute[221301]: 2026-01-31 08:35:18.600 221324 DEBUG nova.scheduler.client.report [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:35:18 np0005603622 nova_compute[221301]: 2026-01-31 08:35:18.881 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:18 np0005603622 nova_compute[221301]: 2026-01-31 08:35:18.951 221324 INFO nova.scheduler.client.report [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Deleted allocations for instance e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c#033[00m
Jan 31 03:35:19 np0005603622 nova_compute[221301]: 2026-01-31 08:35:19.033 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:19 np0005603622 nova_compute[221301]: 2026-01-31 08:35:19.510 221324 DEBUG oslo_concurrency.lockutils [None req-5d1bef11-1ab6-485a-9a50-3b2443a539e3 fb3f20f0143d465ebfe98f6a13200890 40db421b27d84f809f8074c58151327f - - default default] Lock "e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:19.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:20 np0005603622 nova_compute[221301]: 2026-01-31 08:35:20.101 221324 INFO nova.virt.libvirt.driver [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Snapshot image upload complete#033[00m
Jan 31 03:35:20 np0005603622 nova_compute[221301]: 2026-01-31 08:35:20.102 221324 INFO nova.compute.manager [None req-e843e7cb-d897-4eec-8832-61072ef7ce42 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Took 7.60 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 31 03:35:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:20.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:21.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:22.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:23 np0005603622 nova_compute[221301]: 2026-01-31 08:35:23.204 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:23Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e3:11:72 10.100.0.9
Jan 31 03:35:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:23Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e3:11:72 10.100.0.9
Jan 31 03:35:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:23.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603622 nova_compute[221301]: 2026-01-31 08:35:24.035 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:24.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:35:25 np0005603622 nova_compute[221301]: 2026-01-31 08:35:25.416 221324 INFO nova.compute.manager [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Rescuing#033[00m
Jan 31 03:35:25 np0005603622 nova_compute[221301]: 2026-01-31 08:35:25.417 221324 DEBUG oslo_concurrency.lockutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:25 np0005603622 nova_compute[221301]: 2026-01-31 08:35:25.417 221324 DEBUG oslo_concurrency.lockutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquired lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:25 np0005603622 nova_compute[221301]: 2026-01-31 08:35:25.418 221324 DEBUG nova.network.neutron [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:35:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e332 e332: 3 total, 3 up, 3 in
Jan 31 03:35:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:25.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:26.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:27.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:28 np0005603622 nova_compute[221301]: 2026-01-31 08:35:28.210 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:28.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:28 np0005603622 nova_compute[221301]: 2026-01-31 08:35:28.987 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848513.986303, e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:28 np0005603622 nova_compute[221301]: 2026-01-31 08:35:28.988 221324 INFO nova.compute.manager [-] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:35:29 np0005603622 nova_compute[221301]: 2026-01-31 08:35:29.023 221324 DEBUG nova.compute.manager [None req-00389984-3db0-479f-8d37-7653a4946da1 - - - - - -] [instance: e1d25fcb-4ed5-4caa-b481-60a2cb9eeb7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:29 np0005603622 nova_compute[221301]: 2026-01-31 08:35:29.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:29.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:30.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:30 np0005603622 nova_compute[221301]: 2026-01-31 08:35:30.704 221324 DEBUG nova.network.neutron [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:30.889 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:30.890 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:30.890 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:31 np0005603622 nova_compute[221301]: 2026-01-31 08:35:31.020 221324 DEBUG oslo_concurrency.lockutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Releasing lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:31 np0005603622 nova_compute[221301]: 2026-01-31 08:35:31.482 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:35:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:31Z|00645|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:35:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:31Z|00646|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:35:31 np0005603622 nova_compute[221301]: 2026-01-31 08:35:31.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:31Z|00647|binding|INFO|Releasing lport 7e288124-e200-4c03-8a4a-baab3e3f3d7a from this chassis (sb_readonly=0)
Jan 31 03:35:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:31Z|00648|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:35:31 np0005603622 nova_compute[221301]: 2026-01-31 08:35:31.962 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:31.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:35:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:32.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.101 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.102 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.102 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.103 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.103 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.104 221324 INFO nova.compute.manager [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Terminating instance#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.105 221324 DEBUG nova.compute.manager [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:35:33 np0005603622 kernel: tapdcdd9995-1b (unregistering): left promiscuous mode
Jan 31 03:35:33 np0005603622 NetworkManager[49080]: <info>  [1769848533.1746] device (tapdcdd9995-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:35:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:33Z|00649|binding|INFO|Releasing lport dcdd9995-1bf7-4e6c-843e-fa270ad890d0 from this chassis (sb_readonly=0)
Jan 31 03:35:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:33Z|00650|binding|INFO|Setting lport dcdd9995-1bf7-4e6c-843e-fa270ad890d0 down in Southbound
Jan 31 03:35:33 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:33Z|00651|binding|INFO|Removing iface tapdcdd9995-1b ovn-installed in OVS
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.226 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.231 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:33 np0005603622 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Jan 31 03:35:33 np0005603622 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007e.scope: Consumed 31.521s CPU time.
Jan 31 03:35:33 np0005603622 systemd-machined[190406]: Machine qemu-57-instance-0000007e terminated.
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.301 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:97:89 10.100.0.5'], port_security=['fa:16:3e:df:97:89 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '884be038-2a08-4b69-a345-5b6ac2661dfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44469d8b-ad30-4270-88fa-e67c568f3150', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '953a213fa5cb435ab3c04ad96152685f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8c881e0-722d-4784-9f91-71ffaeb0ba02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d972fb9d-6d12-4c1c-b135-704d64887b72, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=dcdd9995-1bf7-4e6c-843e-fa270ad890d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.303 139793 INFO neutron.agent.ovn.metadata.agent [-] Port dcdd9995-1bf7-4e6c-843e-fa270ad890d0 in datapath 44469d8b-ad30-4270-88fa-e67c568f3150 unbound from our chassis#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.306 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44469d8b-ad30-4270-88fa-e67c568f3150, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.307 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7384ad-9673-4006-bb58-c84d660aa7c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.308 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 namespace which is not needed anymore#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.351 221324 INFO nova.virt.libvirt.driver [-] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Instance destroyed successfully.#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.351 221324 DEBUG nova.objects.instance [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lazy-loading 'resources' on Instance uuid 884be038-2a08-4b69-a345-5b6ac2661dfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.410 221324 DEBUG nova.virt.libvirt.vif [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:28:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-67549667',display_name='tempest-ServerActionsTestOtherB-server-67549667',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-67549667',id=126,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:28:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='953a213fa5cb435ab3c04ad96152685f',ramdisk_id='',reservation_id='r-ph9lb0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1048458052',owner_user_name='tempest-ServerActionsTestOtherB-1048458052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:28:31Z,user_data=None,user_id='ef51681d234a4abc88ff433d0640b6e7',uuid=884be038-2a08-4b69-a345-5b6ac2661dfc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.412 221324 DEBUG nova.network.os_vif_util [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converting VIF {"id": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "address": "fa:16:3e:df:97:89", "network": {"id": "44469d8b-ad30-4270-88fa-e67c568f3150", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-2130829654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "953a213fa5cb435ab3c04ad96152685f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcdd9995-1b", "ovs_interfaceid": "dcdd9995-1bf7-4e6c-843e-fa270ad890d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.414 221324 DEBUG nova.network.os_vif_util [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:97:89,bridge_name='br-int',has_traffic_filtering=True,id=dcdd9995-1bf7-4e6c-843e-fa270ad890d0,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcdd9995-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.415 221324 DEBUG os_vif [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:97:89,bridge_name='br-int',has_traffic_filtering=True,id=dcdd9995-1bf7-4e6c-843e-fa270ad890d0,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcdd9995-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.417 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.418 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcdd9995-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.420 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.423 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.427 221324 INFO os_vif [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:97:89,bridge_name='br-int',has_traffic_filtering=True,id=dcdd9995-1bf7-4e6c-843e-fa270ad890d0,network=Network(44469d8b-ad30-4270-88fa-e67c568f3150),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcdd9995-1b')#033[00m
Jan 31 03:35:33 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [NOTICE]   (270376) : haproxy version is 2.8.14-c23fe91
Jan 31 03:35:33 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [NOTICE]   (270376) : path to executable is /usr/sbin/haproxy
Jan 31 03:35:33 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [WARNING]  (270376) : Exiting Master process...
Jan 31 03:35:33 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [WARNING]  (270376) : Exiting Master process...
Jan 31 03:35:33 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [ALERT]    (270376) : Current worker (270378) exited with code 143 (Terminated)
Jan 31 03:35:33 np0005603622 neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150[270368]: [WARNING]  (270376) : All workers exited. Exiting... (0)
Jan 31 03:35:33 np0005603622 systemd[1]: libpod-de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2.scope: Deactivated successfully.
Jan 31 03:35:33 np0005603622 podman[276787]: 2026-01-31 08:35:33.474214504 +0000 UTC m=+0.069618439 container died de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:35:33 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2-userdata-shm.mount: Deactivated successfully.
Jan 31 03:35:33 np0005603622 systemd[1]: var-lib-containers-storage-overlay-c18bc93d1d11aaf4d33a9ba6576717a6d32b1969c62ce468409f87804c11d8e5-merged.mount: Deactivated successfully.
Jan 31 03:35:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:33 np0005603622 podman[276787]: 2026-01-31 08:35:33.626054666 +0000 UTC m=+0.221458611 container cleanup de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:35:33 np0005603622 systemd[1]: libpod-conmon-de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2.scope: Deactivated successfully.
Jan 31 03:35:33 np0005603622 podman[276833]: 2026-01-31 08:35:33.786267211 +0000 UTC m=+0.135253390 container remove de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.790 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[462f113c-ae02-4d2e-b132-be872a428d7b]: (4, ('Sat Jan 31 08:35:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2)\nde30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2\nSat Jan 31 08:35:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 (de30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2)\nde30a807106e9c2ea9f69947997a99aa181a3e59f2782c0861ca0aad03159dc2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.792 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[85c5d9e7-0a07-41e0-b078-c920bec92d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.794 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44469d8b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.797 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:33 np0005603622 kernel: tap44469d8b-a0: left promiscuous mode
Jan 31 03:35:33 np0005603622 nova_compute[221301]: 2026-01-31 08:35:33.817 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.821 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ae3d28-b252-4640-8237-075b470a074f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.839 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[816aa915-a210-4bd7-91d5-577edd41eb99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.841 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa30fb8-02a7-4abc-adfd-37ae3d4ac721]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.853 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb997c2-716c-4a56-9ad4-0b6ec1448fee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736158, 'reachable_time': 42941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276849, 'error': None, 'target': 'ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.855 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44469d8b-ad30-4270-88fa-e67c568f3150 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:35:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:33.855 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[28fa4321-c5a2-4f1d-b04f-278f637a11d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:33 np0005603622 systemd[1]: run-netns-ovnmeta\x2d44469d8b\x2dad30\x2d4270\x2d88fa\x2de67c568f3150.mount: Deactivated successfully.
Jan 31 03:35:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:33.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.301 221324 INFO nova.virt.libvirt.driver [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Deleting instance files /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc_del#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.302 221324 INFO nova.virt.libvirt.driver [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Deletion of /var/lib/nova/instances/884be038-2a08-4b69-a345-5b6ac2661dfc_del complete#033[00m
Jan 31 03:35:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:34.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:34 np0005603622 kernel: tap2ed53bda-fd (unregistering): left promiscuous mode
Jan 31 03:35:34 np0005603622 NetworkManager[49080]: <info>  [1769848534.4108] device (tap2ed53bda-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:35:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:34Z|00652|binding|INFO|Releasing lport 2ed53bda-fdee-4925-8492-3be158eb8d21 from this chassis (sb_readonly=0)
Jan 31 03:35:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:34Z|00653|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 down in Southbound
Jan 31 03:35:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:34Z|00654|binding|INFO|Removing iface tap2ed53bda-fd ovn-installed in OVS
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.497 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:34 np0005603622 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Jan 31 03:35:34 np0005603622 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008e.scope: Consumed 14.248s CPU time.
Jan 31 03:35:34 np0005603622 systemd-machined[190406]: Machine qemu-66-instance-0000008e terminated.
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.549 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:11:72 10.100.0.9'], port_security=['fa:16:3e:e3:11:72 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9707f58b-2c76-40df-b8a8-5df66bab50bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2ed53bda-fdee-4925-8492-3be158eb8d21) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.551 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2ed53bda-fdee-4925-8492-3be158eb8d21 in datapath 31da00d3-077b-4620-a7d3-68186467ab47 unbound from our chassis#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.554 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.567 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8811aa4d-6146-4b06-8a8f-ab7e4fb08a5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.595 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[67d70d56-7895-4c51-a561-8cd6b3255623]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.599 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[80189021-8667-4379-a7c2-031a6dd9a649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.623 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[23666650-d78f-46dc-a91a-39ec1fb7628b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.685 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[121b647f-cdca-4943-8508-85bcda8c289f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 38788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276858, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.687 221324 INFO nova.compute.manager [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Took 1.58 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.689 221324 DEBUG oslo.service.loopingcall [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.689 221324 DEBUG nova.compute.manager [-] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.690 221324 DEBUG nova.network.neutron [-] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.699 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.699 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f61fe93f-20e6-4d89-bac4-53f13949c8f5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773731, 'tstamp': 773731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276860, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773733, 'tstamp': 773733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276860, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.701 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.706 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.707 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.707 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.708 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.708 221324 INFO nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:35:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:34.708 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.714 221324 INFO nova.virt.libvirt.driver [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance destroyed successfully.#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.715 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.784 221324 INFO nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Attempting a stable device rescue#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.992 221324 DEBUG nova.compute.manager [req-1c1c9fc7-8fea-42b1-b272-aec6e96fa1c0 req-fdcc2501-f517-4e96-b4b4-76d6d605cabe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received event network-vif-unplugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.992 221324 DEBUG oslo_concurrency.lockutils [req-1c1c9fc7-8fea-42b1-b272-aec6e96fa1c0 req-fdcc2501-f517-4e96-b4b4-76d6d605cabe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.993 221324 DEBUG oslo_concurrency.lockutils [req-1c1c9fc7-8fea-42b1-b272-aec6e96fa1c0 req-fdcc2501-f517-4e96-b4b4-76d6d605cabe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.994 221324 DEBUG oslo_concurrency.lockutils [req-1c1c9fc7-8fea-42b1-b272-aec6e96fa1c0 req-fdcc2501-f517-4e96-b4b4-76d6d605cabe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.994 221324 DEBUG nova.compute.manager [req-1c1c9fc7-8fea-42b1-b272-aec6e96fa1c0 req-fdcc2501-f517-4e96-b4b4-76d6d605cabe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] No waiting events found dispatching network-vif-unplugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:34 np0005603622 nova_compute[221301]: 2026-01-31 08:35:34.995 221324 DEBUG nova.compute.manager [req-1c1c9fc7-8fea-42b1-b272-aec6e96fa1c0 req-fdcc2501-f517-4e96-b4b4-76d6d605cabe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received event network-vif-unplugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.451 221324 DEBUG nova.compute.manager [req-62a7a619-5316-450e-8ff7-e93b636e7676 req-e94be530-f225-4fbb-a535-e73c9bfeb9dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.452 221324 DEBUG oslo_concurrency.lockutils [req-62a7a619-5316-450e-8ff7-e93b636e7676 req-e94be530-f225-4fbb-a535-e73c9bfeb9dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.453 221324 DEBUG oslo_concurrency.lockutils [req-62a7a619-5316-450e-8ff7-e93b636e7676 req-e94be530-f225-4fbb-a535-e73c9bfeb9dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.453 221324 DEBUG oslo_concurrency.lockutils [req-62a7a619-5316-450e-8ff7-e93b636e7676 req-e94be530-f225-4fbb-a535-e73c9bfeb9dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.453 221324 DEBUG nova.compute.manager [req-62a7a619-5316-450e-8ff7-e93b636e7676 req-e94be530-f225-4fbb-a535-e73c9bfeb9dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.454 221324 WARNING nova.compute.manager [req-62a7a619-5316-450e-8ff7-e93b636e7676 req-e94be530-f225-4fbb-a535-e73c9bfeb9dc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.660 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.669 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.670 221324 INFO nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Creating image(s)#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.771 221324 DEBUG nova.storage.rbd_utils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.775 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.839 221324 DEBUG nova.storage.rbd_utils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.869 221324 DEBUG nova.storage.rbd_utils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.873 221324 DEBUG oslo_concurrency.lockutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "045564f94eae0382118756614521dcaec5180e53" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:35 np0005603622 nova_compute[221301]: 2026-01-31 08:35:35.874 221324 DEBUG oslo_concurrency.lockutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "045564f94eae0382118756614521dcaec5180e53" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:35.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.238 221324 DEBUG nova.virt.libvirt.imagebackend [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/357acf21-e111-4e02-8a13-c11baeb4f6ec/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/357acf21-e111-4e02-8a13-c11baeb4f6ec/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.315 221324 DEBUG nova.virt.libvirt.imagebackend [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/357acf21-e111-4e02-8a13-c11baeb4f6ec/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.316 221324 DEBUG nova.storage.rbd_utils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] cloning images/357acf21-e111-4e02-8a13-c11baeb4f6ec@snap to None/9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:35:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:36.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.481 221324 DEBUG oslo_concurrency.lockutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "045564f94eae0382118756614521dcaec5180e53" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.539 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'migration_context' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.587 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.592 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Start _get_guest_xml network_info=[{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "vif_mac": "fa:16:3e:e3:11:72"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '357acf21-e111-4e02-8a13-c11baeb4f6ec', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.593 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'resources' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.629 221324 WARNING nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.637 221324 DEBUG nova.virt.libvirt.host [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.638 221324 DEBUG nova.virt.libvirt.host [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.647 221324 DEBUG nova.virt.libvirt.host [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.648 221324 DEBUG nova.virt.libvirt.host [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.651 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.652 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.653 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.653 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.653 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.654 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.654 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.654 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.655 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.655 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.655 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.655 221324 DEBUG nova.virt.hardware [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.656 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.688 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.783 221324 DEBUG nova.network.neutron [-] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:36 np0005603622 nova_compute[221301]: 2026-01-31 08:35:36.863 221324 INFO nova.compute.manager [-] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Took 2.17 seconds to deallocate network for instance.#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.020 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.020 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.134 221324 DEBUG nova.compute.manager [req-9cd3ada3-1dc4-4f68-9567-13d0e89cd953 req-8ea1c076-0158-4e7f-bc5a-8041f52d5644 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received event network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.135 221324 DEBUG oslo_concurrency.lockutils [req-9cd3ada3-1dc4-4f68-9567-13d0e89cd953 req-8ea1c076-0158-4e7f-bc5a-8041f52d5644 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.136 221324 DEBUG oslo_concurrency.lockutils [req-9cd3ada3-1dc4-4f68-9567-13d0e89cd953 req-8ea1c076-0158-4e7f-bc5a-8041f52d5644 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.136 221324 DEBUG oslo_concurrency.lockutils [req-9cd3ada3-1dc4-4f68-9567-13d0e89cd953 req-8ea1c076-0158-4e7f-bc5a-8041f52d5644 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.137 221324 DEBUG nova.compute.manager [req-9cd3ada3-1dc4-4f68-9567-13d0e89cd953 req-8ea1c076-0158-4e7f-bc5a-8041f52d5644 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] No waiting events found dispatching network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.137 221324 WARNING nova.compute.manager [req-9cd3ada3-1dc4-4f68-9567-13d0e89cd953 req-8ea1c076-0158-4e7f-bc5a-8041f52d5644 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received unexpected event network-vif-plugged-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:35:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1843978316' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.228 221324 DEBUG oslo_concurrency.processutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.254 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.301 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.749 221324 DEBUG nova.compute.manager [req-d0f79e10-d9b3-4665-a04c-8a69d63dd912 req-779b9322-f07a-4e8b-8fc0-87e2a02253ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.750 221324 DEBUG oslo_concurrency.lockutils [req-d0f79e10-d9b3-4665-a04c-8a69d63dd912 req-779b9322-f07a-4e8b-8fc0-87e2a02253ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.750 221324 DEBUG oslo_concurrency.lockutils [req-d0f79e10-d9b3-4665-a04c-8a69d63dd912 req-779b9322-f07a-4e8b-8fc0-87e2a02253ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.750 221324 DEBUG oslo_concurrency.lockutils [req-d0f79e10-d9b3-4665-a04c-8a69d63dd912 req-779b9322-f07a-4e8b-8fc0-87e2a02253ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.751 221324 DEBUG nova.compute.manager [req-d0f79e10-d9b3-4665-a04c-8a69d63dd912 req-779b9322-f07a-4e8b-8fc0-87e2a02253ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.751 221324 WARNING nova.compute.manager [req-d0f79e10-d9b3-4665-a04c-8a69d63dd912 req-779b9322-f07a-4e8b-8fc0-87e2a02253ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.751 221324 DEBUG nova.compute.manager [req-d0f79e10-d9b3-4665-a04c-8a69d63dd912 req-779b9322-f07a-4e8b-8fc0-87e2a02253ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Received event network-vif-deleted-dcdd9995-1bf7-4e6c-843e-fa270ad890d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:35:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4084730857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:35:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3083270612' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.789 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.790 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.812 221324 DEBUG oslo_concurrency.processutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.584s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.819 221324 DEBUG nova.compute.provider_tree [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:35:37 np0005603622 nova_compute[221301]: 2026-01-31 08:35:37.970 221324 DEBUG nova.scheduler.client.report [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:35:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:37.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.038 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.108 221324 INFO nova.scheduler.client.report [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Deleted allocations for instance 884be038-2a08-4b69-a345-5b6ac2661dfc#033[00m
Jan 31 03:35:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:35:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3461818811' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.212 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.214 221324 DEBUG nova.virt.libvirt.vif [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:34:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1321512917',display_name='tempest-ServerStableDeviceRescueTest-server-1321512917',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1321512917',id=142,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:35:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-6ftdla88',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_user_name='tempest-ServerStableDeviceRescueTest-319343227-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:35:20Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9707f58b-2c76-40df-b8a8-5df66bab50bb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "vif_mac": "fa:16:3e:e3:11:72"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.214 221324 DEBUG nova.network.os_vif_util [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "vif_mac": "fa:16:3e:e3:11:72"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.215 221324 DEBUG nova.network.os_vif_util [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.217 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.257 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <uuid>9707f58b-2c76-40df-b8a8-5df66bab50bb</uuid>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <name>instance-0000008e</name>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1321512917</nova:name>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:35:36</nova:creationTime>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:user uuid="b6733330b634472ca8c21316f1ee5057">tempest-ServerStableDeviceRescueTest-319343227-project-member</nova:user>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:project uuid="1e29363ca464487b931af54fe14166b1">tempest-ServerStableDeviceRescueTest-319343227</nova:project>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <nova:port uuid="2ed53bda-fdee-4925-8492-3be158eb8d21">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <entry name="serial">9707f58b-2c76-40df-b8a8-5df66bab50bb</entry>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <entry name="uuid">9707f58b-2c76-40df-b8a8-5df66bab50bb</entry>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9707f58b-2c76-40df-b8a8-5df66bab50bb_disk">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.rescue">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <target dev="sdb" bus="usb"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <boot order="1"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:e3:11:72"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <target dev="tap2ed53bda-fd"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/console.log" append="off"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:35:38 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:35:38 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:35:38 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:35:38 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.260 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.269 221324 DEBUG oslo_concurrency.lockutils [None req-db724440-2693-4878-9061-cfc083f8f773 ef51681d234a4abc88ff433d0640b6e7 953a213fa5cb435ab3c04ad96152685f - - default default] Lock "884be038-2a08-4b69-a345-5b6ac2661dfc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.279 221324 INFO nova.virt.libvirt.driver [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance destroyed successfully.#033[00m
Jan 31 03:35:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:38.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.388 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.388 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.388 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.389 221324 DEBUG nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] No VIF found with MAC fa:16:3e:e3:11:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.389 221324 INFO nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Using config drive#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.420 221324 DEBUG nova.storage.rbd_utils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.477 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:38 np0005603622 nova_compute[221301]: 2026-01-31 08:35:38.531 221324 DEBUG nova.objects.instance [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'keypairs' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.337 221324 INFO nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Creating config drive at /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config.rescue#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.344 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphf07c30h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.484 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphf07c30h" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.539 221324 DEBUG nova.storage.rbd_utils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] rbd image 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.543 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config.rescue 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.698 221324 DEBUG oslo_concurrency.processutils [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config.rescue 9707f58b-2c76-40df-b8a8-5df66bab50bb_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.699 221324 INFO nova.virt.libvirt.driver [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Deleting local config drive /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb/disk.config.rescue because it was imported into RBD.#033[00m
Jan 31 03:35:39 np0005603622 kernel: tap2ed53bda-fd: entered promiscuous mode
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.754 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:39Z|00655|binding|INFO|Claiming lport 2ed53bda-fdee-4925-8492-3be158eb8d21 for this chassis.
Jan 31 03:35:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:39Z|00656|binding|INFO|2ed53bda-fdee-4925-8492-3be158eb8d21: Claiming fa:16:3e:e3:11:72 10.100.0.9
Jan 31 03:35:39 np0005603622 NetworkManager[49080]: <info>  [1769848539.7582] manager: (tap2ed53bda-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/297)
Jan 31 03:35:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:39Z|00657|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 ovn-installed in OVS
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.764 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.766 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:39 np0005603622 systemd-machined[190406]: New machine qemu-67-instance-0000008e.
Jan 31 03:35:39 np0005603622 systemd[1]: Started Virtual Machine qemu-67-instance-0000008e.
Jan 31 03:35:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:39Z|00658|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 up in Southbound
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.823 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:11:72 10.100.0.9'], port_security=['fa:16:3e:e3:11:72 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9707f58b-2c76-40df-b8a8-5df66bab50bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2ed53bda-fdee-4925-8492-3be158eb8d21) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:39 np0005603622 systemd-udevd[277215]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.824 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2ed53bda-fdee-4925-8492-3be158eb8d21 in datapath 31da00d3-077b-4620-a7d3-68186467ab47 bound to our chassis#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.825 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:35:39 np0005603622 NetworkManager[49080]: <info>  [1769848539.8350] device (tap2ed53bda-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:35:39 np0005603622 NetworkManager[49080]: <info>  [1769848539.8367] device (tap2ed53bda-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.839 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[91ca135e-2a24-4f26-a25b-395b9519d41c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:39 np0005603622 podman[277189]: 2026-01-31 08:35:39.853112291 +0000 UTC m=+0.065356335 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.863 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[949c395f-ff95-4da2-8f44-ab36365f57e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.865 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2ddbda-3286-4ded-8bce-45ef79f02252]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:39 np0005603622 podman[277188]: 2026-01-31 08:35:39.878420716 +0000 UTC m=+0.089723835 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.885 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8e8d54-2691-4fcc-a786-9be1e88eb990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.896 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[174c0efd-c47f-4dd3-984d-64b21238cf00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 38788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277249, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.904 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c0310f98-f5ea-4b1f-85d8-8dcd9ad59476]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773731, 'tstamp': 773731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277250, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773733, 'tstamp': 773733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277250, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.906 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:39 np0005603622 nova_compute[221301]: 2026-01-31 08:35:39.908 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.910 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.910 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.910 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:39.911 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:39.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:40.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:40 np0005603622 nova_compute[221301]: 2026-01-31 08:35:40.732 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 9707f58b-2c76-40df-b8a8-5df66bab50bb due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:35:40 np0005603622 nova_compute[221301]: 2026-01-31 08:35:40.732 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848540.7313948, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:40 np0005603622 nova_compute[221301]: 2026-01-31 08:35:40.733 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:35:40 np0005603622 nova_compute[221301]: 2026-01-31 08:35:40.740 221324 DEBUG nova.compute.manager [None req-f1213951-4748-415d-9803-5e3dfbf61a83 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:40 np0005603622 nova_compute[221301]: 2026-01-31 08:35:40.968 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:40 np0005603622 nova_compute[221301]: 2026-01-31 08:35:40.972 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.180 221324 DEBUG nova.compute.manager [req-84fce297-c129-43a3-9887-6dacd7ad3ce7 req-dd474c6a-bc3d-4b76-b69d-5f1cf4d42713 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.181 221324 DEBUG oslo_concurrency.lockutils [req-84fce297-c129-43a3-9887-6dacd7ad3ce7 req-dd474c6a-bc3d-4b76-b69d-5f1cf4d42713 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.182 221324 DEBUG oslo_concurrency.lockutils [req-84fce297-c129-43a3-9887-6dacd7ad3ce7 req-dd474c6a-bc3d-4b76-b69d-5f1cf4d42713 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.182 221324 DEBUG oslo_concurrency.lockutils [req-84fce297-c129-43a3-9887-6dacd7ad3ce7 req-dd474c6a-bc3d-4b76-b69d-5f1cf4d42713 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.183 221324 DEBUG nova.compute.manager [req-84fce297-c129-43a3-9887-6dacd7ad3ce7 req-dd474c6a-bc3d-4b76-b69d-5f1cf4d42713 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.184 221324 WARNING nova.compute.manager [req-84fce297-c129-43a3-9887-6dacd7ad3ce7 req-dd474c6a-bc3d-4b76-b69d-5f1cf4d42713 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.339 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848540.7356834, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.340 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Started (Lifecycle Event)#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.394 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:41 np0005603622 nova_compute[221301]: 2026-01-31 08:35:41.398 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:41.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:42.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.773427) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848542774104, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 1175, "num_deletes": 255, "total_data_size": 2378805, "memory_usage": 2405840, "flush_reason": "Manual Compaction"}
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848542910975, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 1556740, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59689, "largest_seqno": 60859, "table_properties": {"data_size": 1551316, "index_size": 2820, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12225, "raw_average_key_size": 20, "raw_value_size": 1540356, "raw_average_value_size": 2610, "num_data_blocks": 122, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848470, "oldest_key_time": 1769848470, "file_creation_time": 1769848542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 137084 microseconds, and 5292 cpu microseconds.
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.911036) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 1556740 bytes OK
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.911061) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.977889) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.977926) EVENT_LOG_v1 {"time_micros": 1769848542977917, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.977947) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 2373061, prev total WAL file size 2373061, number of live WAL files 2.
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.979087) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(1520KB)], [117(12MB)]
Jan 31 03:35:42 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848542979121, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 14957418, "oldest_snapshot_seqno": -1}
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 8463 keys, 12785608 bytes, temperature: kUnknown
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848543269505, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 12785608, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12728432, "index_size": 34920, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21189, "raw_key_size": 219961, "raw_average_key_size": 25, "raw_value_size": 12577449, "raw_average_value_size": 1486, "num_data_blocks": 1366, "num_entries": 8463, "num_filter_entries": 8463, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:35:43 np0005603622 nova_compute[221301]: 2026-01-31 08:35:43.326 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.269727) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 12785608 bytes
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.370079) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 51.5 rd, 44.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 12.8 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(17.8) write-amplify(8.2) OK, records in: 8992, records dropped: 529 output_compression: NoCompression
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.370118) EVENT_LOG_v1 {"time_micros": 1769848543370104, "job": 74, "event": "compaction_finished", "compaction_time_micros": 290447, "compaction_time_cpu_micros": 25033, "output_level": 6, "num_output_files": 1, "total_output_size": 12785608, "num_input_records": 8992, "num_output_records": 8463, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848543370591, "job": 74, "event": "table_file_deletion", "file_number": 119}
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848543372702, "job": 74, "event": "table_file_deletion", "file_number": 117}
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:42.978955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.373009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.373019) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.373024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.373028) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:35:43.373032) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:35:43 np0005603622 nova_compute[221301]: 2026-01-31 08:35:43.427 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:43.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:44.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:45 np0005603622 nova_compute[221301]: 2026-01-31 08:35:45.815 221324 DEBUG nova.compute.manager [req-a5aa1061-ae05-40d6-9adb-70f045d6f9ab req-65613c43-a689-4ea2-9d53-e7bffd75be03 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:45 np0005603622 nova_compute[221301]: 2026-01-31 08:35:45.816 221324 DEBUG oslo_concurrency.lockutils [req-a5aa1061-ae05-40d6-9adb-70f045d6f9ab req-65613c43-a689-4ea2-9d53-e7bffd75be03 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:45 np0005603622 nova_compute[221301]: 2026-01-31 08:35:45.816 221324 DEBUG oslo_concurrency.lockutils [req-a5aa1061-ae05-40d6-9adb-70f045d6f9ab req-65613c43-a689-4ea2-9d53-e7bffd75be03 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:45 np0005603622 nova_compute[221301]: 2026-01-31 08:35:45.817 221324 DEBUG oslo_concurrency.lockutils [req-a5aa1061-ae05-40d6-9adb-70f045d6f9ab req-65613c43-a689-4ea2-9d53-e7bffd75be03 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:45 np0005603622 nova_compute[221301]: 2026-01-31 08:35:45.817 221324 DEBUG nova.compute.manager [req-a5aa1061-ae05-40d6-9adb-70f045d6f9ab req-65613c43-a689-4ea2-9d53-e7bffd75be03 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:45 np0005603622 nova_compute[221301]: 2026-01-31 08:35:45.818 221324 WARNING nova.compute.manager [req-a5aa1061-ae05-40d6-9adb-70f045d6f9ab req-65613c43-a689-4ea2-9d53-e7bffd75be03 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state rescued and task_state None.#033[00m
Jan 31 03:35:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:45.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:46.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:47 np0005603622 nova_compute[221301]: 2026-01-31 08:35:47.291 221324 INFO nova.compute.manager [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Unrescuing#033[00m
Jan 31 03:35:47 np0005603622 nova_compute[221301]: 2026-01-31 08:35:47.291 221324 DEBUG oslo_concurrency.lockutils [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:47 np0005603622 nova_compute[221301]: 2026-01-31 08:35:47.292 221324 DEBUG oslo_concurrency.lockutils [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquired lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:47 np0005603622 nova_compute[221301]: 2026-01-31 08:35:47.292 221324 DEBUG nova.network.neutron [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:35:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:48.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:48 np0005603622 nova_compute[221301]: 2026-01-31 08:35:48.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:48 np0005603622 nova_compute[221301]: 2026-01-31 08:35:48.349 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848533.347823, 884be038-2a08-4b69-a345-5b6ac2661dfc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:48 np0005603622 nova_compute[221301]: 2026-01-31 08:35:48.349 221324 INFO nova.compute.manager [-] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:35:48 np0005603622 nova_compute[221301]: 2026-01-31 08:35:48.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:48 np0005603622 nova_compute[221301]: 2026-01-31 08:35:48.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:48.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:48 np0005603622 nova_compute[221301]: 2026-01-31 08:35:48.427 221324 DEBUG nova.compute.manager [None req-0a25b865-abd2-4fd3-b354-43a204ef2f3a - - - - - -] [instance: 884be038-2a08-4b69-a345-5b6ac2661dfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:48 np0005603622 nova_compute[221301]: 2026-01-31 08:35:48.428 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:49 np0005603622 nova_compute[221301]: 2026-01-31 08:35:49.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:49 np0005603622 nova_compute[221301]: 2026-01-31 08:35:49.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:49 np0005603622 nova_compute[221301]: 2026-01-31 08:35:49.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:35:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e333 e333: 3 total, 3 up, 3 in
Jan 31 03:35:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:50.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:50 np0005603622 nova_compute[221301]: 2026-01-31 08:35:50.168 221324 DEBUG nova.network.neutron [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:35:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:50.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:50 np0005603622 nova_compute[221301]: 2026-01-31 08:35:50.565 221324 DEBUG oslo_concurrency.lockutils [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Releasing lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:35:50 np0005603622 nova_compute[221301]: 2026-01-31 08:35:50.566 221324 DEBUG nova.objects.instance [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'flavor' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:50 np0005603622 kernel: tap2ed53bda-fd (unregistering): left promiscuous mode
Jan 31 03:35:50 np0005603622 NetworkManager[49080]: <info>  [1769848550.9965] device (tap2ed53bda-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:35:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:51Z|00659|binding|INFO|Releasing lport 2ed53bda-fdee-4925-8492-3be158eb8d21 from this chassis (sb_readonly=0)
Jan 31 03:35:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:51Z|00660|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 down in Southbound
Jan 31 03:35:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:51Z|00661|binding|INFO|Removing iface tap2ed53bda-fd ovn-installed in OVS
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.015 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.021 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Jan 31 03:35:51 np0005603622 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008e.scope: Consumed 10.871s CPU time.
Jan 31 03:35:51 np0005603622 systemd-machined[190406]: Machine qemu-67-instance-0000008e terminated.
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.054 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:11:72 10.100.0.9'], port_security=['fa:16:3e:e3:11:72 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9707f58b-2c76-40df-b8a8-5df66bab50bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2ed53bda-fdee-4925-8492-3be158eb8d21) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.056 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2ed53bda-fdee-4925-8492-3be158eb8d21 in datapath 31da00d3-077b-4620-a7d3-68186467ab47 unbound from our chassis#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.058 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.072 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[53ce234d-5c65-4666-800e-5c967a5af191]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.106 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9a520ca6-eacd-4a1e-bd17-2789f52338a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.110 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[997ec6fe-4d0e-48ac-a241-cb9f47f3604e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.134 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[73435d8a-7a2f-4d43-b37c-a548f2afaf8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.148 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbb47d5-cf43-4ab4-bebf-e8bb606ab7d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 38788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277323, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.160 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[02a2aac1-03ad-4517-bf9c-66ab8f534017]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773731, 'tstamp': 773731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277324, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773733, 'tstamp': 773733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277324, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.162 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.163 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.168 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.169 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.169 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.170 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.170 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.266 221324 INFO nova.virt.libvirt.driver [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance destroyed successfully.#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.267 221324 DEBUG nova.objects.instance [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:35:51 np0005603622 kernel: tap2ed53bda-fd: entered promiscuous mode
Jan 31 03:35:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:51Z|00662|binding|INFO|Claiming lport 2ed53bda-fdee-4925-8492-3be158eb8d21 for this chassis.
Jan 31 03:35:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:51Z|00663|binding|INFO|2ed53bda-fdee-4925-8492-3be158eb8d21: Claiming fa:16:3e:e3:11:72 10.100.0.9
Jan 31 03:35:51 np0005603622 systemd-udevd[277314]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 NetworkManager[49080]: <info>  [1769848551.3915] manager: (tap2ed53bda-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/298)
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:51 np0005603622 NetworkManager[49080]: <info>  [1769848551.4000] device (tap2ed53bda-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:35:51 np0005603622 NetworkManager[49080]: <info>  [1769848551.4008] device (tap2ed53bda-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:35:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:51Z|00664|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 ovn-installed in OVS
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.405 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.408 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:51Z|00665|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 up in Southbound
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.423 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:11:72 10.100.0.9'], port_security=['fa:16:3e:e3:11:72 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9707f58b-2c76-40df-b8a8-5df66bab50bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2ed53bda-fdee-4925-8492-3be158eb8d21) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.424 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2ed53bda-fdee-4925-8492-3be158eb8d21 in datapath 31da00d3-077b-4620-a7d3-68186467ab47 bound to our chassis#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.425 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:35:51 np0005603622 systemd-machined[190406]: New machine qemu-68-instance-0000008e.
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.443 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4c13c172-2ba5-42fe-8acb-54796ce830ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 systemd[1]: Started Virtual Machine qemu-68-instance-0000008e.
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.472 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[807b33f2-598b-40f5-89fd-032f9702aec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.475 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e6b3f7-43fc-4d4e-b714-b84a953f1003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.505 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[80e72aa1-e274-4325-b23f-f72817f07c65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.522 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1b182fd6-995b-44fc-baa5-d5210f1679cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 38788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277362, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.536 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8302ae03-e36d-4427-afaf-60ea423c08d6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773731, 'tstamp': 773731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277363, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773733, 'tstamp': 773733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277363, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.538 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 nova_compute[221301]: 2026-01-31 08:35:51.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.540 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.541 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.541 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:35:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:35:51.542 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:35:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:52.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.314 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 9707f58b-2c76-40df-b8a8-5df66bab50bb due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.315 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848552.3139315, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.315 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.360 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.367 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:52.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.495 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.502 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.503 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848552.3171473, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.504 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Started (Lifecycle Event)#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.979 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:52 np0005603622 nova_compute[221301]: 2026-01-31 08:35:52.983 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.319 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.320 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.321 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.321 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.322 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.376 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.638 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:35:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:35:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1013984925' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.813 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.828 221324 DEBUG nova.compute.manager [req-e81f05eb-f90d-4483-9245-15b9100b984c req-aba4da9b-a545-4c12-8b4a-9e48c8a3f49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.829 221324 DEBUG oslo_concurrency.lockutils [req-e81f05eb-f90d-4483-9245-15b9100b984c req-aba4da9b-a545-4c12-8b4a-9e48c8a3f49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.829 221324 DEBUG oslo_concurrency.lockutils [req-e81f05eb-f90d-4483-9245-15b9100b984c req-aba4da9b-a545-4c12-8b4a-9e48c8a3f49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.830 221324 DEBUG oslo_concurrency.lockutils [req-e81f05eb-f90d-4483-9245-15b9100b984c req-aba4da9b-a545-4c12-8b4a-9e48c8a3f49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.830 221324 DEBUG nova.compute.manager [req-e81f05eb-f90d-4483-9245-15b9100b984c req-aba4da9b-a545-4c12-8b4a-9e48c8a3f49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:53 np0005603622 nova_compute[221301]: 2026-01-31 08:35:53.830 221324 WARNING nova.compute.manager [req-e81f05eb-f90d-4483-9245-15b9100b984c req-aba4da9b-a545-4c12-8b4a-9e48c8a3f49e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:35:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:54.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:35:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:54.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.557 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.557 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.560 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.561 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.755 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.756 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4121MB free_disk=20.851451873779297GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.756 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:54 np0005603622 nova_compute[221301]: 2026-01-31 08:35:54.756 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.010 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.010 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9707f58b-2c76-40df-b8a8-5df66bab50bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.011 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.011 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.114 221324 DEBUG nova.compute.manager [None req-af8b10c2-fd69-4b74-a3be-20dcdd6fcc87 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.166 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:35:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:35:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3519386751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.616 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.622 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:35:55 np0005603622 nova_compute[221301]: 2026-01-31 08:35:55.672 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:35:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:56.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:56 np0005603622 nova_compute[221301]: 2026-01-31 08:35:56.040 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:35:56 np0005603622 nova_compute[221301]: 2026-01-31 08:35:56.041 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:56.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.345 221324 DEBUG nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.346 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.346 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.346 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.346 221324 DEBUG nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.347 221324 WARNING nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.347 221324 DEBUG nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.347 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.347 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.347 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.347 221324 DEBUG nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.348 221324 WARNING nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.348 221324 DEBUG nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.348 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.348 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.348 221324 DEBUG oslo_concurrency.lockutils [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.349 221324 DEBUG nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:35:57 np0005603622 nova_compute[221301]: 2026-01-31 08:35:57.349 221324 WARNING nova.compute.manager [req-b07c062e-2130-4134-8fe0-9f3393c0472c req-aee45589-3a72-4cd8-88ec-edc0f792750d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:35:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:35:58.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:58 np0005603622 nova_compute[221301]: 2026-01-31 08:35:58.378 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:35:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:35:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:35:58.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:35:58 np0005603622 nova_compute[221301]: 2026-01-31 08:35:58.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:35:58 np0005603622 nova_compute[221301]: 2026-01-31 08:35:58.942 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:35:58 np0005603622 nova_compute[221301]: 2026-01-31 08:35:58.943 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:35:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:35:59Z|00666|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:35:59 np0005603622 nova_compute[221301]: 2026-01-31 08:35:59.231 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:35:59 np0005603622 nova_compute[221301]: 2026-01-31 08:35:59.572 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:35:59 np0005603622 nova_compute[221301]: 2026-01-31 08:35:59.573 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:35:59 np0005603622 nova_compute[221301]: 2026-01-31 08:35:59.573 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:36:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:36:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:00.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:36:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:00.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:02.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:02.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:03 np0005603622 nova_compute[221301]: 2026-01-31 08:36:03.379 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:03 np0005603622 nova_compute[221301]: 2026-01-31 08:36:03.435 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:04.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:04.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:04 np0005603622 nova_compute[221301]: 2026-01-31 08:36:04.504 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:36:04 np0005603622 nova_compute[221301]: 2026-01-31 08:36:04.573 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:36:04 np0005603622 nova_compute[221301]: 2026-01-31 08:36:04.574 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:36:04 np0005603622 nova_compute[221301]: 2026-01-31 08:36:04.574 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:04 np0005603622 nova_compute[221301]: 2026-01-31 08:36:04.575 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:04 np0005603622 nova_compute[221301]: 2026-01-31 08:36:04.575 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:06.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:06 np0005603622 ovn_controller[130109]: 2026-01-31T08:36:06Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e3:11:72 10.100.0.9
Jan 31 03:36:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:06.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:08.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e334 e334: 3 total, 3 up, 3 in
Jan 31 03:36:08 np0005603622 nova_compute[221301]: 2026-01-31 08:36:08.245 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:36:08.246 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:36:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:36:08.247 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:36:08 np0005603622 nova_compute[221301]: 2026-01-31 08:36:08.427 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:08.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:08 np0005603622 nova_compute[221301]: 2026-01-31 08:36:08.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e334 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:10.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:10 np0005603622 podman[277474]: 2026-01-31 08:36:10.187598178 +0000 UTC m=+0.096383763 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 31 03:36:10 np0005603622 podman[277473]: 2026-01-31 08:36:10.194008789 +0000 UTC m=+0.102599289 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:36:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:10.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:11 np0005603622 nova_compute[221301]: 2026-01-31 08:36:11.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:36:11 np0005603622 nova_compute[221301]: 2026-01-31 08:36:11.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:36:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:12.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:12 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:36:12.249 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:36:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:12.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:13 np0005603622 nova_compute[221301]: 2026-01-31 08:36:13.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:13 np0005603622 nova_compute[221301]: 2026-01-31 08:36:13.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e334 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:14.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:14.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e335 e335: 3 total, 3 up, 3 in
Jan 31 03:36:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:16.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:16.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:18.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:18 np0005603622 nova_compute[221301]: 2026-01-31 08:36:18.437 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:36:18 np0005603622 nova_compute[221301]: 2026-01-31 08:36:18.439 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:36:18 np0005603622 nova_compute[221301]: 2026-01-31 08:36:18.439 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 31 03:36:18 np0005603622 nova_compute[221301]: 2026-01-31 08:36:18.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 03:36:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:18.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:18 np0005603622 nova_compute[221301]: 2026-01-31 08:36:18.475 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:18 np0005603622 nova_compute[221301]: 2026-01-31 08:36:18.476 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 03:36:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:20.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:20.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:22.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:22.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:23 np0005603622 nova_compute[221301]: 2026-01-31 08:36:23.477 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:36:23 np0005603622 nova_compute[221301]: 2026-01-31 08:36:23.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:23 np0005603622 nova_compute[221301]: 2026-01-31 08:36:23.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 31 03:36:23 np0005603622 nova_compute[221301]: 2026-01-31 08:36:23.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 03:36:23 np0005603622 nova_compute[221301]: 2026-01-31 08:36:23.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 03:36:23 np0005603622 nova_compute[221301]: 2026-01-31 08:36:23.480 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:24.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:36:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:24.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:36:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:36:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:26.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:36:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:26.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:28.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:28 np0005603622 nova_compute[221301]: 2026-01-31 08:36:28.481 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:28.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:30.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:30.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:36:30.891 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:36:30.892 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:36:30.892 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:36:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:32.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:36:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:36:32 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:36:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:32.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:33 np0005603622 nova_compute[221301]: 2026-01-31 08:36:33.447 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:33 np0005603622 nova_compute[221301]: 2026-01-31 08:36:33.447 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 31 03:36:33 np0005603622 nova_compute[221301]: 2026-01-31 08:36:33.483 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:33 np0005603622 nova_compute[221301]: 2026-01-31 08:36:33.716 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 31 03:36:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:34.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:34.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e336 e336: 3 total, 3 up, 3 in
Jan 31 03:36:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:36.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:36:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:36.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:36:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:38.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e337 e337: 3 total, 3 up, 3 in
Jan 31 03:36:38 np0005603622 nova_compute[221301]: 2026-01-31 08:36:38.485 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:36:38 np0005603622 nova_compute[221301]: 2026-01-31 08:36:38.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:38 np0005603622 nova_compute[221301]: 2026-01-31 08:36:38.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 31 03:36:38 np0005603622 nova_compute[221301]: 2026-01-31 08:36:38.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 03:36:38 np0005603622 nova_compute[221301]: 2026-01-31 08:36:38.488 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 31 03:36:38 np0005603622 nova_compute[221301]: 2026-01-31 08:36:38.490 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:38.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e338 e338: 3 total, 3 up, 3 in
Jan 31 03:36:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:36:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:36:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:40.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:40.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:41 np0005603622 podman[277699]: 2026-01-31 08:36:41.138050701 +0000 UTC m=+0.056254713 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:36:41 np0005603622 podman[277698]: 2026-01-31 08:36:41.176227299 +0000 UTC m=+0.093885986 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:36:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:42.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:42.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:43 np0005603622 nova_compute[221301]: 2026-01-31 08:36:43.489 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:44.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:44.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:46.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:46.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 e339: 3 total, 3 up, 3 in
Jan 31 03:36:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:48.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:48 np0005603622 nova_compute[221301]: 2026-01-31 08:36:48.492 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:48.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:48 np0005603622 nova_compute[221301]: 2026-01-31 08:36:48.667 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:49 np0005603622 nova_compute[221301]: 2026-01-31 08:36:49.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:49 np0005603622 nova_compute[221301]: 2026-01-31 08:36:49.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:36:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:50.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:50 np0005603622 nova_compute[221301]: 2026-01-31 08:36:50.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:50.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:51 np0005603622 nova_compute[221301]: 2026-01-31 08:36:51.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:52.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:52.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:53 np0005603622 nova_compute[221301]: 2026-01-31 08:36:53.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:36:53 np0005603622 nova_compute[221301]: 2026-01-31 08:36:53.495 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:36:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:53 np0005603622 nova_compute[221301]: 2026-01-31 08:36:53.754 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:36:53 np0005603622 nova_compute[221301]: 2026-01-31 08:36:53.755 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:36:53 np0005603622 nova_compute[221301]: 2026-01-31 08:36:53.755 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:36:53 np0005603622 nova_compute[221301]: 2026-01-31 08:36:53.756 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 31 03:36:53 np0005603622 nova_compute[221301]: 2026-01-31 08:36:53.757 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:36:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:54.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:36:54 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3111963811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.187 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:54.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.632 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.632 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.637 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.638 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.811 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.812 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3993MB free_disk=20.83078384399414GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.812 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:36:54 np0005603622 nova_compute[221301]: 2026-01-31 08:36:54.813 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.536 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.536 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9707f58b-2c76-40df-b8a8-5df66bab50bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.536 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.536 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.624 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.683 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.684 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.709 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.734 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:36:55 np0005603622 nova_compute[221301]: 2026-01-31 08:36:55.798 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:36:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:36:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:56.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:36:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:36:56 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3744106434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:36:56 np0005603622 nova_compute[221301]: 2026-01-31 08:36:56.201 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:36:56 np0005603622 nova_compute[221301]: 2026-01-31 08:36:56.206 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:36:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:56.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:56 np0005603622 nova_compute[221301]: 2026-01-31 08:36:56.995 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:36:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:36:58.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:58 np0005603622 nova_compute[221301]: 2026-01-31 08:36:58.496 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:36:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:36:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:36:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:36:58.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:36:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:36:58 np0005603622 nova_compute[221301]: 2026-01-31 08:36:58.671 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:36:58 np0005603622 nova_compute[221301]: 2026-01-31 08:36:58.672 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:00.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:37:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:00.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:37:01 np0005603622 nova_compute[221301]: 2026-01-31 08:37:01.673 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:01 np0005603622 nova_compute[221301]: 2026-01-31 08:37:01.673 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:37:01 np0005603622 nova_compute[221301]: 2026-01-31 08:37:01.673 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:37:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:02.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:02 np0005603622 nova_compute[221301]: 2026-01-31 08:37:02.422 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:37:02 np0005603622 nova_compute[221301]: 2026-01-31 08:37:02.423 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:37:02 np0005603622 nova_compute[221301]: 2026-01-31 08:37:02.423 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:37:02 np0005603622 nova_compute[221301]: 2026-01-31 08:37:02.423 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:37:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:02.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:03 np0005603622 nova_compute[221301]: 2026-01-31 08:37:03.498 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:04.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:04.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:05 np0005603622 nova_compute[221301]: 2026-01-31 08:37:05.699 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:37:06 np0005603622 nova_compute[221301]: 2026-01-31 08:37:06.036 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:37:06 np0005603622 nova_compute[221301]: 2026-01-31 08:37:06.037 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:37:06 np0005603622 nova_compute[221301]: 2026-01-31 08:37:06.038 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:06 np0005603622 nova_compute[221301]: 2026-01-31 08:37:06.038 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:06 np0005603622 nova_compute[221301]: 2026-01-31 08:37:06.039 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:06.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:08.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:08 np0005603622 nova_compute[221301]: 2026-01-31 08:37:08.499 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:08.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:10.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:37:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:10.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:37:12 np0005603622 podman[277788]: 2026-01-31 08:37:12.141037184 +0000 UTC m=+0.053568960 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:37:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:37:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:12.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:37:12 np0005603622 podman[277787]: 2026-01-31 08:37:12.162830936 +0000 UTC m=+0.080319474 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:37:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:12.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:13 np0005603622 nova_compute[221301]: 2026-01-31 08:37:13.501 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:14.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:14.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:14 np0005603622 nova_compute[221301]: 2026-01-31 08:37:14.757 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:16.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:16.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:37:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:18.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:37:18 np0005603622 nova_compute[221301]: 2026-01-31 08:37:18.503 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:37:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:18.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:20.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:37:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:20.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:37:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:37:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3837399192' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:37:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:37:22.129 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:37:22 np0005603622 nova_compute[221301]: 2026-01-31 08:37:22.129 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:37:22.131 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:37:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:37:22.131 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:37:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:22.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:22.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:23 np0005603622 nova_compute[221301]: 2026-01-31 08:37:23.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:24.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:24.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:26.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:26.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:28.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:28 np0005603622 nova_compute[221301]: 2026-01-31 08:37:28.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:37:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:28.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:30.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:30.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:37:30.893 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:37:30.893 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:37:30.894 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:32.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:32.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:33 np0005603622 nova_compute[221301]: 2026-01-31 08:37:33.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:37:33 np0005603622 nova_compute[221301]: 2026-01-31 08:37:33.511 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:33 np0005603622 nova_compute[221301]: 2026-01-31 08:37:33.511 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 31 03:37:33 np0005603622 nova_compute[221301]: 2026-01-31 08:37:33.511 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:37:33 np0005603622 nova_compute[221301]: 2026-01-31 08:37:33.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:37:33 np0005603622 nova_compute[221301]: 2026-01-31 08:37:33.514 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:34.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:34.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:37:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:36.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:37:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:36.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:37 np0005603622 nova_compute[221301]: 2026-01-31 08:37:37.015 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:37 np0005603622 nova_compute[221301]: 2026-01-31 08:37:37.016 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:37 np0005603622 nova_compute[221301]: 2026-01-31 08:37:37.449 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:37:37 np0005603622 nova_compute[221301]: 2026-01-31 08:37:37.773 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:37 np0005603622 nova_compute[221301]: 2026-01-31 08:37:37.774 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:37 np0005603622 nova_compute[221301]: 2026-01-31 08:37:37.787 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:37:37 np0005603622 nova_compute[221301]: 2026-01-31 08:37:37.790 221324 INFO nova.compute.claims [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:37:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:38.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:38 np0005603622 nova_compute[221301]: 2026-01-31 08:37:38.516 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:37:38 np0005603622 nova_compute[221301]: 2026-01-31 08:37:38.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:37:38 np0005603622 nova_compute[221301]: 2026-01-31 08:37:38.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 31 03:37:38 np0005603622 nova_compute[221301]: 2026-01-31 08:37:38.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:37:38 np0005603622 nova_compute[221301]: 2026-01-31 08:37:38.548 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:38 np0005603622 nova_compute[221301]: 2026-01-31 08:37:38.549 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:37:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:38.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:40.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:40.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:37:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:37:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:37:41 np0005603622 nova_compute[221301]: 2026-01-31 08:37:41.486 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:37:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2631105235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:37:41 np0005603622 nova_compute[221301]: 2026-01-31 08:37:41.961 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:41 np0005603622 nova_compute[221301]: 2026-01-31 08:37:41.969 221324 DEBUG nova.compute.provider_tree [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:37:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:42.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:42.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:42 np0005603622 nova_compute[221301]: 2026-01-31 08:37:42.944 221324 DEBUG nova.scheduler.client.report [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:37:43 np0005603622 podman[277986]: 2026-01-31 08:37:43.141365258 +0000 UTC m=+0.063688961 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:37:43 np0005603622 podman[277985]: 2026-01-31 08:37:43.181299203 +0000 UTC m=+0.106832911 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 31 03:37:43 np0005603622 nova_compute[221301]: 2026-01-31 08:37:43.550 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:43 np0005603622 nova_compute[221301]: 2026-01-31 08:37:43.552 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:44.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:44.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:44 np0005603622 nova_compute[221301]: 2026-01-31 08:37:44.717 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 6.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:44 np0005603622 nova_compute[221301]: 2026-01-31 08:37:44.718 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:37:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:46.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:46.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:46 np0005603622 nova_compute[221301]: 2026-01-31 08:37:46.824 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:37:46 np0005603622 nova_compute[221301]: 2026-01-31 08:37:46.825 221324 DEBUG nova.network.neutron [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:37:48 np0005603622 nova_compute[221301]: 2026-01-31 08:37:48.164 221324 INFO nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:37:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:48.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:48 np0005603622 nova_compute[221301]: 2026-01-31 08:37:48.553 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:48 np0005603622 nova_compute[221301]: 2026-01-31 08:37:48.555 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:48.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:49 np0005603622 nova_compute[221301]: 2026-01-31 08:37:49.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:49 np0005603622 nova_compute[221301]: 2026-01-31 08:37:49.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:49 np0005603622 nova_compute[221301]: 2026-01-31 08:37:49.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:37:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:37:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:37:49 np0005603622 nova_compute[221301]: 2026-01-31 08:37:49.952 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:37:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:50.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:50.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.402 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.404 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.405 221324 INFO nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Creating image(s)#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.446 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.488 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:37:51 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:37:51 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.528 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.534 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.595 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.597 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.598 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.598 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.635 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:37:51 np0005603622 nova_compute[221301]: 2026-01-31 08:37:51.640 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:52.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:52 np0005603622 nova_compute[221301]: 2026-01-31 08:37:52.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:52 np0005603622 nova_compute[221301]: 2026-01-31 08:37:52.565 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.925s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:52.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:52 np0005603622 nova_compute[221301]: 2026-01-31 08:37:52.672 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] resizing rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:37:52 np0005603622 nova_compute[221301]: 2026-01-31 08:37:52.816 221324 DEBUG nova.objects.instance [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'migration_context' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:37:52 np0005603622 nova_compute[221301]: 2026-01-31 08:37:52.931 221324 DEBUG nova.policy [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1c6e7eff11b435a81429826a682b32f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0bfe11bd9d694684b527666e2c378eed', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:37:53 np0005603622 nova_compute[221301]: 2026-01-31 08:37:53.163 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:37:53 np0005603622 nova_compute[221301]: 2026-01-31 08:37:53.164 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Ensure instance console log exists: /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:37:53 np0005603622 nova_compute[221301]: 2026-01-31 08:37:53.165 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:53 np0005603622 nova_compute[221301]: 2026-01-31 08:37:53.165 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:53 np0005603622 nova_compute[221301]: 2026-01-31 08:37:53.166 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:53 np0005603622 nova_compute[221301]: 2026-01-31 08:37:53.556 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:53 np0005603622 nova_compute[221301]: 2026-01-31 08:37:53.559 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:37:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:37:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:37:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:54.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:37:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:54.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:55 np0005603622 nova_compute[221301]: 2026-01-31 08:37:55.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:37:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:56.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:56.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:56 np0005603622 nova_compute[221301]: 2026-01-31 08:37:56.818 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:37:56 np0005603622 nova_compute[221301]: 2026-01-31 08:37:56.818 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:37:56 np0005603622 nova_compute[221301]: 2026-01-31 08:37:56.819 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:37:56 np0005603622 nova_compute[221301]: 2026-01-31 08:37:56.819 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:37:56 np0005603622 nova_compute[221301]: 2026-01-31 08:37:56.820 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:37:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:37:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/900075497' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:37:57 np0005603622 nova_compute[221301]: 2026-01-31 08:37:57.247 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:37:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:37:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:37:58.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:37:58 np0005603622 nova_compute[221301]: 2026-01-31 08:37:58.561 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:37:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:37:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:37:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:37:58.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:37:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:00.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:00.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.074 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.074 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.078 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.079 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.091792) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682091841, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 1633, "num_deletes": 252, "total_data_size": 3713124, "memory_usage": 3748344, "flush_reason": "Manual Compaction"}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682106481, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 1489167, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 60864, "largest_seqno": 62492, "table_properties": {"data_size": 1483842, "index_size": 2593, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 14077, "raw_average_key_size": 21, "raw_value_size": 1472163, "raw_average_value_size": 2213, "num_data_blocks": 115, "num_entries": 665, "num_filter_entries": 665, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848544, "oldest_key_time": 1769848544, "file_creation_time": 1769848682, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 14755 microseconds, and 5227 cpu microseconds.
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.106541) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 1489167 bytes OK
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.106568) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.108434) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.108457) EVENT_LOG_v1 {"time_micros": 1769848682108450, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.108476) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 3705665, prev total WAL file size 3705665, number of live WAL files 2.
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.109928) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303033' seq:72057594037927935, type:22 .. '6D6772737461740032323534' seq:0, type:0; will stop at (end)
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(1454KB)], [120(12MB)]
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682110027, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 14274775, "oldest_snapshot_seqno": -1}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 8670 keys, 11381618 bytes, temperature: kUnknown
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682194946, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 11381618, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11325795, "index_size": 33055, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21701, "raw_key_size": 224601, "raw_average_key_size": 25, "raw_value_size": 11174051, "raw_average_value_size": 1288, "num_data_blocks": 1291, "num_entries": 8670, "num_filter_entries": 8670, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848682, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.195196) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 11381618 bytes
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.197697) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.9 rd, 133.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 12.2 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(17.2) write-amplify(7.6) OK, records in: 9128, records dropped: 458 output_compression: NoCompression
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.197713) EVENT_LOG_v1 {"time_micros": 1769848682197705, "job": 76, "event": "compaction_finished", "compaction_time_micros": 85005, "compaction_time_cpu_micros": 32760, "output_level": 6, "num_output_files": 1, "total_output_size": 11381618, "num_input_records": 9128, "num_output_records": 8670, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682197924, "job": 76, "event": "table_file_deletion", "file_number": 122}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848682199093, "job": 76, "event": "table_file_deletion", "file_number": 120}
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.109825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.199131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.199135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.199136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.199137) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:02.199139) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.231 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.233 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3934MB free_disk=20.84152603149414GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.233 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:02 np0005603622 nova_compute[221301]: 2026-01-31 08:38:02.233 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:02.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:02.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:03 np0005603622 nova_compute[221301]: 2026-01-31 08:38:03.563 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:38:03 np0005603622 nova_compute[221301]: 2026-01-31 08:38:03.565 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:03 np0005603622 nova_compute[221301]: 2026-01-31 08:38:03.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 31 03:38:03 np0005603622 nova_compute[221301]: 2026-01-31 08:38:03.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:38:03 np0005603622 nova_compute[221301]: 2026-01-31 08:38:03.567 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:38:03 np0005603622 nova_compute[221301]: 2026-01-31 08:38:03.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:04.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:04.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:05 np0005603622 nova_compute[221301]: 2026-01-31 08:38:05.545 221324 DEBUG nova.network.neutron [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Successfully created port: b62616fc-dd91-4cc2-b323-70fffebab4fb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:38:05 np0005603622 nova_compute[221301]: 2026-01-31 08:38:05.715 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:38:05 np0005603622 nova_compute[221301]: 2026-01-31 08:38:05.715 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9707f58b-2c76-40df-b8a8-5df66bab50bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:38:05 np0005603622 nova_compute[221301]: 2026-01-31 08:38:05.716 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 815ef28e-2297-49ba-88a1-23f722c3fa0a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:38:05 np0005603622 nova_compute[221301]: 2026-01-31 08:38:05.716 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:38:05 np0005603622 nova_compute[221301]: 2026-01-31 08:38:05.717 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:38:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:06.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:06.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:08.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:08 np0005603622 nova_compute[221301]: 2026-01-31 08:38:08.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:08.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:09 np0005603622 nova_compute[221301]: 2026-01-31 08:38:09.363 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:38:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1446590847' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:38:09 np0005603622 nova_compute[221301]: 2026-01-31 08:38:09.817 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:09 np0005603622 nova_compute[221301]: 2026-01-31 08:38:09.825 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:38:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:10.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:10 np0005603622 nova_compute[221301]: 2026-01-31 08:38:10.252 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:38:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:10.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:10 np0005603622 nova_compute[221301]: 2026-01-31 08:38:10.896 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:38:10 np0005603622 nova_compute[221301]: 2026-01-31 08:38:10.897 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 8.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:11 np0005603622 nova_compute[221301]: 2026-01-31 08:38:11.898 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:11 np0005603622 nova_compute[221301]: 2026-01-31 08:38:11.898 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:38:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:38:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:12.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:38:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:12.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:12 np0005603622 nova_compute[221301]: 2026-01-31 08:38:12.875 221324 DEBUG nova.network.neutron [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Successfully updated port: b62616fc-dd91-4cc2-b323-70fffebab4fb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.569 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.570 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.571 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.571 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.571 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.572 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.803 221324 DEBUG nova.compute.manager [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.803 221324 DEBUG nova.compute.manager [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing instance network info cache due to event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.803 221324 DEBUG oslo_concurrency.lockutils [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.804 221324 DEBUG oslo_concurrency.lockutils [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.804 221324 DEBUG nova.network.neutron [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:38:13 np0005603622 nova_compute[221301]: 2026-01-31 08:38:13.882 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:38:14 np0005603622 nova_compute[221301]: 2026-01-31 08:38:14.044 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:38:14 np0005603622 nova_compute[221301]: 2026-01-31 08:38:14.044 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:38:14 np0005603622 nova_compute[221301]: 2026-01-31 08:38:14.045 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:38:14 np0005603622 podman[278293]: 2026-01-31 08:38:14.133270707 +0000 UTC m=+0.059518449 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 03:38:14 np0005603622 podman[278292]: 2026-01-31 08:38:14.167864201 +0000 UTC m=+0.092120420 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:38:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:14.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:14.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:15 np0005603622 nova_compute[221301]: 2026-01-31 08:38:15.399 221324 DEBUG nova.network.neutron [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:38:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:38:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:38:16 np0005603622 nova_compute[221301]: 2026-01-31 08:38:16.573 221324 DEBUG nova.network.neutron [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:16.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:16 np0005603622 nova_compute[221301]: 2026-01-31 08:38:16.701 221324 DEBUG oslo_concurrency.lockutils [req-c8edde9e-45c9-4fbe-b708-1b7551852b7f req-f48d3196-0e12-4a83-a62e-a7bfa59b69f4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:16 np0005603622 nova_compute[221301]: 2026-01-31 08:38:16.701 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:38:16 np0005603622 nova_compute[221301]: 2026-01-31 08:38:16.702 221324 DEBUG nova.network.neutron [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:38:17 np0005603622 nova_compute[221301]: 2026-01-31 08:38:17.160 221324 DEBUG nova.network.neutron [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:38:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:18.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:18 np0005603622 nova_compute[221301]: 2026-01-31 08:38:18.573 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:18 np0005603622 nova_compute[221301]: 2026-01-31 08:38:18.627 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:18.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:18 np0005603622 nova_compute[221301]: 2026-01-31 08:38:18.669 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:18 np0005603622 nova_compute[221301]: 2026-01-31 08:38:18.669 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:38:18 np0005603622 nova_compute[221301]: 2026-01-31 08:38:18.670 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:18 np0005603622 nova_compute[221301]: 2026-01-31 08:38:18.670 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:18 np0005603622 nova_compute[221301]: 2026-01-31 08:38:18.671 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:20.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:20.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:22.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:22.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:23 np0005603622 nova_compute[221301]: 2026-01-31 08:38:23.574 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:23 np0005603622 nova_compute[221301]: 2026-01-31 08:38:23.576 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:24.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:24.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.064 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:25.065 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:38:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:25.067 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.411 221324 DEBUG nova.network.neutron [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.710 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.710 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance network_info: |[{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.715 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Start _get_guest_xml network_info=[{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.721 221324 WARNING nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.731 221324 DEBUG nova.virt.libvirt.host [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.732 221324 DEBUG nova.virt.libvirt.host [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.735 221324 DEBUG nova.virt.libvirt.host [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.736 221324 DEBUG nova.virt.libvirt.host [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.738 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.739 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.740 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.740 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.741 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.741 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.742 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.742 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.743 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.743 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.744 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.744 221324 DEBUG nova.virt.hardware [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:38:25 np0005603622 nova_compute[221301]: 2026-01-31 08:38:25.749 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:38:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4171097878' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.203 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.226 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.230 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:26.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:26.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:38:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/244803189' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.871 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.640s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.874 221324 DEBUG nova.virt.libvirt.vif [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:37:50Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.874 221324 DEBUG nova.network.os_vif_util [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.876 221324 DEBUG nova.network.os_vif_util [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:38:26 np0005603622 nova_compute[221301]: 2026-01-31 08:38:26.878 221324 DEBUG nova.objects.instance [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'pci_devices' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.002 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <uuid>815ef28e-2297-49ba-88a1-23f722c3fa0a</uuid>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <name>instance-00000092</name>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2115362642</nova:name>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:38:25</nova:creationTime>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:user uuid="f1c6e7eff11b435a81429826a682b32f">tempest-TestNetworkAdvancedServerOps-840410497-project-member</nova:user>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:project uuid="0bfe11bd9d694684b527666e2c378eed">tempest-TestNetworkAdvancedServerOps-840410497</nova:project>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <nova:port uuid="b62616fc-dd91-4cc2-b323-70fffebab4fb">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <entry name="serial">815ef28e-2297-49ba-88a1-23f722c3fa0a</entry>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <entry name="uuid">815ef28e-2297-49ba-88a1-23f722c3fa0a</entry>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/815ef28e-2297-49ba-88a1-23f722c3fa0a_disk">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:44:41:0a"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <target dev="tapb62616fc-dd"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/console.log" append="off"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:38:27 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:38:27 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:38:27 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:38:27 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.003 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Preparing to wait for external event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.004 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.004 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.004 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.005 221324 DEBUG nova.virt.libvirt.vif [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:37:50Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.005 221324 DEBUG nova.network.os_vif_util [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.006 221324 DEBUG nova.network.os_vif_util [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.006 221324 DEBUG os_vif [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.007 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.007 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.008 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.013 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.014 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb62616fc-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.015 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb62616fc-dd, col_values=(('external_ids', {'iface-id': 'b62616fc-dd91-4cc2-b323-70fffebab4fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:41:0a', 'vm-uuid': '815ef28e-2297-49ba-88a1-23f722c3fa0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:27 np0005603622 NetworkManager[49080]: <info>  [1769848707.0208] manager: (tapb62616fc-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.022 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.027 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.028 221324 INFO os_vif [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd')#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.619 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.621 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.621 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No VIF found with MAC fa:16:3e:44:41:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.622 221324 INFO nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Using config drive#033[00m
Jan 31 03:38:27 np0005603622 nova_compute[221301]: 2026-01-31 08:38:27.654 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:28.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:28 np0005603622 nova_compute[221301]: 2026-01-31 08:38:28.576 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:28.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:29 np0005603622 nova_compute[221301]: 2026-01-31 08:38:29.515 221324 INFO nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Creating config drive at /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/disk.config#033[00m
Jan 31 03:38:29 np0005603622 nova_compute[221301]: 2026-01-31 08:38:29.520 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplfy13e3g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:29 np0005603622 nova_compute[221301]: 2026-01-31 08:38:29.663 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplfy13e3g" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:29 np0005603622 nova_compute[221301]: 2026-01-31 08:38:29.697 221324 DEBUG nova.storage.rbd_utils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:38:29 np0005603622 nova_compute[221301]: 2026-01-31 08:38:29.702 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/disk.config 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:30.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:30.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:30.894 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:30.894 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:30.895 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.019 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.065 221324 DEBUG oslo_concurrency.processutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/disk.config 815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.363s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.066 221324 INFO nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Deleting local config drive /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/disk.config because it was imported into RBD.#033[00m
Jan 31 03:38:32 np0005603622 kernel: tapb62616fc-dd: entered promiscuous mode
Jan 31 03:38:32 np0005603622 NetworkManager[49080]: <info>  [1769848712.1307] manager: (tapb62616fc-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/300)
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.131 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:32Z|00667|binding|INFO|Claiming lport b62616fc-dd91-4cc2-b323-70fffebab4fb for this chassis.
Jan 31 03:38:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:32Z|00668|binding|INFO|b62616fc-dd91-4cc2-b323-70fffebab4fb: Claiming fa:16:3e:44:41:0a 10.100.0.3
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.139 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.209 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:32Z|00669|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb ovn-installed in OVS
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.212 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 systemd-machined[190406]: New machine qemu-69-instance-00000092.
Jan 31 03:38:32 np0005603622 systemd-udevd[278472]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:38:32 np0005603622 systemd[1]: Started Virtual Machine qemu-69-instance-00000092.
Jan 31 03:38:32 np0005603622 NetworkManager[49080]: <info>  [1769848712.2295] device (tapb62616fc-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:38:32 np0005603622 NetworkManager[49080]: <info>  [1769848712.2314] device (tapb62616fc-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:38:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:32.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:32Z|00670|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb up in Southbound
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.556 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:41:0a 10.100.0.3'], port_security=['fa:16:3e:44:41:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '815ef28e-2297-49ba-88a1-23f722c3fa0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d238e24-9954-4b32-b589-6db6c8760a3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e09d3566-f99d-4e7a-854e-68c93732c8e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09414a4b-4852-4431-b971-0c29958bdb7a, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b62616fc-dd91-4cc2-b323-70fffebab4fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.558 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b62616fc-dd91-4cc2-b323-70fffebab4fb in datapath 3d238e24-9954-4b32-b589-6db6c8760a3f bound to our chassis#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.560 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d238e24-9954-4b32-b589-6db6c8760a3f#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.574 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1e118b64-6a4e-4315-b05a-556e14617cfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.575 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d238e24-91 in ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.577 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d238e24-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.577 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d633c9fd-d08b-4ff6-812c-c66dc73883ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.579 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[103d2b00-4d0c-47d2-a948-bd50559c6c9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.591 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[c9675b77-fdd2-4cfa-b2e0-10ac86fc288d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.604 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e198c267-ffc3-4c34-8d38-be8c89e7960b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.620 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848712.6197326, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.620 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Started (Lifecycle Event)#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.637 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[560b810f-44f2-40bd-9348-3fc2b0ee5b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 NetworkManager[49080]: <info>  [1769848712.6451] manager: (tap3d238e24-90): new Veth device (/org/freedesktop/NetworkManager/Devices/301)
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.645 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[75b31134-c010-45d5-b418-622004c03262]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.665 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.671 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848712.6199408, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.671 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:38:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:32.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.675 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[67442b24-3b56-4a5a-9cd4-da27e9716a5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.679 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f851e0fd-166f-4912-8c98-af0346db5173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 NetworkManager[49080]: <info>  [1769848712.6984] device (tap3d238e24-90): carrier: link connected
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.702 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[94eb1144-079b-441b-8958-15b4965bca39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.717 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[801eaab5-1e5c-49f1-8966-d0815380b9aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d238e24-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:78:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796403, 'reachable_time': 43121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278550, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.733 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[96899480-bdfd-4ea2-8b99-4ad6b394b952]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:782a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 796403, 'tstamp': 796403}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278551, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.747 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.750 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[406ed9b3-7d96-4220-943f-c7a5ad383d49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d238e24-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:78:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796403, 'reachable_time': 43121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278552, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.751 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.779 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fc469043-e07c-46bd-93ad-27c5df796acc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.816 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[81eb44c4-ec17-436b-a4bd-1cc2d9503e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.817 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d238e24-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.817 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.818 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d238e24-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.819 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 NetworkManager[49080]: <info>  [1769848712.8200] manager: (tap3d238e24-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 31 03:38:32 np0005603622 kernel: tap3d238e24-90: entered promiscuous mode
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.823 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d238e24-90, col_values=(('external_ids', {'iface-id': '7571c123-c1d9-4ad9-a9c7-718eca889c7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.824 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:32Z|00671|binding|INFO|Releasing lport 7571c123-c1d9-4ad9-a9c7-718eca889c7b from this chassis (sb_readonly=0)
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.824 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.827 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.828 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.829 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a362271c-848f-4274-bacb-96e32d04e1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.830 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-3d238e24-9954-4b32-b589-6db6c8760a3f
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 3d238e24-9954-4b32-b589-6db6c8760a3f
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:38:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:32.831 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'env', 'PROCESS_TAG=haproxy-3d238e24-9954-4b32-b589-6db6c8760a3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d238e24-9954-4b32-b589-6db6c8760a3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:38:32 np0005603622 nova_compute[221301]: 2026-01-31 08:38:32.855 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:38:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:38:33.069 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:38:33 np0005603622 podman[278584]: 2026-01-31 08:38:33.152718417 +0000 UTC m=+0.027284809 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.304 221324 DEBUG nova.compute.manager [req-964da147-ceaa-4d60-8a02-ef133eb3378a req-379c0088-4c4a-44b1-bb74-31e106b67957 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.304 221324 DEBUG oslo_concurrency.lockutils [req-964da147-ceaa-4d60-8a02-ef133eb3378a req-379c0088-4c4a-44b1-bb74-31e106b67957 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.304 221324 DEBUG oslo_concurrency.lockutils [req-964da147-ceaa-4d60-8a02-ef133eb3378a req-379c0088-4c4a-44b1-bb74-31e106b67957 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.305 221324 DEBUG oslo_concurrency.lockutils [req-964da147-ceaa-4d60-8a02-ef133eb3378a req-379c0088-4c4a-44b1-bb74-31e106b67957 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.305 221324 DEBUG nova.compute.manager [req-964da147-ceaa-4d60-8a02-ef133eb3378a req-379c0088-4c4a-44b1-bb74-31e106b67957 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Processing event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.306 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.309 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848713.309643, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.310 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.312 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.314 221324 INFO nova.virt.libvirt.driver [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance spawned successfully.#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.314 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:38:33 np0005603622 podman[278584]: 2026-01-31 08:38:33.34283624 +0000 UTC m=+0.217402602 container create 1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.357 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.361 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:38:33 np0005603622 systemd[1]: Started libpod-conmon-1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee.scope.
Jan 31 03:38:33 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.424 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.425 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.426 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.427 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.427 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.428 221324 DEBUG nova.virt.libvirt.driver [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:38:33 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a179d386be7f1d4509858b0b343d3f561d17ae212d40441075e5e9e04f2e44e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.433 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:38:33 np0005603622 podman[278584]: 2026-01-31 08:38:33.443816915 +0000 UTC m=+0.318383307 container init 1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:38:33 np0005603622 podman[278584]: 2026-01-31 08:38:33.451949432 +0000 UTC m=+0.326515794 container start 1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 03:38:33 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[278599]: [NOTICE]   (278603) : New worker (278605) forked
Jan 31 03:38:33 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[278599]: [NOTICE]   (278603) : Loading success.
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.896 221324 INFO nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Took 42.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:38:33 np0005603622 nova_compute[221301]: 2026-01-31 08:38:33.897 221324 DEBUG nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:38:34 np0005603622 nova_compute[221301]: 2026-01-31 08:38:34.041 221324 INFO nova.compute.manager [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Took 56.33 seconds to build instance.#033[00m
Jan 31 03:38:34 np0005603622 nova_compute[221301]: 2026-01-31 08:38:34.128 221324 DEBUG oslo_concurrency.lockutils [None req-e0218ee6-e0dd-4fdc-85b4-da631fb90f3c f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 57.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.003000080s ======
Jan 31 03:38:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:34.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 31 03:38:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:34.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:36.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:36 np0005603622 nova_compute[221301]: 2026-01-31 08:38:36.304 221324 DEBUG nova.compute.manager [req-7f69a63e-88cc-47a3-8d48-d4f13b098a39 req-edd7742e-0ab9-4ab6-9cda-83a4c5d38f1f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:36 np0005603622 nova_compute[221301]: 2026-01-31 08:38:36.304 221324 DEBUG oslo_concurrency.lockutils [req-7f69a63e-88cc-47a3-8d48-d4f13b098a39 req-edd7742e-0ab9-4ab6-9cda-83a4c5d38f1f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:36 np0005603622 nova_compute[221301]: 2026-01-31 08:38:36.305 221324 DEBUG oslo_concurrency.lockutils [req-7f69a63e-88cc-47a3-8d48-d4f13b098a39 req-edd7742e-0ab9-4ab6-9cda-83a4c5d38f1f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:36 np0005603622 nova_compute[221301]: 2026-01-31 08:38:36.305 221324 DEBUG oslo_concurrency.lockutils [req-7f69a63e-88cc-47a3-8d48-d4f13b098a39 req-edd7742e-0ab9-4ab6-9cda-83a4c5d38f1f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:36 np0005603622 nova_compute[221301]: 2026-01-31 08:38:36.305 221324 DEBUG nova.compute.manager [req-7f69a63e-88cc-47a3-8d48-d4f13b098a39 req-edd7742e-0ab9-4ab6-9cda-83a4c5d38f1f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:38:36 np0005603622 nova_compute[221301]: 2026-01-31 08:38:36.305 221324 WARNING nova.compute.manager [req-7f69a63e-88cc-47a3-8d48-d4f13b098a39 req-edd7742e-0ab9-4ab6-9cda-83a4c5d38f1f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state active and task_state None.#033[00m
Jan 31 03:38:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:36.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:37 np0005603622 nova_compute[221301]: 2026-01-31 08:38:37.021 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:38.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:38 np0005603622 nova_compute[221301]: 2026-01-31 08:38:38.580 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:38.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:40.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:38:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:40.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:38:41 np0005603622 nova_compute[221301]: 2026-01-31 08:38:41.504 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:41 np0005603622 NetworkManager[49080]: <info>  [1769848721.5057] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Jan 31 03:38:41 np0005603622 NetworkManager[49080]: <info>  [1769848721.5072] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 31 03:38:41 np0005603622 nova_compute[221301]: 2026-01-31 08:38:41.535 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:41Z|00672|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:38:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:41Z|00673|binding|INFO|Releasing lport 7571c123-c1d9-4ad9-a9c7-718eca889c7b from this chassis (sb_readonly=0)
Jan 31 03:38:41 np0005603622 nova_compute[221301]: 2026-01-31 08:38:41.556 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:42 np0005603622 nova_compute[221301]: 2026-01-31 08:38:42.023 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:42.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:42.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:43 np0005603622 nova_compute[221301]: 2026-01-31 08:38:43.617 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:44.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:44.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:45 np0005603622 podman[278616]: 2026-01-31 08:38:45.130833546 +0000 UTC m=+0.052174543 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:38:45 np0005603622 podman[278615]: 2026-01-31 08:38:45.155143535 +0000 UTC m=+0.078362872 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:38:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:38:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:46.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:38:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:46.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:46 np0005603622 nova_compute[221301]: 2026-01-31 08:38:46.924 221324 DEBUG nova.compute.manager [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:38:46 np0005603622 nova_compute[221301]: 2026-01-31 08:38:46.924 221324 DEBUG nova.compute.manager [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing instance network info cache due to event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:38:46 np0005603622 nova_compute[221301]: 2026-01-31 08:38:46.924 221324 DEBUG oslo_concurrency.lockutils [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:38:46 np0005603622 nova_compute[221301]: 2026-01-31 08:38:46.924 221324 DEBUG oslo_concurrency.lockutils [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:38:46 np0005603622 nova_compute[221301]: 2026-01-31 08:38:46.925 221324 DEBUG nova.network.neutron [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:38:47 np0005603622 nova_compute[221301]: 2026-01-31 08:38:47.027 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:48.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:48 np0005603622 nova_compute[221301]: 2026-01-31 08:38:48.619 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:38:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:48.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:38:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:48Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:41:0a 10.100.0.3
Jan 31 03:38:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:38:48Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:41:0a 10.100.0.3
Jan 31 03:38:49 np0005603622 podman[278831]: 2026-01-31 08:38:49.954865022 +0000 UTC m=+0.278066341 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 03:38:50 np0005603622 podman[278831]: 2026-01-31 08:38:50.057758168 +0000 UTC m=+0.380959467 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 31 03:38:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:50.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:50 np0005603622 nova_compute[221301]: 2026-01-31 08:38:50.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:50 np0005603622 nova_compute[221301]: 2026-01-31 08:38:50.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:38:50 np0005603622 nova_compute[221301]: 2026-01-31 08:38:50.679 221324 DEBUG nova.network.neutron [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updated VIF entry in instance network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:38:50 np0005603622 nova_compute[221301]: 2026-01-31 08:38:50.680 221324 DEBUG nova.network.neutron [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:38:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:50.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:50 np0005603622 nova_compute[221301]: 2026-01-31 08:38:50.745 221324 DEBUG oslo_concurrency.lockutils [req-6c7eaa6e-0aab-4aaf-af85-ba506e963000 req-3c1c0c78-efdb-4ff2-aee5-032128965624 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:38:51 np0005603622 nova_compute[221301]: 2026-01-31 08:38:51.426 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:38:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:38:52 np0005603622 nova_compute[221301]: 2026-01-31 08:38:52.031 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:52.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:52 np0005603622 nova_compute[221301]: 2026-01-31 08:38:52.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:52.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:53 np0005603622 nova_compute[221301]: 2026-01-31 08:38:53.638 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:54.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:54 np0005603622 nova_compute[221301]: 2026-01-31 08:38:54.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:54.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:54 np0005603622 nova_compute[221301]: 2026-01-31 08:38:54.876 221324 INFO nova.compute.manager [None req-0cde8181-9473-4b04-ad2a-35d8b55f2786 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Get console output#033[00m
Jan 31 03:38:54 np0005603622 nova_compute[221301]: 2026-01-31 08:38:54.882 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.813961) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735814008, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 803, "num_deletes": 250, "total_data_size": 1492804, "memory_usage": 1510304, "flush_reason": "Manual Compaction"}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735851108, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 985411, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62497, "largest_seqno": 63295, "table_properties": {"data_size": 981599, "index_size": 1592, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 7780, "raw_average_key_size": 17, "raw_value_size": 973989, "raw_average_value_size": 2178, "num_data_blocks": 71, "num_entries": 447, "num_filter_entries": 447, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848682, "oldest_key_time": 1769848682, "file_creation_time": 1769848735, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 37261 microseconds, and 4655 cpu microseconds.
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.851214) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 985411 bytes OK
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.851250) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.853469) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.853490) EVENT_LOG_v1 {"time_micros": 1769848735853483, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.853513) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 1488625, prev total WAL file size 1488625, number of live WAL files 2.
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.854060) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323530' seq:72057594037927935, type:22 .. '6B7600353031' seq:0, type:0; will stop at (end)
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(962KB)], [123(10MB)]
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735854148, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 12367029, "oldest_snapshot_seqno": -1}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 8604 keys, 11295229 bytes, temperature: kUnknown
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735946896, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 11295229, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11239870, "index_size": 32719, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21573, "raw_key_size": 224958, "raw_average_key_size": 26, "raw_value_size": 11089080, "raw_average_value_size": 1288, "num_data_blocks": 1260, "num_entries": 8604, "num_filter_entries": 8604, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848735, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.947214) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 11295229 bytes
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.948659) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.2 rd, 121.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 10.9 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(24.0) write-amplify(11.5) OK, records in: 9117, records dropped: 513 output_compression: NoCompression
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.948689) EVENT_LOG_v1 {"time_micros": 1769848735948676, "job": 78, "event": "compaction_finished", "compaction_time_micros": 92854, "compaction_time_cpu_micros": 31672, "output_level": 6, "num_output_files": 1, "total_output_size": 11295229, "num_input_records": 9117, "num_output_records": 8604, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735948993, "job": 78, "event": "table_file_deletion", "file_number": 125}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848735950495, "job": 78, "event": "table_file_deletion", "file_number": 123}
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.853969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.950598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.950606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.950609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.950611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:55 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:38:55.950614) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:38:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:38:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:56.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:38:56 np0005603622 nova_compute[221301]: 2026-01-31 08:38:56.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:56 np0005603622 nova_compute[221301]: 2026-01-31 08:38:56.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:38:56 np0005603622 nova_compute[221301]: 2026-01-31 08:38:56.609 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:56 np0005603622 nova_compute[221301]: 2026-01-31 08:38:56.610 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:56 np0005603622 nova_compute[221301]: 2026-01-31 08:38:56.610 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:38:56 np0005603622 nova_compute[221301]: 2026-01-31 08:38:56.610 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:38:56 np0005603622 nova_compute[221301]: 2026-01-31 08:38:56.611 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:56.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:38:57 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1652218873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.025 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.066 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.318 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.319 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.323 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.323 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.326 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.326 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.543 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.545 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3666MB free_disk=20.739837646484375GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.545 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.546 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.877 221324 INFO nova.compute.manager [None req-4031566d-fdf9-4e78-8794-cf4ae75e9111 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Get console output#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.883 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.908 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.908 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9707f58b-2c76-40df-b8a8-5df66bab50bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.908 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 815ef28e-2297-49ba-88a1-23f722c3fa0a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.909 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:38:57 np0005603622 nova_compute[221301]: 2026-01-31 08:38:57.909 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:38:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:38:58 np0005603622 nova_compute[221301]: 2026-01-31 08:38:58.090 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:38:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:38:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:38:58.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:38:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:38:58 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1113208861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:38:58 np0005603622 nova_compute[221301]: 2026-01-31 08:38:58.504 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:38:58 np0005603622 nova_compute[221301]: 2026-01-31 08:38:58.510 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:38:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:38:58 np0005603622 nova_compute[221301]: 2026-01-31 08:38:58.691 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:38:58 np0005603622 nova_compute[221301]: 2026-01-31 08:38:58.703 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:38:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:38:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:38:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:38:58.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:38:59 np0005603622 nova_compute[221301]: 2026-01-31 08:38:59.301 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:38:59 np0005603622 nova_compute[221301]: 2026-01-31 08:38:59.301 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:00 np0005603622 nova_compute[221301]: 2026-01-31 08:39:00.303 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:00 np0005603622 nova_compute[221301]: 2026-01-31 08:39:00.303 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:39:00 np0005603622 nova_compute[221301]: 2026-01-31 08:39:00.304 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:39:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:00.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:39:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:00.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:39:01 np0005603622 nova_compute[221301]: 2026-01-31 08:39:01.165 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:01 np0005603622 nova_compute[221301]: 2026-01-31 08:39:01.165 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:01 np0005603622 nova_compute[221301]: 2026-01-31 08:39:01.165 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:39:01 np0005603622 nova_compute[221301]: 2026-01-31 08:39:01.165 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:02 np0005603622 nova_compute[221301]: 2026-01-31 08:39:02.069 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:02.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:02.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:03 np0005603622 nova_compute[221301]: 2026-01-31 08:39:03.693 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:04 np0005603622 nova_compute[221301]: 2026-01-31 08:39:04.215 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [{"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:04 np0005603622 nova_compute[221301]: 2026-01-31 08:39:04.262 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-9aac65c2-63f8-4383-80ed-6600852c6fb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:04 np0005603622 nova_compute[221301]: 2026-01-31 08:39:04.262 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:39:04 np0005603622 nova_compute[221301]: 2026-01-31 08:39:04.262 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:04 np0005603622 nova_compute[221301]: 2026-01-31 08:39:04.263 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:04.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:04.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:05 np0005603622 nova_compute[221301]: 2026-01-31 08:39:05.000 221324 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:05 np0005603622 nova_compute[221301]: 2026-01-31 08:39:05.001 221324 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:05 np0005603622 nova_compute[221301]: 2026-01-31 08:39:05.001 221324 DEBUG nova.network.neutron [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:39:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:39:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:06.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:39:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:06.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:07 np0005603622 nova_compute[221301]: 2026-01-31 08:39:07.073 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:07 np0005603622 nova_compute[221301]: 2026-01-31 08:39:07.528 221324 DEBUG nova.network.neutron [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e340 e340: 3 total, 3 up, 3 in
Jan 31 03:39:07 np0005603622 nova_compute[221301]: 2026-01-31 08:39:07.792 221324 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:08.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:08.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:08 np0005603622 nova_compute[221301]: 2026-01-31 08:39:08.733 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.126 221324 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.127 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Creating file /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/d31fd4b064bc419a800ed66d21761f86.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.127 221324 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/d31fd4b064bc419a800ed66d21761f86.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:10.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.544 221324 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/d31fd4b064bc419a800ed66d21761f86.tmp" returned: 1 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.545 221324 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/d31fd4b064bc419a800ed66d21761f86.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.545 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Creating directory /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.545 221324 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:10.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.768 221324 DEBUG oslo_concurrency.processutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:10 np0005603622 nova_compute[221301]: 2026-01-31 08:39:10.773 221324 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:39:12 np0005603622 nova_compute[221301]: 2026-01-31 08:39:12.077 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e341 e341: 3 total, 3 up, 3 in
Jan 31 03:39:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:12.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:39:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:12.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.394417) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753394474, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 491, "num_deletes": 251, "total_data_size": 622385, "memory_usage": 631848, "flush_reason": "Manual Compaction"}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e342 e342: 3 total, 3 up, 3 in
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753416340, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 410188, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 63300, "largest_seqno": 63786, "table_properties": {"data_size": 407517, "index_size": 706, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6695, "raw_average_key_size": 19, "raw_value_size": 402015, "raw_average_value_size": 1158, "num_data_blocks": 30, "num_entries": 347, "num_filter_entries": 347, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848736, "oldest_key_time": 1769848736, "file_creation_time": 1769848753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 22009 microseconds, and 2145 cpu microseconds.
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.416421) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 410188 bytes OK
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.416447) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.458993) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.459090) EVENT_LOG_v1 {"time_micros": 1769848753459054, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.459125) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 619405, prev total WAL file size 619446, number of live WAL files 2.
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.459894) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(400KB)], [126(10MB)]
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753459961, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 11705417, "oldest_snapshot_seqno": -1}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 8434 keys, 9792809 bytes, temperature: kUnknown
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753628719, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 9792809, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9739879, "index_size": 30737, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21125, "raw_key_size": 222158, "raw_average_key_size": 26, "raw_value_size": 9593239, "raw_average_value_size": 1137, "num_data_blocks": 1170, "num_entries": 8434, "num_filter_entries": 8434, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.629112) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 9792809 bytes
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.663071) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.3 rd, 58.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.8 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(52.4) write-amplify(23.9) OK, records in: 8951, records dropped: 517 output_compression: NoCompression
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.663121) EVENT_LOG_v1 {"time_micros": 1769848753663102, "job": 80, "event": "compaction_finished", "compaction_time_micros": 168888, "compaction_time_cpu_micros": 37675, "output_level": 6, "num_output_files": 1, "total_output_size": 9792809, "num_input_records": 8951, "num_output_records": 8434, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753663468, "job": 80, "event": "table_file_deletion", "file_number": 128}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848753664974, "job": 80, "event": "table_file_deletion", "file_number": 126}
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.459737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.665067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.665082) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.665085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.665090) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:39:13.665093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:39:13 np0005603622 kernel: tapb62616fc-dd (unregistering): left promiscuous mode
Jan 31 03:39:13 np0005603622 NetworkManager[49080]: <info>  [1769848753.6935] device (tapb62616fc-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:39:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:13Z|00674|binding|INFO|Releasing lport b62616fc-dd91-4cc2-b323-70fffebab4fb from this chassis (sb_readonly=0)
Jan 31 03:39:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:13Z|00675|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb down in Southbound
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:13Z|00676|binding|INFO|Removing iface tapb62616fc-dd ovn-installed in OVS
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.712 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.720 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.735 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:13 np0005603622 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000092.scope: Deactivated successfully.
Jan 31 03:39:13 np0005603622 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000092.scope: Consumed 14.481s CPU time.
Jan 31 03:39:13 np0005603622 systemd-machined[190406]: Machine qemu-69-instance-00000092 terminated.
Jan 31 03:39:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:13.807 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:41:0a 10.100.0.3'], port_security=['fa:16:3e:44:41:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '815ef28e-2297-49ba-88a1-23f722c3fa0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d238e24-9954-4b32-b589-6db6c8760a3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e09d3566-f99d-4e7a-854e-68c93732c8e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09414a4b-4852-4431-b971-0c29958bdb7a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b62616fc-dd91-4cc2-b323-70fffebab4fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:39:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:13.810 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b62616fc-dd91-4cc2-b323-70fffebab4fb in datapath 3d238e24-9954-4b32-b589-6db6c8760a3f unbound from our chassis#033[00m
Jan 31 03:39:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:13.812 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d238e24-9954-4b32-b589-6db6c8760a3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:39:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:13.814 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a05bab79-7822-449c-adee-23f5be1f6e7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:13.816 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f namespace which is not needed anymore#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.936 221324 INFO nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.947 221324 INFO nova.virt.libvirt.driver [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance destroyed successfully.#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.949 221324 DEBUG nova.virt.libvirt.vif [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:38:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:39:03Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1478468357", "vif_mac": "fa:16:3e:44:41:0a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.950 221324 DEBUG nova.network.os_vif_util [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1478468357", "vif_mac": "fa:16:3e:44:41:0a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.952 221324 DEBUG nova.network.os_vif_util [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.953 221324 DEBUG os_vif [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.957 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.958 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb62616fc-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.961 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.964 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.964 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.967 221324 INFO os_vif [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd')#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.971 221324 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:13 np0005603622 nova_compute[221301]: 2026-01-31 08:39:13.971 221324 DEBUG nova.virt.libvirt.driver [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:39:14 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[278599]: [NOTICE]   (278603) : haproxy version is 2.8.14-c23fe91
Jan 31 03:39:14 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[278599]: [NOTICE]   (278603) : path to executable is /usr/sbin/haproxy
Jan 31 03:39:14 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[278599]: [WARNING]  (278603) : Exiting Master process...
Jan 31 03:39:14 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[278599]: [ALERT]    (278603) : Current worker (278605) exited with code 143 (Terminated)
Jan 31 03:39:14 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[278599]: [WARNING]  (278603) : All workers exited. Exiting... (0)
Jan 31 03:39:14 np0005603622 systemd[1]: libpod-1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee.scope: Deactivated successfully.
Jan 31 03:39:14 np0005603622 podman[279209]: 2026-01-31 08:39:14.139711847 +0000 UTC m=+0.220711041 container died 1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 03:39:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:14.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:14 np0005603622 systemd[1]: var-lib-containers-storage-overlay-a179d386be7f1d4509858b0b343d3f561d17ae212d40441075e5e9e04f2e44e6-merged.mount: Deactivated successfully.
Jan 31 03:39:14 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee-userdata-shm.mount: Deactivated successfully.
Jan 31 03:39:14 np0005603622 nova_compute[221301]: 2026-01-31 08:39:14.672 221324 DEBUG neutronclient.v2_0.client [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b62616fc-dd91-4cc2-b323-70fffebab4fb for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:39:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:39:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:14.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.114 221324 DEBUG nova.compute.manager [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.115 221324 DEBUG oslo_concurrency.lockutils [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.116 221324 DEBUG oslo_concurrency.lockutils [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.116 221324 DEBUG oslo_concurrency.lockutils [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.117 221324 DEBUG nova.compute.manager [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.117 221324 WARNING nova.compute.manager [req-aaba77bc-c5cb-43e7-92b7-ed45d8948f34 req-36d93f30-a927-4f2a-b26b-13067b66629e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.207 221324 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.208 221324 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.209 221324 DEBUG oslo_concurrency.lockutils [None req-dfb2e8a5-d4ea-4a48-848b-df12aefff031 f40dad094c6e43daab6e48d01b1df1ff bf9fd1d29e534bd99b47eb8854374663 - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:15 np0005603622 podman[279209]: 2026-01-31 08:39:15.229056305 +0000 UTC m=+1.310055499 container cleanup 1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:39:15 np0005603622 systemd[1]: libpod-conmon-1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee.scope: Deactivated successfully.
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.351 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:15 np0005603622 podman[279252]: 2026-01-31 08:39:15.781105096 +0000 UTC m=+0.529124640 container remove 1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.786 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[488d42fe-b867-4c9f-bcca-291721e77a5a]: (4, ('Sat Jan 31 08:39:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f (1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee)\n1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee\nSat Jan 31 08:39:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f (1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee)\n1315a6d780966c94013988f9328d6a7633189d3f61d298c436ff2096aff06fee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.789 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[219131aa-ba88-4739-9ed2-10add78467a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.791 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d238e24-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:15 np0005603622 kernel: tap3d238e24-90: left promiscuous mode
Jan 31 03:39:15 np0005603622 nova_compute[221301]: 2026-01-31 08:39:15.815 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.823 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3689f931-6bea-47ae-aef7-b2c66c9b80ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.837 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb9d048-d33d-4f9a-aba2-625269ff3ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.838 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5d3babda-a00d-4e16-a8ff-7f5e2544f456]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:15 np0005603622 podman[279260]: 2026-01-31 08:39:15.847142338 +0000 UTC m=+0.569723754 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.857 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a9fcf0-cf6a-4ad1-a8d6-4c045fb409de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796397, 'reachable_time': 27457, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279304, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:15 np0005603622 systemd[1]: run-netns-ovnmeta\x2d3d238e24\x2d9954\x2d4b32\x2db589\x2d6db6c8760a3f.mount: Deactivated successfully.
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.864 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:39:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:15.864 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[389c1390-13bf-4400-9206-9c5fb2990c0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:15 np0005603622 podman[279253]: 2026-01-31 08:39:15.875095334 +0000 UTC m=+0.600337421 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 31 03:39:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:16.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:16.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.040 221324 DEBUG nova.compute.manager [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.040 221324 DEBUG oslo_concurrency.lockutils [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.040 221324 DEBUG oslo_concurrency.lockutils [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.041 221324 DEBUG oslo_concurrency.lockutils [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.041 221324 DEBUG nova.compute.manager [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.041 221324 WARNING nova.compute.manager [req-8f8a69e3-9aec-4a8d-ab09-213505569e77 req-429b3b12-ee13-4a10-b06b-7f882a1ea8e0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:39:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:18.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.737 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:18.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:18 np0005603622 nova_compute[221301]: 2026-01-31 08:39:18.962 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:20 np0005603622 nova_compute[221301]: 2026-01-31 08:39:20.287 221324 DEBUG nova.compute.manager [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:20 np0005603622 nova_compute[221301]: 2026-01-31 08:39:20.288 221324 DEBUG nova.compute.manager [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing instance network info cache due to event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:39:20 np0005603622 nova_compute[221301]: 2026-01-31 08:39:20.288 221324 DEBUG oslo_concurrency.lockutils [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:20 np0005603622 nova_compute[221301]: 2026-01-31 08:39:20.289 221324 DEBUG oslo_concurrency.lockutils [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:20 np0005603622 nova_compute[221301]: 2026-01-31 08:39:20.289 221324 DEBUG nova.network.neutron [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:39:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:20.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:20.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e343 e343: 3 total, 3 up, 3 in
Jan 31 03:39:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:39:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/902985181' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:39:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:39:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/902985181' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:39:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 e344: 3 total, 3 up, 3 in
Jan 31 03:39:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:22.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:22.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:23 np0005603622 nova_compute[221301]: 2026-01-31 08:39:23.738 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:23 np0005603622 nova_compute[221301]: 2026-01-31 08:39:23.964 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:24 np0005603622 nova_compute[221301]: 2026-01-31 08:39:24.026 221324 DEBUG nova.network.neutron [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updated VIF entry in instance network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:39:24 np0005603622 nova_compute[221301]: 2026-01-31 08:39:24.026 221324 DEBUG nova.network.neutron [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:24 np0005603622 nova_compute[221301]: 2026-01-31 08:39:24.215 221324 DEBUG oslo_concurrency.lockutils [req-dfe8333c-a58d-449c-af3f-bfe4b7e3aee4 req-47693ab9-4312-4e13-9bff-551b15b975f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:24.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:24.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:26.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:26.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:28.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:28 np0005603622 nova_compute[221301]: 2026-01-31 08:39:28.741 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:28.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:28 np0005603622 nova_compute[221301]: 2026-01-31 08:39:28.936 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848753.9347208, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:39:28 np0005603622 nova_compute[221301]: 2026-01-31 08:39:28.936 221324 INFO nova.compute.manager [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:39:28 np0005603622 nova_compute[221301]: 2026-01-31 08:39:28.966 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:29 np0005603622 nova_compute[221301]: 2026-01-31 08:39:29.037 221324 DEBUG nova.compute.manager [None req-cad24954-319e-4fe9-b0b2-481ff60a901a - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:39:29 np0005603622 nova_compute[221301]: 2026-01-31 08:39:29.042 221324 DEBUG nova.compute.manager [None req-cad24954-319e-4fe9-b0b2-481ff60a901a - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:39:29 np0005603622 nova_compute[221301]: 2026-01-31 08:39:29.368 221324 INFO nova.compute.manager [None req-cad24954-319e-4fe9-b0b2-481ff60a901a - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 03:39:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:30.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:30.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:30.895 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:30.895 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:30.896 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:30 np0005603622 nova_compute[221301]: 2026-01-31 08:39:30.909 221324 DEBUG nova.compute.manager [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:30 np0005603622 nova_compute[221301]: 2026-01-31 08:39:30.909 221324 DEBUG oslo_concurrency.lockutils [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:30 np0005603622 nova_compute[221301]: 2026-01-31 08:39:30.909 221324 DEBUG oslo_concurrency.lockutils [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:30 np0005603622 nova_compute[221301]: 2026-01-31 08:39:30.909 221324 DEBUG oslo_concurrency.lockutils [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:30 np0005603622 nova_compute[221301]: 2026-01-31 08:39:30.910 221324 DEBUG nova.compute.manager [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:30 np0005603622 nova_compute[221301]: 2026-01-31 08:39:30.910 221324 WARNING nova.compute.manager [req-de314086-56f0-492e-a792-931e705c5f62 req-5eb199ab-7a61-46c8-ab1d-9b486527b68b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:39:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:32.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:32.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:33 np0005603622 nova_compute[221301]: 2026-01-31 08:39:33.742 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:33 np0005603622 nova_compute[221301]: 2026-01-31 08:39:33.968 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:34 np0005603622 nova_compute[221301]: 2026-01-31 08:39:34.003 221324 DEBUG nova.compute.manager [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:34 np0005603622 nova_compute[221301]: 2026-01-31 08:39:34.003 221324 DEBUG oslo_concurrency.lockutils [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:34 np0005603622 nova_compute[221301]: 2026-01-31 08:39:34.004 221324 DEBUG oslo_concurrency.lockutils [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:34 np0005603622 nova_compute[221301]: 2026-01-31 08:39:34.004 221324 DEBUG oslo_concurrency.lockutils [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:34 np0005603622 nova_compute[221301]: 2026-01-31 08:39:34.005 221324 DEBUG nova.compute.manager [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:34 np0005603622 nova_compute[221301]: 2026-01-31 08:39:34.005 221324 WARNING nova.compute.manager [req-19bd9bde-436f-4789-ba19-107be588c9fe req-063757dc-48a8-4a7b-8ce2-25cc3ec32aa2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:39:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:34.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:34.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:36.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:39:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:36.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:39:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:38.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:38 np0005603622 nova_compute[221301]: 2026-01-31 08:39:38.745 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:38.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:38 np0005603622 nova_compute[221301]: 2026-01-31 08:39:38.971 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:39 np0005603622 nova_compute[221301]: 2026-01-31 08:39:39.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:39.777 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:39:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:39.778 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:39:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:40.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:40.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:41Z|00677|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:39:41 np0005603622 nova_compute[221301]: 2026-01-31 08:39:41.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:42.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:42.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:43 np0005603622 nova_compute[221301]: 2026-01-31 08:39:43.748 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:43 np0005603622 nova_compute[221301]: 2026-01-31 08:39:43.974 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:44 np0005603622 nova_compute[221301]: 2026-01-31 08:39:44.145 221324 DEBUG nova.compute.manager [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:44 np0005603622 nova_compute[221301]: 2026-01-31 08:39:44.146 221324 DEBUG oslo_concurrency.lockutils [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:44 np0005603622 nova_compute[221301]: 2026-01-31 08:39:44.146 221324 DEBUG oslo_concurrency.lockutils [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:44 np0005603622 nova_compute[221301]: 2026-01-31 08:39:44.147 221324 DEBUG oslo_concurrency.lockutils [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:44 np0005603622 nova_compute[221301]: 2026-01-31 08:39:44.147 221324 DEBUG nova.compute.manager [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:44 np0005603622 nova_compute[221301]: 2026-01-31 08:39:44.148 221324 WARNING nova.compute.manager [req-85a37662-60c1-4594-9e60-0b499c628068 req-d5ea5a43-99da-4108-8eb9-f8bc4a41f9fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:39:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:44.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:44.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:46 np0005603622 podman[279314]: 2026-01-31 08:39:46.149861604 +0000 UTC m=+0.065005436 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Jan 31 03:39:46 np0005603622 podman[279313]: 2026-01-31 08:39:46.1654659 +0000 UTC m=+0.082570824 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:39:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:46.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:46 np0005603622 nova_compute[221301]: 2026-01-31 08:39:46.761 221324 INFO nova.compute.manager [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Swapping old allocation on dict_keys(['f7fd90d1-7583-42ff-b709-f5fc55f6e273']) held by migration cf86d346-be26-4774-9a47-8138c96edea8 for instance#033[00m
Jan 31 03:39:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:46.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:47.779 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:39:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:48.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:39:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:48 np0005603622 nova_compute[221301]: 2026-01-31 08:39:48.750 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:48.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:48 np0005603622 nova_compute[221301]: 2026-01-31 08:39:48.977 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:49 np0005603622 nova_compute[221301]: 2026-01-31 08:39:49.605 221324 DEBUG nova.scheduler.client.report [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Overwriting current allocation {'allocations': {'492dc482-9d1e-49ca-87f3-0104a8508b72': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 75}}, 'project_id': '0bfe11bd9d694684b527666e2c378eed', 'user_id': 'f1c6e7eff11b435a81429826a682b32f', 'consumer_generation': 1} on consumer 815ef28e-2297-49ba-88a1-23f722c3fa0a move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 31 03:39:50 np0005603622 nova_compute[221301]: 2026-01-31 08:39:50.384 221324 INFO nova.network.neutron [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating port b62616fc-dd91-4cc2-b323-70fffebab4fb with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:39:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:50.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:39:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:50.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:39:51 np0005603622 nova_compute[221301]: 2026-01-31 08:39:51.048 221324 DEBUG nova.compute.manager [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:51 np0005603622 nova_compute[221301]: 2026-01-31 08:39:51.049 221324 DEBUG oslo_concurrency.lockutils [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:51 np0005603622 nova_compute[221301]: 2026-01-31 08:39:51.050 221324 DEBUG oslo_concurrency.lockutils [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:51 np0005603622 nova_compute[221301]: 2026-01-31 08:39:51.050 221324 DEBUG oslo_concurrency.lockutils [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:51 np0005603622 nova_compute[221301]: 2026-01-31 08:39:51.050 221324 DEBUG nova.compute.manager [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:39:51 np0005603622 nova_compute[221301]: 2026-01-31 08:39:51.051 221324 WARNING nova.compute.manager [req-84087c1f-385c-4f5f-8c3c-3b8c2b2924c7 req-74a26f3c-764e-4bfe-bdfc-1952c1b6e265 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 31 03:39:52 np0005603622 nova_compute[221301]: 2026-01-31 08:39:52.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:52 np0005603622 nova_compute[221301]: 2026-01-31 08:39:52.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:52 np0005603622 nova_compute[221301]: 2026-01-31 08:39:52.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:39:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:52.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:52.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:52 np0005603622 nova_compute[221301]: 2026-01-31 08:39:52.987 221324 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:52 np0005603622 nova_compute[221301]: 2026-01-31 08:39:52.987 221324 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:52 np0005603622 nova_compute[221301]: 2026-01-31 08:39:52.988 221324 DEBUG nova.network.neutron [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:39:53 np0005603622 nova_compute[221301]: 2026-01-31 08:39:53.646 221324 DEBUG nova.compute.manager [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:39:53 np0005603622 nova_compute[221301]: 2026-01-31 08:39:53.646 221324 DEBUG nova.compute.manager [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing instance network info cache due to event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:39:53 np0005603622 nova_compute[221301]: 2026-01-31 08:39:53.647 221324 DEBUG oslo_concurrency.lockutils [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:53 np0005603622 nova_compute[221301]: 2026-01-31 08:39:53.754 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:53 np0005603622 nova_compute[221301]: 2026-01-31 08:39:53.979 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:54 np0005603622 nova_compute[221301]: 2026-01-31 08:39:54.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:54.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:39:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:54.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:39:55 np0005603622 nova_compute[221301]: 2026-01-31 08:39:55.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:56 np0005603622 nova_compute[221301]: 2026-01-31 08:39:56.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:56.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:56.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:56 np0005603622 nova_compute[221301]: 2026-01-31 08:39:56.993 221324 DEBUG nova.network.neutron [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.436 221324 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.438 221324 DEBUG nova.virt.libvirt.driver [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.481 221324 DEBUG oslo_concurrency.lockutils [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.482 221324 DEBUG nova.network.neutron [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.530 221324 DEBUG nova.storage.rbd_utils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rolling back rbd image(815ef28e-2297-49ba-88a1-23f722c3fa0a_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.698 221324 DEBUG nova.storage.rbd_utils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] removing snapshot(nova-resize) on rbd image(815ef28e-2297-49ba-88a1-23f722c3fa0a_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:39:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e345 e345: 3 total, 3 up, 3 in
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.856 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.857 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.857 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.863 221324 DEBUG nova.virt.libvirt.driver [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Start _get_guest_xml network_info=[{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.868 221324 WARNING nova.virt.libvirt.driver [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.878 221324 DEBUG nova.virt.libvirt.host [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.881 221324 DEBUG nova.virt.libvirt.host [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.885 221324 DEBUG nova.virt.libvirt.host [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.886 221324 DEBUG nova.virt.libvirt.host [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.888 221324 DEBUG nova.virt.libvirt.driver [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.888 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.889 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.889 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.890 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.890 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.890 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.890 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.890 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.891 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.891 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.891 221324 DEBUG nova.virt.hardware [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:39:57 np0005603622 nova_compute[221301]: 2026-01-31 08:39:57.891 221324 DEBUG nova.objects.instance [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'vcpu_model' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:39:58 np0005603622 nova_compute[221301]: 2026-01-31 08:39:58.121 221324 DEBUG oslo_concurrency.processutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:39:58.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:39:58 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1520239381' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:39:58 np0005603622 nova_compute[221301]: 2026-01-31 08:39:58.559 221324 DEBUG oslo_concurrency.processutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:58 np0005603622 nova_compute[221301]: 2026-01-31 08:39:58.598 221324 DEBUG oslo_concurrency.processutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:39:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:39:58 np0005603622 nova_compute[221301]: 2026-01-31 08:39:58.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:39:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:39:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:39:58.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:39:58 np0005603622 nova_compute[221301]: 2026-01-31 08:39:58.981 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:39:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/541840340' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.063 221324 DEBUG oslo_concurrency.processutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.064 221324 DEBUG nova.virt.libvirt.vif [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:39:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:39:33Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.064 221324 DEBUG nova.network.os_vif_util [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.065 221324 DEBUG nova.network.os_vif_util [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.068 221324 DEBUG nova.virt.libvirt.driver [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <uuid>815ef28e-2297-49ba-88a1-23f722c3fa0a</uuid>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <name>instance-00000092</name>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2115362642</nova:name>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:39:57</nova:creationTime>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:user uuid="f1c6e7eff11b435a81429826a682b32f">tempest-TestNetworkAdvancedServerOps-840410497-project-member</nova:user>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:project uuid="0bfe11bd9d694684b527666e2c378eed">tempest-TestNetworkAdvancedServerOps-840410497</nova:project>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <nova:port uuid="b62616fc-dd91-4cc2-b323-70fffebab4fb">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <entry name="serial">815ef28e-2297-49ba-88a1-23f722c3fa0a</entry>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <entry name="uuid">815ef28e-2297-49ba-88a1-23f722c3fa0a</entry>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/815ef28e-2297-49ba-88a1-23f722c3fa0a_disk">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/815ef28e-2297-49ba-88a1-23f722c3fa0a_disk.config">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:44:41:0a"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <target dev="tapb62616fc-dd"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a/console.log" append="off"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:39:59 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:39:59 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:39:59 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:39:59 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.069 221324 DEBUG nova.compute.manager [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Preparing to wait for external event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.069 221324 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.069 221324 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.069 221324 DEBUG oslo_concurrency.lockutils [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.070 221324 DEBUG nova.virt.libvirt.vif [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:39:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:39:33Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.070 221324 DEBUG nova.network.os_vif_util [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.071 221324 DEBUG nova.network.os_vif_util [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.071 221324 DEBUG os_vif [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.072 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.072 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.072 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.074 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.074 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb62616fc-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.075 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb62616fc-dd, col_values=(('external_ids', {'iface-id': 'b62616fc-dd91-4cc2-b323-70fffebab4fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:41:0a', 'vm-uuid': '815ef28e-2297-49ba-88a1-23f722c3fa0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.076 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 NetworkManager[49080]: <info>  [1769848799.0778] manager: (tapb62616fc-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.079 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.084 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.085 221324 INFO os_vif [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd')#033[00m
Jan 31 03:39:59 np0005603622 kernel: tapb62616fc-dd: entered promiscuous mode
Jan 31 03:39:59 np0005603622 NetworkManager[49080]: <info>  [1769848799.1506] manager: (tapb62616fc-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/306)
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.151 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:59Z|00678|binding|INFO|Claiming lport b62616fc-dd91-4cc2-b323-70fffebab4fb for this chassis.
Jan 31 03:39:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:59Z|00679|binding|INFO|b62616fc-dd91-4cc2-b323-70fffebab4fb: Claiming fa:16:3e:44:41:0a 10.100.0.3
Jan 31 03:39:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:59Z|00680|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb ovn-installed in OVS
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.157 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.162 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 systemd-udevd[279618]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:39:59 np0005603622 systemd-machined[190406]: New machine qemu-70-instance-00000092.
Jan 31 03:39:59 np0005603622 NetworkManager[49080]: <info>  [1769848799.1912] device (tapb62616fc-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:39:59 np0005603622 NetworkManager[49080]: <info>  [1769848799.1920] device (tapb62616fc-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:39:59 np0005603622 systemd[1]: Started Virtual Machine qemu-70-instance-00000092.
Jan 31 03:39:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:59Z|00681|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb up in Southbound
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.223 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:41:0a 10.100.0.3'], port_security=['fa:16:3e:44:41:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '815ef28e-2297-49ba-88a1-23f722c3fa0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d238e24-9954-4b32-b589-6db6c8760a3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'e09d3566-f99d-4e7a-854e-68c93732c8e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09414a4b-4852-4431-b971-0c29958bdb7a, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b62616fc-dd91-4cc2-b323-70fffebab4fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.224 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b62616fc-dd91-4cc2-b323-70fffebab4fb in datapath 3d238e24-9954-4b32-b589-6db6c8760a3f bound to our chassis#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.225 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d238e24-9954-4b32-b589-6db6c8760a3f#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.232 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c379c3-400a-429b-8b7f-288be884c639]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.233 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d238e24-91 in ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.235 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d238e24-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.235 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0a22bdc5-8886-4dfb-bff9-1e4f16c29c2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.236 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4521b5ec-3a37-4220-a531-f663161c6d3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.246 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9449997e-fea8-48df-a7a3-2e9bee512b01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.254 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3b87182f-35e5-42a5-8f9c-8410a3ea4bd8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.279 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f36229-69f3-4308-adf9-e41f1737b1ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.284 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[627e73cf-8383-48e3-bd5b-6b3b9d4d4dc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 NetworkManager[49080]: <info>  [1769848799.2864] manager: (tap3d238e24-90): new Veth device (/org/freedesktop/NetworkManager/Devices/307)
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.311 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[734ab783-e9ee-4fe4-b832-ef6d28bb6c22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.314 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1c66ab7a-9126-4745-96cd-55ddd1f8b4fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 NetworkManager[49080]: <info>  [1769848799.3270] device (tap3d238e24-90): carrier: link connected
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.329 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6fe27d-0c1c-41ff-9a43-41333f721d40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.341 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b41c19b4-0910-425a-9464-3b5cb17ae478]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d238e24-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:78:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 805066, 'reachable_time': 29706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279652, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.350 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3abd1a0b-9b1a-4f4e-b736-ac9c36552f6e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:782a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 805066, 'tstamp': 805066}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 279653, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.361 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1d75ac2e-e4ba-4a1a-bf9f-ecfa93b4345f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d238e24-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:78:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 805066, 'reachable_time': 29706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 279654, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.384 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1f118f-9371-4fea-b3bb-657d79a17c6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.426 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c8703e-a1b3-4f7c-936f-083068d3c504]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.428 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d238e24-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.428 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.429 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d238e24-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.431 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 NetworkManager[49080]: <info>  [1769848799.4318] manager: (tap3d238e24-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 31 03:39:59 np0005603622 kernel: tap3d238e24-90: entered promiscuous mode
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.440 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d238e24-90, col_values=(('external_ids', {'iface-id': '7571c123-c1d9-4ad9-a9c7-718eca889c7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:39:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:39:59Z|00682|binding|INFO|Releasing lport 7571c123-c1d9-4ad9-a9c7-718eca889c7b from this chassis (sb_readonly=0)
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.445 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.446 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[28e9b405-228e-4b70-83fc-773668092c9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.447 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-3d238e24-9954-4b32-b589-6db6c8760a3f
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/3d238e24-9954-4b32-b589-6db6c8760a3f.pid.haproxy
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 3d238e24-9954-4b32-b589-6db6c8760a3f
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:39:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:39:59.448 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'env', 'PROCESS_TAG=haproxy-3d238e24-9954-4b32-b589-6db6c8760a3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d238e24-9954-4b32-b589-6db6c8760a3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.622 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848799.622148, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.623 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Started (Lifecycle Event)#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.700 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.704 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848799.6224284, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.704 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:39:59 np0005603622 podman[279728]: 2026-01-31 08:39:59.77048933 +0000 UTC m=+0.047341804 container create 104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.790 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.794 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:39:59 np0005603622 systemd[1]: Started libpod-conmon-104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e.scope.
Jan 31 03:39:59 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:39:59 np0005603622 podman[279728]: 2026-01-31 08:39:59.742274897 +0000 UTC m=+0.019127401 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:39:59 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f93b61e95196900a785d6a569634a2f5587cd8911f5b18b395da6df32e86662c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:39:59 np0005603622 podman[279728]: 2026-01-31 08:39:59.858359875 +0000 UTC m=+0.135212379 container init 104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:39:59 np0005603622 podman[279728]: 2026-01-31 08:39:59.863638336 +0000 UTC m=+0.140490810 container start 104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:39:59 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[279743]: [NOTICE]   (279747) : New worker (279749) forked
Jan 31 03:39:59 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[279743]: [NOTICE]   (279747) : Loading success.
Jan 31 03:39:59 np0005603622 nova_compute[221301]: 2026-01-31 08:39:59.958 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 31 03:39:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:39:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:39:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:39:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:39:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:40:00 np0005603622 nova_compute[221301]: 2026-01-31 08:40:00.028 221324 DEBUG nova.network.neutron [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updated VIF entry in instance network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:40:00 np0005603622 nova_compute[221301]: 2026-01-31 08:40:00.028 221324 DEBUG nova.network.neutron [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:40:00 np0005603622 nova_compute[221301]: 2026-01-31 08:40:00.113 221324 DEBUG oslo_concurrency.lockutils [req-e19069bb-5f05-479e-ae2a-13411aa25f6c req-4b24288f-0f0d-4615-b6c5-fc589702377e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:40:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:00.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:00.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.589 221324 DEBUG nova.compute.manager [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.589 221324 DEBUG oslo_concurrency.lockutils [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.589 221324 DEBUG oslo_concurrency.lockutils [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.590 221324 DEBUG oslo_concurrency.lockutils [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.590 221324 DEBUG nova.compute.manager [req-eb9e934f-2387-4f4a-a0f1-9ef89cb75816 req-7b38a019-995b-4b04-a596-9a412be80bb9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Processing event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.590 221324 DEBUG nova.compute.manager [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.593 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848801.5935001, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.593 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.597 221324 INFO nova.virt.libvirt.driver [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance running successfully.#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.597 221324 DEBUG nova.virt.libvirt.driver [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.677 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.680 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.770 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.825 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [{"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.885 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-9707f58b-2c76-40df-b8a8-5df66bab50bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.886 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.886 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.936 221324 INFO nova.compute.manager [None req-65f0c213-1c7c-47a0-9c96-b2e2a310f7fa f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance to original state: 'active'#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.949 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.949 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.949 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.949 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:40:01 np0005603622 nova_compute[221301]: 2026-01-31 08:40:01.950 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:40:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/754199890' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.387 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:02.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.511 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.511 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.514 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.515 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.518 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.518 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.667 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.668 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3783MB free_disk=20.714733123779297GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.668 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.669 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:02.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.829 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.829 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9707f58b-2c76-40df-b8a8-5df66bab50bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.829 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 815ef28e-2297-49ba-88a1-23f722c3fa0a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.830 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.830 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:40:02 np0005603622 nova_compute[221301]: 2026-01-31 08:40:02.977 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:40:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3971858148' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.459 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.463 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:40:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.743 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.744 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.744 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.888 221324 DEBUG nova.compute.manager [req-8d64b84e-b56a-4e3a-bc26-492cba2e5f43 req-97234d39-130d-474f-acce-7f33260b4147 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.889 221324 DEBUG oslo_concurrency.lockutils [req-8d64b84e-b56a-4e3a-bc26-492cba2e5f43 req-97234d39-130d-474f-acce-7f33260b4147 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.889 221324 DEBUG oslo_concurrency.lockutils [req-8d64b84e-b56a-4e3a-bc26-492cba2e5f43 req-97234d39-130d-474f-acce-7f33260b4147 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.889 221324 DEBUG oslo_concurrency.lockutils [req-8d64b84e-b56a-4e3a-bc26-492cba2e5f43 req-97234d39-130d-474f-acce-7f33260b4147 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.890 221324 DEBUG nova.compute.manager [req-8d64b84e-b56a-4e3a-bc26-492cba2e5f43 req-97234d39-130d-474f-acce-7f33260b4147 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:40:03 np0005603622 nova_compute[221301]: 2026-01-31 08:40:03.890 221324 WARNING nova.compute.manager [req-8d64b84e-b56a-4e3a-bc26-492cba2e5f43 req-97234d39-130d-474f-acce-7f33260b4147 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state active and task_state None.#033[00m
Jan 31 03:40:04 np0005603622 nova_compute[221301]: 2026-01-31 08:40:04.077 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:04 np0005603622 nova_compute[221301]: 2026-01-31 08:40:04.255 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:04 np0005603622 nova_compute[221301]: 2026-01-31 08:40:04.255 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:04 np0005603622 nova_compute[221301]: 2026-01-31 08:40:04.256 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:04.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:04.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 e346: 3 total, 3 up, 3 in
Jan 31 03:40:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:40:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:40:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:06.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:06.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:08.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:08 np0005603622 nova_compute[221301]: 2026-01-31 08:40:08.762 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:08.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:09 np0005603622 nova_compute[221301]: 2026-01-31 08:40:09.079 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:09 np0005603622 nova_compute[221301]: 2026-01-31 08:40:09.376 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:40:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:10.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:40:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:40:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:10.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:40:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:12.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:12.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:13 np0005603622 nova_compute[221301]: 2026-01-31 08:40:13.764 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:14 np0005603622 nova_compute[221301]: 2026-01-31 08:40:14.082 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:14 np0005603622 ovn_controller[130109]: 2026-01-31T08:40:14Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:41:0a 10.100.0.3
Jan 31 03:40:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:14.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:14.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:16.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:40:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:16.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:40:17 np0005603622 podman[279854]: 2026-01-31 08:40:17.127892443 +0000 UTC m=+0.049301197 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 31 03:40:17 np0005603622 podman[279853]: 2026-01-31 08:40:17.178658678 +0000 UTC m=+0.099682751 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:40:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:18.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:18 np0005603622 nova_compute[221301]: 2026-01-31 08:40:18.765 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:18.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:19 np0005603622 nova_compute[221301]: 2026-01-31 08:40:19.083 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:19 np0005603622 nova_compute[221301]: 2026-01-31 08:40:19.309 221324 INFO nova.compute.manager [None req-c10df394-54a0-4974-b1bc-e154714ce364 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Get console output#033[00m
Jan 31 03:40:19 np0005603622 nova_compute[221301]: 2026-01-31 08:40:19.318 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:40:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:20.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.004000107s ======
Jan 31 03:40:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:20.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000107s
Jan 31 03:40:21 np0005603622 nova_compute[221301]: 2026-01-31 08:40:21.556 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:22.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:22 np0005603622 nova_compute[221301]: 2026-01-31 08:40:22.751 221324 DEBUG nova.compute.manager [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:22 np0005603622 nova_compute[221301]: 2026-01-31 08:40:22.751 221324 DEBUG nova.compute.manager [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing instance network info cache due to event network-changed-b62616fc-dd91-4cc2-b323-70fffebab4fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:40:22 np0005603622 nova_compute[221301]: 2026-01-31 08:40:22.752 221324 DEBUG oslo_concurrency.lockutils [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:40:22 np0005603622 nova_compute[221301]: 2026-01-31 08:40:22.752 221324 DEBUG oslo_concurrency.lockutils [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:40:22 np0005603622 nova_compute[221301]: 2026-01-31 08:40:22.752 221324 DEBUG nova.network.neutron [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Refreshing network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:40:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:22.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.135 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.136 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.136 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.136 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.136 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.138 221324 INFO nova.compute.manager [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Terminating instance#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.140 221324 DEBUG nova.compute.manager [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:40:23 np0005603622 kernel: tapb62616fc-dd (unregistering): left promiscuous mode
Jan 31 03:40:23 np0005603622 NetworkManager[49080]: <info>  [1769848823.5111] device (tapb62616fc-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:40:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:40:23Z|00683|binding|INFO|Releasing lport b62616fc-dd91-4cc2-b323-70fffebab4fb from this chassis (sb_readonly=0)
Jan 31 03:40:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:40:23Z|00684|binding|INFO|Setting lport b62616fc-dd91-4cc2-b323-70fffebab4fb down in Southbound
Jan 31 03:40:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:40:23Z|00685|binding|INFO|Removing iface tapb62616fc-dd ovn-installed in OVS
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.535 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:23 np0005603622 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000092.scope: Deactivated successfully.
Jan 31 03:40:23 np0005603622 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000092.scope: Consumed 13.183s CPU time.
Jan 31 03:40:23 np0005603622 systemd-machined[190406]: Machine qemu-70-instance-00000092 terminated.
Jan 31 03:40:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:23.590 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:41:0a 10.100.0.3'], port_security=['fa:16:3e:44:41:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '815ef28e-2297-49ba-88a1-23f722c3fa0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d238e24-9954-4b32-b589-6db6c8760a3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'e09d3566-f99d-4e7a-854e-68c93732c8e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09414a4b-4852-4431-b971-0c29958bdb7a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b62616fc-dd91-4cc2-b323-70fffebab4fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:40:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:23.591 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b62616fc-dd91-4cc2-b323-70fffebab4fb in datapath 3d238e24-9954-4b32-b589-6db6c8760a3f unbound from our chassis#033[00m
Jan 31 03:40:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:23.593 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d238e24-9954-4b32-b589-6db6c8760a3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:40:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:23.595 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6ac9bd-40c0-4dcb-8931-4e22937b5819]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:23.595 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f namespace which is not needed anymore#033[00m
Jan 31 03:40:23 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[279743]: [NOTICE]   (279747) : haproxy version is 2.8.14-c23fe91
Jan 31 03:40:23 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[279743]: [NOTICE]   (279747) : path to executable is /usr/sbin/haproxy
Jan 31 03:40:23 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[279743]: [WARNING]  (279747) : Exiting Master process...
Jan 31 03:40:23 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[279743]: [ALERT]    (279747) : Current worker (279749) exited with code 143 (Terminated)
Jan 31 03:40:23 np0005603622 neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f[279743]: [WARNING]  (279747) : All workers exited. Exiting... (0)
Jan 31 03:40:23 np0005603622 systemd[1]: libpod-104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e.scope: Deactivated successfully.
Jan 31 03:40:23 np0005603622 podman[279920]: 2026-01-31 08:40:23.711536404 +0000 UTC m=+0.053705054 container died 104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:40:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.803 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.806 221324 INFO nova.virt.libvirt.driver [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Instance destroyed successfully.#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.806 221324 DEBUG nova.objects.instance [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'resources' on Instance uuid 815ef28e-2297-49ba-88a1-23f722c3fa0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.870 221324 DEBUG nova.virt.libvirt.vif [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:37:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2115362642',display_name='tempest-TestNetworkAdvancedServerOps-server-2115362642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2115362642',id=146,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNl3t7Tpw2ZGAKx5tF0XRvkA5V1qQ9xszE6olcVe0qbeqbBI1oq6Zjq+3DsZDE5JpsKvfdWgNEpJ9rXaTzL6wLNNLR+GbnRbZpWjtFLfeYVgQUQ4VVzMWaZiV7/jRigBGA==',key_name='tempest-TestNetworkAdvancedServerOps-1092980412',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:40:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-4f40bqke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:40:02Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=815ef28e-2297-49ba-88a1-23f722c3fa0a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.870 221324 DEBUG nova.network.os_vif_util [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.871 221324 DEBUG nova.network.os_vif_util [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.872 221324 DEBUG os_vif [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.874 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.875 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb62616fc-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:23 np0005603622 nova_compute[221301]: 2026-01-31 08:40:23.883 221324 INFO os_vif [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:41:0a,bridge_name='br-int',has_traffic_filtering=True,id=b62616fc-dd91-4cc2-b323-70fffebab4fb,network=Network(3d238e24-9954-4b32-b589-6db6c8760a3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb62616fc-dd')#033[00m
Jan 31 03:40:23 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e-userdata-shm.mount: Deactivated successfully.
Jan 31 03:40:23 np0005603622 systemd[1]: var-lib-containers-storage-overlay-f93b61e95196900a785d6a569634a2f5587cd8911f5b18b395da6df32e86662c-merged.mount: Deactivated successfully.
Jan 31 03:40:24 np0005603622 podman[279920]: 2026-01-31 08:40:24.010986754 +0000 UTC m=+0.353155434 container cleanup 104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 31 03:40:24 np0005603622 systemd[1]: libpod-conmon-104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e.scope: Deactivated successfully.
Jan 31 03:40:24 np0005603622 podman[279980]: 2026-01-31 08:40:24.235369062 +0000 UTC m=+0.198559719 container remove 104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.241 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[63d9fdd7-31a8-42cb-8cdc-4835113b58c2]: (4, ('Sat Jan 31 08:40:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f (104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e)\n104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e\nSat Jan 31 08:40:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f (104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e)\n104506ba3025d9dddbbd3a3d2d70c5da35728f9547c149886794d81b1a77e11e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.243 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[36ff6231-9d82-45a2-b0e8-23c8f1b77597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.245 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d238e24-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:24 np0005603622 nova_compute[221301]: 2026-01-31 08:40:24.249 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:24 np0005603622 kernel: tap3d238e24-90: left promiscuous mode
Jan 31 03:40:24 np0005603622 nova_compute[221301]: 2026-01-31 08:40:24.257 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.262 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[378fa17d-cf50-49eb-a9ef-c209545ecfd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.279 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb3a6a85-430b-4808-8c9e-f6f4badc91ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.281 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a0cc54-2d19-48e9-897d-e81735895acb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.300 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3ba819-4770-417e-8a6d-e1181c689d74]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 805061, 'reachable_time': 25907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279993, 'error': None, 'target': 'ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.305 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d238e24-9954-4b32-b589-6db6c8760a3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:40:24 np0005603622 systemd[1]: run-netns-ovnmeta\x2d3d238e24\x2d9954\x2d4b32\x2db589\x2d6db6c8760a3f.mount: Deactivated successfully.
Jan 31 03:40:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:24.306 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[b28e6869-4301-472c-a525-dee144280330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:40:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:24.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:24.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:24 np0005603622 nova_compute[221301]: 2026-01-31 08:40:24.917 221324 INFO nova.virt.libvirt.driver [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Deleting instance files /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a_del#033[00m
Jan 31 03:40:24 np0005603622 nova_compute[221301]: 2026-01-31 08:40:24.918 221324 INFO nova.virt.libvirt.driver [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Deletion of /var/lib/nova/instances/815ef28e-2297-49ba-88a1-23f722c3fa0a_del complete#033[00m
Jan 31 03:40:25 np0005603622 nova_compute[221301]: 2026-01-31 08:40:25.588 221324 DEBUG nova.compute.manager [req-2c383dbb-d724-4c42-98fd-1440e915df25 req-4470d1d3-e8a3-4bd4-ba47-2a8e09cdebef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:25 np0005603622 nova_compute[221301]: 2026-01-31 08:40:25.588 221324 DEBUG oslo_concurrency.lockutils [req-2c383dbb-d724-4c42-98fd-1440e915df25 req-4470d1d3-e8a3-4bd4-ba47-2a8e09cdebef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:25 np0005603622 nova_compute[221301]: 2026-01-31 08:40:25.588 221324 DEBUG oslo_concurrency.lockutils [req-2c383dbb-d724-4c42-98fd-1440e915df25 req-4470d1d3-e8a3-4bd4-ba47-2a8e09cdebef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:25 np0005603622 nova_compute[221301]: 2026-01-31 08:40:25.589 221324 DEBUG oslo_concurrency.lockutils [req-2c383dbb-d724-4c42-98fd-1440e915df25 req-4470d1d3-e8a3-4bd4-ba47-2a8e09cdebef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:25 np0005603622 nova_compute[221301]: 2026-01-31 08:40:25.589 221324 DEBUG nova.compute.manager [req-2c383dbb-d724-4c42-98fd-1440e915df25 req-4470d1d3-e8a3-4bd4-ba47-2a8e09cdebef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:40:25 np0005603622 nova_compute[221301]: 2026-01-31 08:40:25.589 221324 DEBUG nova.compute.manager [req-2c383dbb-d724-4c42-98fd-1440e915df25 req-4470d1d3-e8a3-4bd4-ba47-2a8e09cdebef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-unplugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:40:26 np0005603622 nova_compute[221301]: 2026-01-31 08:40:26.431 221324 INFO nova.compute.manager [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Took 3.29 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:40:26 np0005603622 nova_compute[221301]: 2026-01-31 08:40:26.432 221324 DEBUG oslo.service.loopingcall [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:40:26 np0005603622 nova_compute[221301]: 2026-01-31 08:40:26.432 221324 DEBUG nova.compute.manager [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:40:26 np0005603622 nova_compute[221301]: 2026-01-31 08:40:26.432 221324 DEBUG nova.network.neutron [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:40:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:40:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:26.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:40:26 np0005603622 nova_compute[221301]: 2026-01-31 08:40:26.833 221324 DEBUG nova.network.neutron [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updated VIF entry in instance network info cache for port b62616fc-dd91-4cc2-b323-70fffebab4fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:40:26 np0005603622 nova_compute[221301]: 2026-01-31 08:40:26.834 221324 DEBUG nova.network.neutron [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [{"id": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "address": "fa:16:3e:44:41:0a", "network": {"id": "3d238e24-9954-4b32-b589-6db6c8760a3f", "bridge": "br-int", "label": "tempest-network-smoke--1478468357", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb62616fc-dd", "ovs_interfaceid": "b62616fc-dd91-4cc2-b323-70fffebab4fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:40:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:26.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:27 np0005603622 nova_compute[221301]: 2026-01-31 08:40:27.222 221324 DEBUG oslo_concurrency.lockutils [req-605432dd-38ac-47ea-8d4c-781aad36568e req-9b1b45d1-452b-4e69-8921-b9787e6d6b3c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-815ef28e-2297-49ba-88a1-23f722c3fa0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:40:27 np0005603622 nova_compute[221301]: 2026-01-31 08:40:27.818 221324 DEBUG nova.compute.manager [req-6ee6deb4-a69b-4cd4-a651-1c1403f8bb96 req-d905b22f-0118-4c27-b1da-4e7b6dde9533 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:27 np0005603622 nova_compute[221301]: 2026-01-31 08:40:27.819 221324 DEBUG oslo_concurrency.lockutils [req-6ee6deb4-a69b-4cd4-a651-1c1403f8bb96 req-d905b22f-0118-4c27-b1da-4e7b6dde9533 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:27 np0005603622 nova_compute[221301]: 2026-01-31 08:40:27.819 221324 DEBUG oslo_concurrency.lockutils [req-6ee6deb4-a69b-4cd4-a651-1c1403f8bb96 req-d905b22f-0118-4c27-b1da-4e7b6dde9533 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:27 np0005603622 nova_compute[221301]: 2026-01-31 08:40:27.819 221324 DEBUG oslo_concurrency.lockutils [req-6ee6deb4-a69b-4cd4-a651-1c1403f8bb96 req-d905b22f-0118-4c27-b1da-4e7b6dde9533 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:27 np0005603622 nova_compute[221301]: 2026-01-31 08:40:27.820 221324 DEBUG nova.compute.manager [req-6ee6deb4-a69b-4cd4-a651-1c1403f8bb96 req-d905b22f-0118-4c27-b1da-4e7b6dde9533 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] No waiting events found dispatching network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:40:27 np0005603622 nova_compute[221301]: 2026-01-31 08:40:27.820 221324 WARNING nova.compute.manager [req-6ee6deb4-a69b-4cd4-a651-1c1403f8bb96 req-d905b22f-0118-4c27-b1da-4e7b6dde9533 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received unexpected event network-vif-plugged-b62616fc-dd91-4cc2-b323-70fffebab4fb for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:40:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:28.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:28 np0005603622 nova_compute[221301]: 2026-01-31 08:40:28.805 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:40:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:28.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:40:28 np0005603622 nova_compute[221301]: 2026-01-31 08:40:28.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:29 np0005603622 nova_compute[221301]: 2026-01-31 08:40:29.471 221324 DEBUG nova.network.neutron [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:40:29 np0005603622 nova_compute[221301]: 2026-01-31 08:40:29.617 221324 DEBUG nova.compute.manager [req-c0790998-3550-494f-9652-06f658e9050c req-9631fc36-0118-44cb-94e4-b126a3769bf1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Received event network-vif-deleted-b62616fc-dd91-4cc2-b323-70fffebab4fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:40:29 np0005603622 nova_compute[221301]: 2026-01-31 08:40:29.617 221324 INFO nova.compute.manager [req-c0790998-3550-494f-9652-06f658e9050c req-9631fc36-0118-44cb-94e4-b126a3769bf1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Neutron deleted interface b62616fc-dd91-4cc2-b323-70fffebab4fb; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:40:29 np0005603622 nova_compute[221301]: 2026-01-31 08:40:29.618 221324 DEBUG nova.network.neutron [req-c0790998-3550-494f-9652-06f658e9050c req-9631fc36-0118-44cb-94e4-b126a3769bf1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:40:29 np0005603622 nova_compute[221301]: 2026-01-31 08:40:29.988 221324 INFO nova.compute.manager [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Took 3.56 seconds to deallocate network for instance.#033[00m
Jan 31 03:40:29 np0005603622 nova_compute[221301]: 2026-01-31 08:40:29.994 221324 DEBUG nova.compute.manager [req-c0790998-3550-494f-9652-06f658e9050c req-9631fc36-0118-44cb-94e4-b126a3769bf1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Detach interface failed, port_id=b62616fc-dd91-4cc2-b323-70fffebab4fb, reason: Instance 815ef28e-2297-49ba-88a1-23f722c3fa0a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:40:30 np0005603622 nova_compute[221301]: 2026-01-31 08:40:30.282 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:30 np0005603622 nova_compute[221301]: 2026-01-31 08:40:30.283 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:30 np0005603622 nova_compute[221301]: 2026-01-31 08:40:30.395 221324 DEBUG oslo_concurrency.processutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:30.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:30.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:40:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3706891907' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:40:30 np0005603622 nova_compute[221301]: 2026-01-31 08:40:30.895 221324 DEBUG oslo_concurrency.processutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:30.896 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:30.897 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:30.898 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:30 np0005603622 nova_compute[221301]: 2026-01-31 08:40:30.901 221324 DEBUG nova.compute.provider_tree [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:40:31 np0005603622 nova_compute[221301]: 2026-01-31 08:40:31.018 221324 DEBUG nova.scheduler.client.report [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:40:31 np0005603622 nova_compute[221301]: 2026-01-31 08:40:31.395 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:31 np0005603622 nova_compute[221301]: 2026-01-31 08:40:31.532 221324 INFO nova.scheduler.client.report [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Deleted allocations for instance 815ef28e-2297-49ba-88a1-23f722c3fa0a#033[00m
Jan 31 03:40:32 np0005603622 nova_compute[221301]: 2026-01-31 08:40:32.323 221324 DEBUG oslo_concurrency.lockutils [None req-06903cb3-aa9b-4409-9e27-ab5a247b6840 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "815ef28e-2297-49ba-88a1-23f722c3fa0a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:32.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:32.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:33 np0005603622 nova_compute[221301]: 2026-01-31 08:40:33.807 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:33 np0005603622 nova_compute[221301]: 2026-01-31 08:40:33.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:34.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:34.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:36.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:36.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:38.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:38 np0005603622 nova_compute[221301]: 2026-01-31 08:40:38.770 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848823.7696652, 815ef28e-2297-49ba-88a1-23f722c3fa0a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:40:38 np0005603622 nova_compute[221301]: 2026-01-31 08:40:38.771 221324 INFO nova.compute.manager [-] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:40:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:38 np0005603622 nova_compute[221301]: 2026-01-31 08:40:38.808 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:38.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:38 np0005603622 nova_compute[221301]: 2026-01-31 08:40:38.881 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:39 np0005603622 nova_compute[221301]: 2026-01-31 08:40:39.285 221324 DEBUG nova.compute.manager [None req-16fcd5b6-0c29-4c59-91c4-451de4d2f49e - - - - - -] [instance: 815ef28e-2297-49ba-88a1-23f722c3fa0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:40:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:40.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:40:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:40.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:40:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:42.291 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:40:42 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:42.292 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:40:42 np0005603622 nova_compute[221301]: 2026-01-31 08:40:42.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:42.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:42.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:43 np0005603622 nova_compute[221301]: 2026-01-31 08:40:43.811 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:43 np0005603622 nova_compute[221301]: 2026-01-31 08:40:43.883 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:44.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:44.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e347 e347: 3 total, 3 up, 3 in
Jan 31 03:40:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:46.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:46.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:40:47Z|00686|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:40:47 np0005603622 nova_compute[221301]: 2026-01-31 08:40:47.963 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:48 np0005603622 podman[280018]: 2026-01-31 08:40:48.131077395 +0000 UTC m=+0.057568287 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 03:40:48 np0005603622 podman[280017]: 2026-01-31 08:40:48.158016914 +0000 UTC m=+0.082115082 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 31 03:40:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:48.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e347 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:48 np0005603622 nova_compute[221301]: 2026-01-31 08:40:48.812 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:48 np0005603622 nova_compute[221301]: 2026-01-31 08:40:48.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:48.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:40:49.294 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:40:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:50.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:50.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:52.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:52.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:53 np0005603622 nova_compute[221301]: 2026-01-31 08:40:53.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:53 np0005603622 nova_compute[221301]: 2026-01-31 08:40:53.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e347 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:53 np0005603622 nova_compute[221301]: 2026-01-31 08:40:53.814 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:53 np0005603622 nova_compute[221301]: 2026-01-31 08:40:53.887 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:54.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:54 np0005603622 nova_compute[221301]: 2026-01-31 08:40:54.735 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:54 np0005603622 nova_compute[221301]: 2026-01-31 08:40:54.735 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:54 np0005603622 nova_compute[221301]: 2026-01-31 08:40:54.736 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:40:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:54.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e348 e348: 3 total, 3 up, 3 in
Jan 31 03:40:56 np0005603622 nova_compute[221301]: 2026-01-31 08:40:56.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:56.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:40:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:56.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:40:57 np0005603622 nova_compute[221301]: 2026-01-31 08:40:57.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.475 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.475 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:40:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:40:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:40:58.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.544 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.545 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.545 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.545 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.546 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:40:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.818 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:58 np0005603622 nova_compute[221301]: 2026-01-31 08:40:58.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:40:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:40:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:40:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:40:58.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:40:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:40:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2987037705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.122 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.576s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.289 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.290 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.295 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.295 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000008e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.448 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.449 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3877MB free_disk=20.785202026367188GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.449 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.449 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.710 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.710 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 9707f58b-2c76-40df-b8a8-5df66bab50bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.711 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.711 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:40:59 np0005603622 nova_compute[221301]: 2026-01-31 08:40:59.826 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4067965981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:00 np0005603622 nova_compute[221301]: 2026-01-31 08:41:00.278 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:00 np0005603622 nova_compute[221301]: 2026-01-31 08:41:00.283 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:41:00 np0005603622 nova_compute[221301]: 2026-01-31 08:41:00.360 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:41:00 np0005603622 nova_compute[221301]: 2026-01-31 08:41:00.405 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:41:00 np0005603622 nova_compute[221301]: 2026-01-31 08:41:00.406 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:00.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:00.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:02.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:02.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:03 np0005603622 nova_compute[221301]: 2026-01-31 08:41:03.839 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:03 np0005603622 nova_compute[221301]: 2026-01-31 08:41:03.890 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:04.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:04.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:05 np0005603622 nova_compute[221301]: 2026-01-31 08:41:05.338 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:05 np0005603622 nova_compute[221301]: 2026-01-31 08:41:05.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:41:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1105704917' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:41:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:41:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1105704917' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:41:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:06.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:06.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:41:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:41:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:41:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e349 e349: 3 total, 3 up, 3 in
Jan 31 03:41:08 np0005603622 nova_compute[221301]: 2026-01-31 08:41:08.263 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:08.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:08 np0005603622 nova_compute[221301]: 2026-01-31 08:41:08.841 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:08 np0005603622 nova_compute[221301]: 2026-01-31 08:41:08.892 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:08.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:10.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:10.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:12.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:12.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:41:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:41:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:13 np0005603622 nova_compute[221301]: 2026-01-31 08:41:13.844 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:13 np0005603622 nova_compute[221301]: 2026-01-31 08:41:13.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:14 np0005603622 nova_compute[221301]: 2026-01-31 08:41:14.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:14.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:14.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e350 e350: 3 total, 3 up, 3 in
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.927519) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848875927595, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 1588, "num_deletes": 258, "total_data_size": 3489125, "memory_usage": 3545104, "flush_reason": "Manual Compaction"}
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848875945099, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 2277783, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 63793, "largest_seqno": 65374, "table_properties": {"data_size": 2271038, "index_size": 3815, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14575, "raw_average_key_size": 20, "raw_value_size": 2257330, "raw_average_value_size": 3126, "num_data_blocks": 168, "num_entries": 722, "num_filter_entries": 722, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848753, "oldest_key_time": 1769848753, "file_creation_time": 1769848875, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 17688 microseconds, and 9033 cpu microseconds.
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.945207) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 2277783 bytes OK
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.945231) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.948864) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.948895) EVENT_LOG_v1 {"time_micros": 1769848875948886, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.948922) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 3481783, prev total WAL file size 3481783, number of live WAL files 2.
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.950205) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323633' seq:72057594037927935, type:22 .. '6C6F676D0032353134' seq:0, type:0; will stop at (end)
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(2224KB)], [129(9563KB)]
Jan 31 03:41:15 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848875950271, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 12070592, "oldest_snapshot_seqno": -1}
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8624 keys, 11927004 bytes, temperature: kUnknown
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848876070471, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 11927004, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11870320, "index_size": 34043, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21573, "raw_key_size": 227245, "raw_average_key_size": 26, "raw_value_size": 11717972, "raw_average_value_size": 1358, "num_data_blocks": 1307, "num_entries": 8624, "num_filter_entries": 8624, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848875, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.070804) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 11927004 bytes
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.072181) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 100.4 rd, 99.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 9.3 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 9156, records dropped: 532 output_compression: NoCompression
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.072196) EVENT_LOG_v1 {"time_micros": 1769848876072189, "job": 82, "event": "compaction_finished", "compaction_time_micros": 120224, "compaction_time_cpu_micros": 47598, "output_level": 6, "num_output_files": 1, "total_output_size": 11927004, "num_input_records": 9156, "num_output_records": 8624, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848876072476, "job": 82, "event": "table_file_deletion", "file_number": 131}
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848876073185, "job": 82, "event": "table_file_deletion", "file_number": 129}
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:15.950023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.073315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.073323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.073325) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.073327) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:41:16.073329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:41:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:16.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:16.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:17 np0005603622 nova_compute[221301]: 2026-01-31 08:41:17.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:17 np0005603622 nova_compute[221301]: 2026-01-31 08:41:17.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:41:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:18.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:18 np0005603622 nova_compute[221301]: 2026-01-31 08:41:18.846 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:18 np0005603622 nova_compute[221301]: 2026-01-31 08:41:18.894 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:18.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:19 np0005603622 podman[280288]: 2026-01-31 08:41:19.131930261 +0000 UTC m=+0.051862395 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 03:41:19 np0005603622 podman[280287]: 2026-01-31 08:41:19.154783321 +0000 UTC m=+0.075059004 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:41:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e351 e351: 3 total, 3 up, 3 in
Jan 31 03:41:19 np0005603622 nova_compute[221301]: 2026-01-31 08:41:19.753 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:20Z|00687|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:41:20 np0005603622 nova_compute[221301]: 2026-01-31 08:41:20.259 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:20Z|00688|binding|INFO|Releasing lport 54969bc0-ee8d-420c-ac0c-dd4f9410e42c from this chassis (sb_readonly=0)
Jan 31 03:41:20 np0005603622 nova_compute[221301]: 2026-01-31 08:41:20.331 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:20.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:20.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:21 np0005603622 nova_compute[221301]: 2026-01-31 08:41:21.530 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:21 np0005603622 nova_compute[221301]: 2026-01-31 08:41:21.531 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:21 np0005603622 nova_compute[221301]: 2026-01-31 08:41:21.532 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:21 np0005603622 nova_compute[221301]: 2026-01-31 08:41:21.532 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:21 np0005603622 nova_compute[221301]: 2026-01-31 08:41:21.533 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:21 np0005603622 nova_compute[221301]: 2026-01-31 08:41:21.536 221324 INFO nova.compute.manager [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Terminating instance#033[00m
Jan 31 03:41:21 np0005603622 nova_compute[221301]: 2026-01-31 08:41:21.538 221324 DEBUG nova.compute.manager [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:41:22 np0005603622 kernel: tap2ed53bda-fd (unregistering): left promiscuous mode
Jan 31 03:41:22 np0005603622 NetworkManager[49080]: <info>  [1769848882.2187] device (tap2ed53bda-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:41:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:22Z|00689|binding|INFO|Releasing lport 2ed53bda-fdee-4925-8492-3be158eb8d21 from this chassis (sb_readonly=0)
Jan 31 03:41:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:22Z|00690|binding|INFO|Setting lport 2ed53bda-fdee-4925-8492-3be158eb8d21 down in Southbound
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.229 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:22Z|00691|binding|INFO|Removing iface tap2ed53bda-fd ovn-installed in OVS
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:22 np0005603622 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Jan 31 03:41:22 np0005603622 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008e.scope: Consumed 26.695s CPU time.
Jan 31 03:41:22 np0005603622 systemd-machined[190406]: Machine qemu-68-instance-0000008e terminated.
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.349 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:11:72 10.100.0.9'], port_security=['fa:16:3e:e3:11:72 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9707f58b-2c76-40df-b8a8-5df66bab50bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=2ed53bda-fdee-4925-8492-3be158eb8d21) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.351 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 2ed53bda-fdee-4925-8492-3be158eb8d21 in datapath 31da00d3-077b-4620-a7d3-68186467ab47 unbound from our chassis#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.352 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31da00d3-077b-4620-a7d3-68186467ab47#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.371 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe99426a-1c78-475e-a1a6-ea1603c55fa9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.381 221324 INFO nova.virt.libvirt.driver [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Instance destroyed successfully.#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.382 221324 DEBUG nova.objects.instance [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'resources' on Instance uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.407 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6d0325-2a26-47d9-908d-77ecc50f3a34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.410 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[18e12741-2237-4a2e-b071-2089fd7d777a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.443 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4088880b-f620-4155-92d1-8f780271c079]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.464 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8e42d997-e42d-4b6d-ae4e-cb0f7425a884]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31da00d3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:4f:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773720, 'reachable_time': 28432, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280358, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.489 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9d632706-e572-4702-a9f4-cdafa534d245]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773731, 'tstamp': 773731}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280359, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap31da00d3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 773733, 'tstamp': 773733}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280359, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.491 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.493 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.499 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.499 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31da00d3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.499 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.500 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31da00d3-00, col_values=(('external_ids', {'iface-id': '54969bc0-ee8d-420c-ac0c-dd4f9410e42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:22.500 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:41:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:22.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.595 221324 DEBUG nova.virt.libvirt.vif [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:34:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1321512917',display_name='tempest-ServerStableDeviceRescueTest-server-1321512917',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1321512917',id=142,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:35:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-6ftdla88',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_user_name='tempest-ServerStableDeviceRescueTest-319343227-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:35:55Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9707f58b-2c76-40df-b8a8-5df66bab50bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.596 221324 DEBUG nova.network.os_vif_util [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "2ed53bda-fdee-4925-8492-3be158eb8d21", "address": "fa:16:3e:e3:11:72", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ed53bda-fd", "ovs_interfaceid": "2ed53bda-fdee-4925-8492-3be158eb8d21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.597 221324 DEBUG nova.network.os_vif_util [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.597 221324 DEBUG os_vif [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.599 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.599 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ed53bda-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.601 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.602 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:22 np0005603622 nova_compute[221301]: 2026-01-31 08:41:22.606 221324 INFO os_vif [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:11:72,bridge_name='br-int',has_traffic_filtering=True,id=2ed53bda-fdee-4925-8492-3be158eb8d21,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ed53bda-fd')#033[00m
Jan 31 03:41:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:22.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:23 np0005603622 nova_compute[221301]: 2026-01-31 08:41:23.547 221324 DEBUG nova.compute.manager [req-8f247caa-6eab-43e9-99a1-e6710c8b5c12 req-c71e7a11-7e3a-4e2b-9411-83779b51a195 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:23 np0005603622 nova_compute[221301]: 2026-01-31 08:41:23.548 221324 DEBUG oslo_concurrency.lockutils [req-8f247caa-6eab-43e9-99a1-e6710c8b5c12 req-c71e7a11-7e3a-4e2b-9411-83779b51a195 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:23 np0005603622 nova_compute[221301]: 2026-01-31 08:41:23.548 221324 DEBUG oslo_concurrency.lockutils [req-8f247caa-6eab-43e9-99a1-e6710c8b5c12 req-c71e7a11-7e3a-4e2b-9411-83779b51a195 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:23 np0005603622 nova_compute[221301]: 2026-01-31 08:41:23.549 221324 DEBUG oslo_concurrency.lockutils [req-8f247caa-6eab-43e9-99a1-e6710c8b5c12 req-c71e7a11-7e3a-4e2b-9411-83779b51a195 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:23 np0005603622 nova_compute[221301]: 2026-01-31 08:41:23.549 221324 DEBUG nova.compute.manager [req-8f247caa-6eab-43e9-99a1-e6710c8b5c12 req-c71e7a11-7e3a-4e2b-9411-83779b51a195 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:23 np0005603622 nova_compute[221301]: 2026-01-31 08:41:23.550 221324 DEBUG nova.compute.manager [req-8f247caa-6eab-43e9-99a1-e6710c8b5c12 req-c71e7a11-7e3a-4e2b-9411-83779b51a195 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-unplugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:41:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e351 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:23 np0005603622 nova_compute[221301]: 2026-01-31 08:41:23.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:24.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:24.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e352 e352: 3 total, 3 up, 3 in
Jan 31 03:41:26 np0005603622 nova_compute[221301]: 2026-01-31 08:41:26.524 221324 DEBUG nova.compute.manager [req-dfd26f7f-6780-4699-84ef-0f278b8b40b3 req-b4ef2e9b-e335-41bb-a821-a0eae7644e6c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:26 np0005603622 nova_compute[221301]: 2026-01-31 08:41:26.524 221324 DEBUG oslo_concurrency.lockutils [req-dfd26f7f-6780-4699-84ef-0f278b8b40b3 req-b4ef2e9b-e335-41bb-a821-a0eae7644e6c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:26 np0005603622 nova_compute[221301]: 2026-01-31 08:41:26.524 221324 DEBUG oslo_concurrency.lockutils [req-dfd26f7f-6780-4699-84ef-0f278b8b40b3 req-b4ef2e9b-e335-41bb-a821-a0eae7644e6c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:26 np0005603622 nova_compute[221301]: 2026-01-31 08:41:26.525 221324 DEBUG oslo_concurrency.lockutils [req-dfd26f7f-6780-4699-84ef-0f278b8b40b3 req-b4ef2e9b-e335-41bb-a821-a0eae7644e6c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:26 np0005603622 nova_compute[221301]: 2026-01-31 08:41:26.525 221324 DEBUG nova.compute.manager [req-dfd26f7f-6780-4699-84ef-0f278b8b40b3 req-b4ef2e9b-e335-41bb-a821-a0eae7644e6c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] No waiting events found dispatching network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:26 np0005603622 nova_compute[221301]: 2026-01-31 08:41:26.525 221324 WARNING nova.compute.manager [req-dfd26f7f-6780-4699-84ef-0f278b8b40b3 req-b4ef2e9b-e335-41bb-a821-a0eae7644e6c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received unexpected event network-vif-plugged-2ed53bda-fdee-4925-8492-3be158eb8d21 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:41:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:26.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:26.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:27 np0005603622 nova_compute[221301]: 2026-01-31 08:41:27.256 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:27 np0005603622 nova_compute[221301]: 2026-01-31 08:41:27.601 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:27 np0005603622 nova_compute[221301]: 2026-01-31 08:41:27.780 221324 INFO nova.virt.libvirt.driver [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Deleting instance files /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb_del#033[00m
Jan 31 03:41:27 np0005603622 nova_compute[221301]: 2026-01-31 08:41:27.781 221324 INFO nova.virt.libvirt.driver [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Deletion of /var/lib/nova/instances/9707f58b-2c76-40df-b8a8-5df66bab50bb_del complete#033[00m
Jan 31 03:41:28 np0005603622 nova_compute[221301]: 2026-01-31 08:41:28.397 221324 INFO nova.compute.manager [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Took 6.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:41:28 np0005603622 nova_compute[221301]: 2026-01-31 08:41:28.397 221324 DEBUG oslo.service.loopingcall [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:41:28 np0005603622 nova_compute[221301]: 2026-01-31 08:41:28.398 221324 DEBUG nova.compute.manager [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:41:28 np0005603622 nova_compute[221301]: 2026-01-31 08:41:28.398 221324 DEBUG nova.network.neutron [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:41:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:28.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:28 np0005603622 nova_compute[221301]: 2026-01-31 08:41:28.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:28.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.166 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.274 221324 WARNING nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.#033[00m
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.275 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.276 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 9707f58b-2c76-40df-b8a8-5df66bab50bb _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.276 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.277 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.278 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:29 np0005603622 nova_compute[221301]: 2026-01-31 08:41:29.332 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:30.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:30.897 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:30.898 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:30.899 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:30.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:32.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:32 np0005603622 nova_compute[221301]: 2026-01-31 08:41:32.602 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:32.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:33 np0005603622 nova_compute[221301]: 2026-01-31 08:41:33.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:33 np0005603622 nova_compute[221301]: 2026-01-31 08:41:33.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:41:33 np0005603622 nova_compute[221301]: 2026-01-31 08:41:33.633 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:41:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:33 np0005603622 nova_compute[221301]: 2026-01-31 08:41:33.852 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:34.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:34.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.359 221324 DEBUG nova.compute.manager [req-9d6529a0-86ae-4a2e-88dc-2e13900d89aa req-6f36c5d7-3508-4347-a0da-27ddf75847d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Received event network-vif-deleted-2ed53bda-fdee-4925-8492-3be158eb8d21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.359 221324 INFO nova.compute.manager [req-9d6529a0-86ae-4a2e-88dc-2e13900d89aa req-6f36c5d7-3508-4347-a0da-27ddf75847d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Neutron deleted interface 2ed53bda-fdee-4925-8492-3be158eb8d21; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.360 221324 DEBUG nova.network.neutron [req-9d6529a0-86ae-4a2e-88dc-2e13900d89aa req-6f36c5d7-3508-4347-a0da-27ddf75847d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.363 221324 DEBUG nova.network.neutron [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.509 221324 INFO nova.compute.manager [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Took 7.11 seconds to deallocate network for instance.#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.512 221324 DEBUG nova.compute.manager [req-9d6529a0-86ae-4a2e-88dc-2e13900d89aa req-6f36c5d7-3508-4347-a0da-27ddf75847d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Detach interface failed, port_id=2ed53bda-fdee-4925-8492-3be158eb8d21, reason: Instance 9707f58b-2c76-40df-b8a8-5df66bab50bb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.708 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.709 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:35 np0005603622 nova_compute[221301]: 2026-01-31 08:41:35.842 221324 DEBUG oslo_concurrency.processutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3769383771' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:36 np0005603622 nova_compute[221301]: 2026-01-31 08:41:36.313 221324 DEBUG oslo_concurrency.processutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:36 np0005603622 nova_compute[221301]: 2026-01-31 08:41:36.318 221324 DEBUG nova.compute.provider_tree [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:41:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:36.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:36.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.052 221324 DEBUG nova.scheduler.client.report [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.294 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.378 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848882.3770843, 9707f58b-2c76-40df-b8a8-5df66bab50bb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.379 221324 INFO nova.compute.manager [-] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.438 221324 INFO nova.scheduler.client.report [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Deleted allocations for instance 9707f58b-2c76-40df-b8a8-5df66bab50bb#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.442 221324 DEBUG nova.compute.manager [None req-2bd6a5ed-208f-4ed2-b19d-ac03fc9a0821 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.603 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.769 221324 DEBUG oslo_concurrency.lockutils [None req-8bfae4d7-e388-484f-9a52-70b14c76779a b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 16.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.770 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 8.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.771 221324 INFO nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 9707f58b-2c76-40df-b8a8-5df66bab50bb] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 03:41:37 np0005603622 nova_compute[221301]: 2026-01-31 08:41:37.771 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "9707f58b-2c76-40df-b8a8-5df66bab50bb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:38.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:38 np0005603622 nova_compute[221301]: 2026-01-31 08:41:38.855 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:38.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:40.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:40.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e353 e353: 3 total, 3 up, 3 in
Jan 31 03:41:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:42.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:42 np0005603622 nova_compute[221301]: 2026-01-31 08:41:42.605 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:42.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:43 np0005603622 nova_compute[221301]: 2026-01-31 08:41:43.857 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:44.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:44.896 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:41:44 np0005603622 nova_compute[221301]: 2026-01-31 08:41:44.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:44.897 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:41:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:44.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:45.900 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:46.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.668 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.669 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.670 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.670 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.671 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.672 221324 INFO nova.compute.manager [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Terminating instance#033[00m
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.674 221324 DEBUG nova.compute.manager [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:41:46 np0005603622 kernel: tap664dc5ff-72 (unregistering): left promiscuous mode
Jan 31 03:41:46 np0005603622 NetworkManager[49080]: <info>  [1769848906.8940] device (tap664dc5ff-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.906 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:46Z|00692|binding|INFO|Releasing lport 664dc5ff-72ac-4e4a-8581-6071db25143c from this chassis (sb_readonly=0)
Jan 31 03:41:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:46Z|00693|binding|INFO|Setting lport 664dc5ff-72ac-4e4a-8581-6071db25143c down in Southbound
Jan 31 03:41:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:41:46Z|00694|binding|INFO|Removing iface tap664dc5ff-72 ovn-installed in OVS
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.908 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:46 np0005603622 nova_compute[221301]: 2026-01-31 08:41:46.912 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:46 np0005603622 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 31 03:41:46 np0005603622 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008b.scope: Consumed 30.768s CPU time.
Jan 31 03:41:46 np0005603622 systemd-machined[190406]: Machine qemu-65-instance-0000008b terminated.
Jan 31 03:41:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:46.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.119 221324 INFO nova.virt.libvirt.driver [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Instance destroyed successfully.#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.120 221324 DEBUG nova.objects.instance [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lazy-loading 'resources' on Instance uuid 9aac65c2-63f8-4383-80ed-6600852c6fb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.607 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.677 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:fe:3d 10.100.0.13'], port_security=['fa:16:3e:c8:fe:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '9aac65c2-63f8-4383-80ed-6600852c6fb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31da00d3-077b-4620-a7d3-68186467ab47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e29363ca464487b931af54fe14166b1', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b1c240f5-10ef-43c0-92c2-4688e636b197', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c184d7a-2b72-4f04-8956-830b1e8cd5e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=664dc5ff-72ac-4e4a-8581-6071db25143c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.679 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 664dc5ff-72ac-4e4a-8581-6071db25143c in datapath 31da00d3-077b-4620-a7d3-68186467ab47 unbound from our chassis#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.682 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31da00d3-077b-4620-a7d3-68186467ab47, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.683 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[602e1496-55f0-4f9c-af8e-9a546c71ec9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.684 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 namespace which is not needed anymore#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.846 221324 DEBUG nova.virt.libvirt.vif [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:33:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-232987388',display_name='tempest-ServerStableDeviceRescueTest-server-232987388',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-232987388',id=139,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:34:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1e29363ca464487b931af54fe14166b1',ramdisk_id='',reservation_id='r-fnf190ma',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-319343227',owner_user_name='tempest-ServerStableDeviceRescueTest-319343227-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:34:49Z,user_data=None,user_id='b6733330b634472ca8c21316f1ee5057',uuid=9aac65c2-63f8-4383-80ed-6600852c6fb5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.847 221324 DEBUG nova.network.os_vif_util [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converting VIF {"id": "664dc5ff-72ac-4e4a-8581-6071db25143c", "address": "fa:16:3e:c8:fe:3d", "network": {"id": "31da00d3-077b-4620-a7d3-68186467ab47", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1178144410-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1e29363ca464487b931af54fe14166b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap664dc5ff-72", "ovs_interfaceid": "664dc5ff-72ac-4e4a-8581-6071db25143c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:41:47 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [NOTICE]   (275705) : haproxy version is 2.8.14-c23fe91
Jan 31 03:41:47 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [NOTICE]   (275705) : path to executable is /usr/sbin/haproxy
Jan 31 03:41:47 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [WARNING]  (275705) : Exiting Master process...
Jan 31 03:41:47 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [WARNING]  (275705) : Exiting Master process...
Jan 31 03:41:47 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [ALERT]    (275705) : Current worker (275707) exited with code 143 (Terminated)
Jan 31 03:41:47 np0005603622 neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47[275701]: [WARNING]  (275705) : All workers exited. Exiting... (0)
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.848 221324 DEBUG nova.network.os_vif_util [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.849 221324 DEBUG os_vif [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:41:47 np0005603622 systemd[1]: libpod-f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f.scope: Deactivated successfully.
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.851 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap664dc5ff-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.853 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.855 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.858 221324 INFO os_vif [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:fe:3d,bridge_name='br-int',has_traffic_filtering=True,id=664dc5ff-72ac-4e4a-8581-6071db25143c,network=Network(31da00d3-077b-4620-a7d3-68186467ab47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap664dc5ff-72')#033[00m
Jan 31 03:41:47 np0005603622 podman[280437]: 2026-01-31 08:41:47.85924214 +0000 UTC m=+0.059009305 container died f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:41:47 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f-userdata-shm.mount: Deactivated successfully.
Jan 31 03:41:47 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0cf9f253d37229f93dbb4ca4c6b22fee2e4f4c926b350d91366ed4f0973e39f8-merged.mount: Deactivated successfully.
Jan 31 03:41:47 np0005603622 podman[280437]: 2026-01-31 08:41:47.904983691 +0000 UTC m=+0.104750876 container cleanup f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:41:47 np0005603622 systemd[1]: libpod-conmon-f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f.scope: Deactivated successfully.
Jan 31 03:41:47 np0005603622 podman[280483]: 2026-01-31 08:41:47.977583548 +0000 UTC m=+0.048311710 container remove f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.983 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae74b89-001f-4c88-9a86-1f11be06e0db]: (4, ('Sat Jan 31 08:41:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 (f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f)\nf32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f\nSat Jan 31 08:41:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 (f32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f)\nf32c727541bac67e33adbd5a24c1568d99ac62fe7bbf04abbb6333f00b3a980f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.984 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f5314126-e6bc-440f-bf6d-5e3d7d327164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.985 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31da00d3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.987 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:47 np0005603622 kernel: tap31da00d3-00: left promiscuous mode
Jan 31 03:41:47 np0005603622 nova_compute[221301]: 2026-01-31 08:41:47.992 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:47.995 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf2377f-454e-4001-af99-5813d07859c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:48.008 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[733ab5d5-bf93-4a11-8d1b-c6fe42b23408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:48.010 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1aeb15-3089-4461-af07-db83818ae628]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:48.024 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8400ebe0-1608-444f-8704-5d3a8da037d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 773714, 'reachable_time': 29111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280501, 'error': None, 'target': 'ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:48 np0005603622 systemd[1]: run-netns-ovnmeta\x2d31da00d3\x2d077b\x2d4620\x2da7d3\x2d68186467ab47.mount: Deactivated successfully.
Jan 31 03:41:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:48.028 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-31da00d3-077b-4620-a7d3-68186467ab47 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:41:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:41:48.028 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[84b08b3f-1e61-4844-9a8d-4cb49285bfa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:41:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:48.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:48 np0005603622 nova_compute[221301]: 2026-01-31 08:41:48.860 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:48.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:49 np0005603622 nova_compute[221301]: 2026-01-31 08:41:49.812 221324 DEBUG nova.compute.manager [req-9a2f9554-5dc1-4ec7-99e3-54c6a7a5625a req-db537cc6-c0d7-4b9c-9b4e-3a4b1baea7a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:49 np0005603622 nova_compute[221301]: 2026-01-31 08:41:49.812 221324 DEBUG oslo_concurrency.lockutils [req-9a2f9554-5dc1-4ec7-99e3-54c6a7a5625a req-db537cc6-c0d7-4b9c-9b4e-3a4b1baea7a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:49 np0005603622 nova_compute[221301]: 2026-01-31 08:41:49.812 221324 DEBUG oslo_concurrency.lockutils [req-9a2f9554-5dc1-4ec7-99e3-54c6a7a5625a req-db537cc6-c0d7-4b9c-9b4e-3a4b1baea7a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:49 np0005603622 nova_compute[221301]: 2026-01-31 08:41:49.812 221324 DEBUG oslo_concurrency.lockutils [req-9a2f9554-5dc1-4ec7-99e3-54c6a7a5625a req-db537cc6-c0d7-4b9c-9b4e-3a4b1baea7a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:49 np0005603622 nova_compute[221301]: 2026-01-31 08:41:49.813 221324 DEBUG nova.compute.manager [req-9a2f9554-5dc1-4ec7-99e3-54c6a7a5625a req-db537cc6-c0d7-4b9c-9b4e-3a4b1baea7a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:49 np0005603622 nova_compute[221301]: 2026-01-31 08:41:49.813 221324 DEBUG nova.compute.manager [req-9a2f9554-5dc1-4ec7-99e3-54c6a7a5625a req-db537cc6-c0d7-4b9c-9b4e-3a4b1baea7a8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-unplugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:41:50 np0005603622 podman[280504]: 2026-01-31 08:41:50.172614091 +0000 UTC m=+0.100595516 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:41:50 np0005603622 podman[280503]: 2026-01-31 08:41:50.215231668 +0000 UTC m=+0.143582333 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 31 03:41:50 np0005603622 nova_compute[221301]: 2026-01-31 08:41:50.279 221324 INFO nova.virt.libvirt.driver [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Deleting instance files /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5_del#033[00m
Jan 31 03:41:50 np0005603622 nova_compute[221301]: 2026-01-31 08:41:50.280 221324 INFO nova.virt.libvirt.driver [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Deletion of /var/lib/nova/instances/9aac65c2-63f8-4383-80ed-6600852c6fb5_del complete#033[00m
Jan 31 03:41:50 np0005603622 nova_compute[221301]: 2026-01-31 08:41:50.580 221324 INFO nova.compute.manager [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Took 3.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:41:50 np0005603622 nova_compute[221301]: 2026-01-31 08:41:50.581 221324 DEBUG oslo.service.loopingcall [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:41:50 np0005603622 nova_compute[221301]: 2026-01-31 08:41:50.581 221324 DEBUG nova.compute.manager [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:41:50 np0005603622 nova_compute[221301]: 2026-01-31 08:41:50.581 221324 DEBUG nova.network.neutron [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:41:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:50.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:50.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e354 e354: 3 total, 3 up, 3 in
Jan 31 03:41:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:52.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:52 np0005603622 nova_compute[221301]: 2026-01-31 08:41:52.854 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:52.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:53 np0005603622 nova_compute[221301]: 2026-01-31 08:41:53.115 221324 DEBUG nova.compute.manager [req-9153b396-a626-46f4-b0e7-b6574668ebfd req-6c22f92d-92e8-43af-a29b-82daaf4bf791 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:53 np0005603622 nova_compute[221301]: 2026-01-31 08:41:53.115 221324 DEBUG oslo_concurrency.lockutils [req-9153b396-a626-46f4-b0e7-b6574668ebfd req-6c22f92d-92e8-43af-a29b-82daaf4bf791 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:53 np0005603622 nova_compute[221301]: 2026-01-31 08:41:53.116 221324 DEBUG oslo_concurrency.lockutils [req-9153b396-a626-46f4-b0e7-b6574668ebfd req-6c22f92d-92e8-43af-a29b-82daaf4bf791 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:53 np0005603622 nova_compute[221301]: 2026-01-31 08:41:53.116 221324 DEBUG oslo_concurrency.lockutils [req-9153b396-a626-46f4-b0e7-b6574668ebfd req-6c22f92d-92e8-43af-a29b-82daaf4bf791 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:53 np0005603622 nova_compute[221301]: 2026-01-31 08:41:53.116 221324 DEBUG nova.compute.manager [req-9153b396-a626-46f4-b0e7-b6574668ebfd req-6c22f92d-92e8-43af-a29b-82daaf4bf791 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] No waiting events found dispatching network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:41:53 np0005603622 nova_compute[221301]: 2026-01-31 08:41:53.117 221324 WARNING nova.compute.manager [req-9153b396-a626-46f4-b0e7-b6574668ebfd req-6c22f92d-92e8-43af-a29b-82daaf4bf791 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received unexpected event network-vif-plugged-664dc5ff-72ac-4e4a-8581-6071db25143c for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:41:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:53 np0005603622 nova_compute[221301]: 2026-01-31 08:41:53.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.345 221324 DEBUG nova.network.neutron [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.368 221324 DEBUG nova.compute.manager [req-c9a2925b-3e57-44ee-8f16-58c7719a7748 req-2cce3c20-93b3-4063-b073-1388043b4ab4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Received event network-vif-deleted-664dc5ff-72ac-4e4a-8581-6071db25143c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.369 221324 INFO nova.compute.manager [req-c9a2925b-3e57-44ee-8f16-58c7719a7748 req-2cce3c20-93b3-4063-b073-1388043b4ab4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Neutron deleted interface 664dc5ff-72ac-4e4a-8581-6071db25143c; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.369 221324 DEBUG nova.network.neutron [req-c9a2925b-3e57-44ee-8f16-58c7719a7748 req-2cce3c20-93b3-4063-b073-1388043b4ab4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.515 221324 INFO nova.compute.manager [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Took 3.93 seconds to deallocate network for instance.#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.523 221324 DEBUG nova.compute.manager [req-c9a2925b-3e57-44ee-8f16-58c7719a7748 req-2cce3c20-93b3-4063-b073-1388043b4ab4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Detach interface failed, port_id=664dc5ff-72ac-4e4a-8581-6071db25143c, reason: Instance 9aac65c2-63f8-4383-80ed-6600852c6fb5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:41:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:54.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.634 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.634 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.649 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.650 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:54 np0005603622 nova_compute[221301]: 2026-01-31 08:41:54.800 221324 DEBUG oslo_concurrency.processutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:54.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2089850273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:55 np0005603622 nova_compute[221301]: 2026-01-31 08:41:55.256 221324 DEBUG oslo_concurrency.processutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:55 np0005603622 nova_compute[221301]: 2026-01-31 08:41:55.263 221324 DEBUG nova.compute.provider_tree [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:41:55 np0005603622 nova_compute[221301]: 2026-01-31 08:41:55.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:55 np0005603622 nova_compute[221301]: 2026-01-31 08:41:55.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:41:55 np0005603622 nova_compute[221301]: 2026-01-31 08:41:55.518 221324 DEBUG nova.scheduler.client.report [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:41:55 np0005603622 nova_compute[221301]: 2026-01-31 08:41:55.618 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:55 np0005603622 nova_compute[221301]: 2026-01-31 08:41:55.970 221324 INFO nova.scheduler.client.report [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Deleted allocations for instance 9aac65c2-63f8-4383-80ed-6600852c6fb5#033[00m
Jan 31 03:41:56 np0005603622 nova_compute[221301]: 2026-01-31 08:41:56.283 221324 DEBUG oslo_concurrency.lockutils [None req-eea31e0c-d00b-4bec-9198-f614edfb34c6 b6733330b634472ca8c21316f1ee5057 1e29363ca464487b931af54fe14166b1 - - default default] Lock "9aac65c2-63f8-4383-80ed-6600852c6fb5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:41:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:56.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:41:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:56.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:57 np0005603622 nova_compute[221301]: 2026-01-31 08:41:57.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:57 np0005603622 nova_compute[221301]: 2026-01-31 08:41:57.856 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.503 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.503 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.504 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.504 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.505 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:41:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:41:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:41:58.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:41:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.867 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:41:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:41:58 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1563674286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:41:58 np0005603622 nova_compute[221301]: 2026-01-31 08:41:58.954 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:41:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:41:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:41:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:41:58.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.119 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.120 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4414MB free_disk=20.92202377319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.121 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.121 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.617 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.617 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.686 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.737 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.738 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.804 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.891 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:41:59 np0005603622 nova_compute[221301]: 2026-01-31 08:41:59.942 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:42:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:42:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1972078875' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:42:00 np0005603622 nova_compute[221301]: 2026-01-31 08:42:00.423 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:42:00 np0005603622 nova_compute[221301]: 2026-01-31 08:42:00.428 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:42:00 np0005603622 nova_compute[221301]: 2026-01-31 08:42:00.537 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:42:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:00.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:00 np0005603622 nova_compute[221301]: 2026-01-31 08:42:00.786 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:42:00 np0005603622 nova_compute[221301]: 2026-01-31 08:42:00.787 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e355 e355: 3 total, 3 up, 3 in
Jan 31 03:42:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:01.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:02 np0005603622 nova_compute[221301]: 2026-01-31 08:42:02.118 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769848907.117037, 9aac65c2-63f8-4383-80ed-6600852c6fb5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:42:02 np0005603622 nova_compute[221301]: 2026-01-31 08:42:02.119 221324 INFO nova.compute.manager [-] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:42:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:42:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 47K writes, 190K keys, 47K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.04 MB/s#012Cumulative WAL: 47K writes, 17K syncs, 2.76 writes per sync, written: 0.18 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6691 writes, 28K keys, 6691 commit groups, 1.0 writes per commit group, ingest: 26.53 MB, 0.04 MB/s#012Interval WAL: 6691 writes, 2566 syncs, 2.61 writes per sync, written: 0.03 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:42:02 np0005603622 nova_compute[221301]: 2026-01-31 08:42:02.601 221324 DEBUG nova.compute.manager [None req-c67e8418-9362-4428-b8a8-f1537c84d7ce - - - - - -] [instance: 9aac65c2-63f8-4383-80ed-6600852c6fb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:42:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:02.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:02 np0005603622 nova_compute[221301]: 2026-01-31 08:42:02.789 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:02 np0005603622 nova_compute[221301]: 2026-01-31 08:42:02.789 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:42:02 np0005603622 nova_compute[221301]: 2026-01-31 08:42:02.790 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:42:02 np0005603622 nova_compute[221301]: 2026-01-31 08:42:02.859 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:03.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:03 np0005603622 nova_compute[221301]: 2026-01-31 08:42:03.094 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:42:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e355 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:03 np0005603622 nova_compute[221301]: 2026-01-31 08:42:03.867 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:04 np0005603622 nova_compute[221301]: 2026-01-31 08:42:04.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:04.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:05.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:06 np0005603622 nova_compute[221301]: 2026-01-31 08:42:06.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:06.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:06 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #52. Immutable memtables: 8.
Jan 31 03:42:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:07.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:07 np0005603622 nova_compute[221301]: 2026-01-31 08:42:07.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:08 np0005603622 nova_compute[221301]: 2026-01-31 08:42:08.562 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:08.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e355 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:08 np0005603622 nova_compute[221301]: 2026-01-31 08:42:08.869 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:09.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e356 e356: 3 total, 3 up, 3 in
Jan 31 03:42:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:10.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:11.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e357 e357: 3 total, 3 up, 3 in
Jan 31 03:42:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:12.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:12 np0005603622 nova_compute[221301]: 2026-01-31 08:42:12.865 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:13.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e357 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:13 np0005603622 nova_compute[221301]: 2026-01-31 08:42:13.871 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:14.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:15.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:16.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 e358: 3 total, 3 up, 3 in
Jan 31 03:42:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:17.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:17 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:17 np0005603622 nova_compute[221301]: 2026-01-31 08:42:17.868 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:42:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:42:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:18.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:18 np0005603622 nova_compute[221301]: 2026-01-31 08:42:18.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:19.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:20.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:21.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:21 np0005603622 podman[280746]: 2026-01-31 08:42:21.149735693 +0000 UTC m=+0.062700114 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:42:21 np0005603622 podman[280745]: 2026-01-31 08:42:21.169326716 +0000 UTC m=+0.088650037 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:42:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:22.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:22 np0005603622 nova_compute[221301]: 2026-01-31 08:42:22.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:23.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:23 np0005603622 nova_compute[221301]: 2026-01-31 08:42:23.874 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:24.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:25.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:42:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:26.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:27.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:42:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.0 total, 600.0 interval#012Cumulative writes: 13K writes, 66K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.03 MB/s#012Cumulative WAL: 13K writes, 13K syncs, 1.00 writes per sync, written: 0.13 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1679 writes, 8450 keys, 1679 commit groups, 1.0 writes per commit group, ingest: 16.52 MB, 0.03 MB/s#012Interval WAL: 1680 writes, 1680 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     40.4      2.01              0.26        41    0.049       0      0       0.0       0.0#012  L6      1/0   11.37 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   4.9     84.7     72.0      5.53              1.31        40    0.138    270K    21K       0.0       0.0#012 Sum      1/0   11.37 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   5.9     62.1     63.6      7.54              1.57        81    0.093    270K    21K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.8     60.0     61.0      1.47              0.30        14    0.105     63K   3612       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0     84.7     72.0      5.53              1.31        40    0.138    270K    21K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     40.4      2.01              0.26        40    0.050       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4800.0 total, 600.0 interval#012Flush(GB): cumulative 0.079, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.47 GB write, 0.10 MB/s write, 0.46 GB read, 0.10 MB/s read, 7.5 seconds#012Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.09 GB read, 0.15 MB/s read, 1.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 50.54 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.000651 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2873,48.53 MB,15.9624%) FilterBlock(81,761.98 KB,0.244778%) IndexBlock(81,1.27 MB,0.417263%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:42:27 np0005603622 nova_compute[221301]: 2026-01-31 08:42:27.875 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:28.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:28 np0005603622 nova_compute[221301]: 2026-01-31 08:42:28.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:29.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:30.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:42:30.898 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:42:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:42:30.899 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:42:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:42:30.899 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:32.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:32 np0005603622 nova_compute[221301]: 2026-01-31 08:42:32.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:33.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:33 np0005603622 nova_compute[221301]: 2026-01-31 08:42:33.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:34.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:35.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:36.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.706183) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956706372, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 1140, "num_deletes": 256, "total_data_size": 2383082, "memory_usage": 2420816, "flush_reason": "Manual Compaction"}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956749075, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 1571194, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65379, "largest_seqno": 66514, "table_properties": {"data_size": 1565959, "index_size": 2694, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11602, "raw_average_key_size": 20, "raw_value_size": 1555436, "raw_average_value_size": 2748, "num_data_blocks": 117, "num_entries": 566, "num_filter_entries": 566, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848876, "oldest_key_time": 1769848876, "file_creation_time": 1769848956, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 42954 microseconds, and 6318 cpu microseconds.
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.749144) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 1571194 bytes OK
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.749171) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.753686) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.753702) EVENT_LOG_v1 {"time_micros": 1769848956753696, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.753720) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 2377534, prev total WAL file size 2379479, number of live WAL files 2.
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.754430) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(1534KB)], [132(11MB)]
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956754516, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 13498198, "oldest_snapshot_seqno": -1}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8660 keys, 11617121 bytes, temperature: kUnknown
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956853995, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 11617121, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11560402, "index_size": 33950, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21701, "raw_key_size": 228746, "raw_average_key_size": 26, "raw_value_size": 11407602, "raw_average_value_size": 1317, "num_data_blocks": 1297, "num_entries": 8660, "num_filter_entries": 8660, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769848956, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.854340) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 11617121 bytes
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.860357) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.5 rd, 116.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.4 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(16.0) write-amplify(7.4) OK, records in: 9190, records dropped: 530 output_compression: NoCompression
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.860375) EVENT_LOG_v1 {"time_micros": 1769848956860367, "job": 84, "event": "compaction_finished", "compaction_time_micros": 99583, "compaction_time_cpu_micros": 44139, "output_level": 6, "num_output_files": 1, "total_output_size": 11617121, "num_input_records": 9190, "num_output_records": 8660, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956860588, "job": 84, "event": "table_file_deletion", "file_number": 134}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769848956861657, "job": 84, "event": "table_file_deletion", "file_number": 132}
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.754220) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.861777) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.861782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.861783) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.861785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:42:36.861786) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:42:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:37.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:37 np0005603622 nova_compute[221301]: 2026-01-31 08:42:37.883 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:38.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:38 np0005603622 nova_compute[221301]: 2026-01-31 08:42:38.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:39.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:40.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:41.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:42.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:42 np0005603622 nova_compute[221301]: 2026-01-31 08:42:42.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:43.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:43 np0005603622 nova_compute[221301]: 2026-01-31 08:42:43.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:44.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:45.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.170 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.171 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.172 221324 INFO nova.compute.manager [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Unshelving#033[00m
Jan 31 03:42:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:42:46.638 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.638 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:42:46.640 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:42:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:46.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.749 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.750 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.760 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'pci_requests' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.863 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'numa_topology' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.961 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:42:46 np0005603622 nova_compute[221301]: 2026-01-31 08:42:46.962 221324 INFO nova.compute.claims [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:42:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:47.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:47 np0005603622 nova_compute[221301]: 2026-01-31 08:42:47.736 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:42:47 np0005603622 nova_compute[221301]: 2026-01-31 08:42:47.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:42:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/54254155' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:42:48 np0005603622 nova_compute[221301]: 2026-01-31 08:42:48.186 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:42:48 np0005603622 nova_compute[221301]: 2026-01-31 08:42:48.193 221324 DEBUG nova.compute.provider_tree [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:42:48 np0005603622 nova_compute[221301]: 2026-01-31 08:42:48.222 221324 DEBUG nova.scheduler.client.report [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:42:48 np0005603622 nova_compute[221301]: 2026-01-31 08:42:48.269 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:42:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:48.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:48 np0005603622 nova_compute[221301]: 2026-01-31 08:42:48.883 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:49.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:50.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:42:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:51.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:42:51 np0005603622 nova_compute[221301]: 2026-01-31 08:42:51.293 221324 INFO nova.network.neutron [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating port b122a11a-5b9d-4b27-a9c3-8327cb8162ae with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 31 03:42:52 np0005603622 podman[280863]: 2026-01-31 08:42:52.152761661 +0000 UTC m=+0.059812197 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 31 03:42:52 np0005603622 podman[280862]: 2026-01-31 08:42:52.201246905 +0000 UTC m=+0.110431938 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20260127, config_id=ovn_controller)
Jan 31 03:42:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:42:52.641 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:42:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:52.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:52 np0005603622 nova_compute[221301]: 2026-01-31 08:42:52.891 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:53.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:53 np0005603622 nova_compute[221301]: 2026-01-31 08:42:53.884 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:54.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:55.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:55 np0005603622 nova_compute[221301]: 2026-01-31 08:42:55.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:56 np0005603622 nova_compute[221301]: 2026-01-31 08:42:56.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:56 np0005603622 nova_compute[221301]: 2026-01-31 08:42:56.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:56 np0005603622 nova_compute[221301]: 2026-01-31 08:42:56.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:42:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:56.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:57.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:42:57 np0005603622 nova_compute[221301]: 2026-01-31 08:42:57.118 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:42:57 np0005603622 nova_compute[221301]: 2026-01-31 08:42:57.119 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:42:57 np0005603622 nova_compute[221301]: 2026-01-31 08:42:57.119 221324 DEBUG nova.network.neutron [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:42:57 np0005603622 nova_compute[221301]: 2026-01-31 08:42:57.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:58 np0005603622 nova_compute[221301]: 2026-01-31 08:42:58.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:58 np0005603622 nova_compute[221301]: 2026-01-31 08:42:58.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:42:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:42:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:42:58.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:42:58 np0005603622 nova_compute[221301]: 2026-01-31 08:42:58.772 221324 DEBUG nova.compute.manager [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-changed-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:42:58 np0005603622 nova_compute[221301]: 2026-01-31 08:42:58.772 221324 DEBUG nova.compute.manager [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Refreshing instance network info cache due to event network-changed-b122a11a-5b9d-4b27-a9c3-8327cb8162ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:42:58 np0005603622 nova_compute[221301]: 2026-01-31 08:42:58.773 221324 DEBUG oslo_concurrency.lockutils [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:42:58 np0005603622 nova_compute[221301]: 2026-01-31 08:42:58.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:42:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:42:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:42:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:42:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:42:59.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:00 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:00Z|00695|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:43:00 np0005603622 nova_compute[221301]: 2026-01-31 08:43:00.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:00 np0005603622 nova_compute[221301]: 2026-01-31 08:43:00.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:43:00 np0005603622 nova_compute[221301]: 2026-01-31 08:43:00.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:43:00 np0005603622 nova_compute[221301]: 2026-01-31 08:43:00.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:43:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:00.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:43:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:01.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.370 221324 DEBUG nova.network.neutron [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.481 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.483 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.483 221324 INFO nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Creating image(s)#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.512 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.516 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'trusted_certs' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.518 221324 DEBUG oslo_concurrency.lockutils [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.518 221324 DEBUG nova.network.neutron [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Refreshing network info cache for port b122a11a-5b9d-4b27-a9c3-8327cb8162ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.591 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.645 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.651 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "983a360b5198b68100c9d8340d93fd1bf344afca" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.653 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "983a360b5198b68100c9d8340d93fd1bf344afca" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:02.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:02 np0005603622 nova_compute[221301]: 2026-01-31 08:43:02.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:03.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:03 np0005603622 nova_compute[221301]: 2026-01-31 08:43:03.942 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:04 np0005603622 nova_compute[221301]: 2026-01-31 08:43:04.514 221324 DEBUG nova.virt.libvirt.imagebackend [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/269becbf-04a2-4537-bd31-66899150ed70/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/269becbf-04a2-4537-bd31-66899150ed70/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:43:04 np0005603622 nova_compute[221301]: 2026-01-31 08:43:04.595 221324 DEBUG nova.virt.libvirt.imagebackend [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/269becbf-04a2-4537-bd31-66899150ed70/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 03:43:04 np0005603622 nova_compute[221301]: 2026-01-31 08:43:04.595 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] cloning images/269becbf-04a2-4537-bd31-66899150ed70@snap to None/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 03:43:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:04.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:04 np0005603622 nova_compute[221301]: 2026-01-31 08:43:04.729 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "983a360b5198b68100c9d8340d93fd1bf344afca" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:04 np0005603622 nova_compute[221301]: 2026-01-31 08:43:04.883 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'migration_context' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.043 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] flattening vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 03:43:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:43:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:05.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.424 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Image rbd:vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.426 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.427 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Ensure instance console log exists: /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.427 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.428 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.428 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.431 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Start _get_guest_xml network_info=[{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:41:51Z,direct_url=<?>,disk_format='raw',id=269becbf-04a2-4537-bd31-66899150ed70,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1679208816-shelved',owner='621c17d53cba46d386de8efb560a988e',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:42:14Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.437 221324 WARNING nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.444 221324 DEBUG nova.virt.libvirt.host [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.445 221324 DEBUG nova.virt.libvirt.host [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.448 221324 DEBUG nova.virt.libvirt.host [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.449 221324 DEBUG nova.virt.libvirt.host [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.451 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.452 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T08:41:51Z,direct_url=<?>,disk_format='raw',id=269becbf-04a2-4537-bd31-66899150ed70,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1679208816-shelved',owner='621c17d53cba46d386de8efb560a988e',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T08:42:14Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.452 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.453 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.453 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.454 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.454 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.454 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.455 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.455 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.455 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.456 221324 DEBUG nova.virt.hardware [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.456 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'vcpu_model' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.540 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:43:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/278512112' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.967 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:05 np0005603622 nova_compute[221301]: 2026-01-31 08:43:05.997 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.001 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:43:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1314583766' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.465 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.468 221324 DEBUG nova.virt.libvirt.vif [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:38:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1679208816',display_name='tempest-ServersNegativeTestJSON-server-1679208816',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1679208816',id=148,image_ref='269becbf-04a2-4537-bd31-66899150ed70',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:38:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='621c17d53cba46d386de8efb560a988e',ramdisk_id='',reservation_id='r-kclqkoza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-183161027',owner_user_name='tempest-ServersNegativeTestJSON-183161027-project-member',shelved_at='2026-01-31T08:42:16.247982',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='269becbf-04a2-4537-bd31-66899150ed70'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:42:46Z,user_data=None,user_id='516e093a00a44667ba1308900be70d8d',uuid=bd87e542-0f7b-453e-b8d1-643ad6fb64f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.469 221324 DEBUG nova.network.os_vif_util [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converting VIF {"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.470 221324 DEBUG nova.network.os_vif_util [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.472 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'pci_devices' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.525 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <uuid>bd87e542-0f7b-453e-b8d1-643ad6fb64f0</uuid>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <name>instance-00000094</name>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServersNegativeTestJSON-server-1679208816</nova:name>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:43:05</nova:creationTime>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:user uuid="516e093a00a44667ba1308900be70d8d">tempest-ServersNegativeTestJSON-183161027-project-member</nova:user>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:project uuid="621c17d53cba46d386de8efb560a988e">tempest-ServersNegativeTestJSON-183161027</nova:project>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="269becbf-04a2-4537-bd31-66899150ed70"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <nova:port uuid="b122a11a-5b9d-4b27-a9c3-8327cb8162ae">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <entry name="serial">bd87e542-0f7b-453e-b8d1-643ad6fb64f0</entry>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <entry name="uuid">bd87e542-0f7b-453e-b8d1-643ad6fb64f0</entry>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:c1:83:50"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <target dev="tapb122a11a-5b"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/console.log" append="off"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:43:06 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:43:06 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:43:06 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:43:06 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.526 221324 DEBUG nova.compute.manager [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Preparing to wait for external event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.526 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.526 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.527 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.527 221324 DEBUG nova.virt.libvirt.vif [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:38:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1679208816',display_name='tempest-ServersNegativeTestJSON-server-1679208816',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1679208816',id=148,image_ref='269becbf-04a2-4537-bd31-66899150ed70',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:38:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='621c17d53cba46d386de8efb560a988e',ramdisk_id='',reservation_id='r-kclqkoza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-183161027',owner_user_name='tempest-ServersNegativeTestJSON-183161027-project-member',shelved_at='2026-01-31T08:42:16.247982',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='269becbf-04a2-4537-bd31-66899150ed70'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:42:46Z,user_data=None,user_id='516e093a00a44667ba1308900be70d8d',uuid=bd87e542-0f7b-453e-b8d1-643ad6fb64f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.528 221324 DEBUG nova.network.os_vif_util [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converting VIF {"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.528 221324 DEBUG nova.network.os_vif_util [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.528 221324 DEBUG os_vif [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.529 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.529 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.530 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.533 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.533 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb122a11a-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.534 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb122a11a-5b, col_values=(('external_ids', {'iface-id': 'b122a11a-5b9d-4b27-a9c3-8327cb8162ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:83:50', 'vm-uuid': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.535 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:06 np0005603622 NetworkManager[49080]: <info>  [1769848986.5361] manager: (tapb122a11a-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.542 221324 INFO os_vif [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b')#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.658 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.658 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.658 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] No VIF found with MAC fa:16:3e:c1:83:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.659 221324 INFO nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Using config drive#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.690 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:06.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.739 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'ec2_ids' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:06 np0005603622 nova_compute[221301]: 2026-01-31 08:43:06.917 221324 DEBUG nova.objects.instance [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'keypairs' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:43:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:07.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:43:07 np0005603622 nova_compute[221301]: 2026-01-31 08:43:07.206 221324 DEBUG nova.network.neutron [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updated VIF entry in instance network info cache for port b122a11a-5b9d-4b27-a9c3-8327cb8162ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:43:07 np0005603622 nova_compute[221301]: 2026-01-31 08:43:07.207 221324 DEBUG nova.network.neutron [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:07 np0005603622 nova_compute[221301]: 2026-01-31 08:43:07.256 221324 DEBUG oslo_concurrency.lockutils [req-8811a8fa-6071-4c12-896b-eaff5c9ac653 req-13112869-f6c8-4ee4-858d-f3ce0ad07abb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:07 np0005603622 nova_compute[221301]: 2026-01-31 08:43:07.258 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:43:07 np0005603622 nova_compute[221301]: 2026-01-31 08:43:07.258 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:43:07 np0005603622 nova_compute[221301]: 2026-01-31 08:43:07.259 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:08.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:08 np0005603622 nova_compute[221301]: 2026-01-31 08:43:08.945 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:09.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.331 221324 INFO nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Creating config drive at /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config#033[00m
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.336 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3m2382d5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.470 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3m2382d5" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.511 221324 DEBUG nova.storage.rbd_utils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] rbd image bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.516 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.721 221324 DEBUG oslo_concurrency.processutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config bd87e542-0f7b-453e-b8d1-643ad6fb64f0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.722 221324 INFO nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Deleting local config drive /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0/disk.config because it was imported into RBD.#033[00m
Jan 31 03:43:09 np0005603622 virtqemud[220925]: End of file while reading data: Input/output error
Jan 31 03:43:09 np0005603622 kernel: tapb122a11a-5b: entered promiscuous mode
Jan 31 03:43:09 np0005603622 NetworkManager[49080]: <info>  [1769848989.7856] manager: (tapb122a11a-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Jan 31 03:43:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:09Z|00696|binding|INFO|Claiming lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae for this chassis.
Jan 31 03:43:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:09Z|00697|binding|INFO|b122a11a-5b9d-4b27-a9c3-8327cb8162ae: Claiming fa:16:3e:c1:83:50 10.100.0.11
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.784 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.790 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:09 np0005603622 systemd-udevd[281256]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.817 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:09 np0005603622 systemd-machined[190406]: New machine qemu-71-instance-00000094.
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.823 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:09 np0005603622 NetworkManager[49080]: <info>  [1769848989.8266] device (tapb122a11a-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:43:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:09Z|00698|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae ovn-installed in OVS
Jan 31 03:43:09 np0005603622 NetworkManager[49080]: <info>  [1769848989.8273] device (tapb122a11a-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:43:09 np0005603622 nova_compute[221301]: 2026-01-31 08:43:09.828 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:09 np0005603622 systemd[1]: Started Virtual Machine qemu-71-instance-00000094.
Jan 31 03:43:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:09Z|00699|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae up in Southbound
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.866 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:83:50 10.100.0.11'], port_security=['fa:16:3e:c1:83:50 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '621c17d53cba46d386de8efb560a988e', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1c8dcf47-c169-4871-843e-ae38c0fc69f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bda2ce92-ce79-4f8b-b120-fd83adc645ef, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b122a11a-5b9d-4b27-a9c3-8327cb8162ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.867 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b122a11a-5b9d-4b27-a9c3-8327cb8162ae in datapath 550cf3a2-62ab-424d-afc0-3148a4a687ee bound to our chassis#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.868 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 550cf3a2-62ab-424d-afc0-3148a4a687ee#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.878 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c0dbb5b6-08db-4aae-8823-9462bbe7e0e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.879 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap550cf3a2-61 in ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.883 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap550cf3a2-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.883 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a281da77-c84d-4934-994d-423a2d01c603]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.884 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fd30bca5-e215-4bfc-946d-246e81aecdad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.893 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[855e4aa4-b393-4cd8-bd57-1de78f619441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.903 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[68778875-5903-41c4-bca8-a6750df6e229]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.921 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fb58debf-55d1-4601-8e2a-9baf80f8db01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 NetworkManager[49080]: <info>  [1769848989.9272] manager: (tap550cf3a2-60): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Jan 31 03:43:09 np0005603622 systemd-udevd[281259]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.926 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7bba4aad-e055-4430-bf2d-cd4267401bcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.949 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[83ce002a-7004-4125-b382-bb84cd08b74a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.951 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ab9475db-e1ab-4ce4-a7ea-4312e47eda32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 NetworkManager[49080]: <info>  [1769848989.9669] device (tap550cf3a2-60): carrier: link connected
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.973 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c267c8-9002-4857-a20a-b9f76c308305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.985 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7763cb4f-75fd-4656-9b66-20fc6ea763bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap550cf3a2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:fc:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 824130, 'reachable_time': 23206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281290, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:09.999 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfc3bec-8225-4513-bf68-6bce2b840c50]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:fc48'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 824130, 'tstamp': 824130}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281298, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.014 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa935b3-8654-44a8-acc7-3ec47ed12b27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap550cf3a2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:fc:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 824130, 'reachable_time': 23206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281307, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.039 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7821c5d7-d45c-4ebe-a8a0-eb1c4983ffca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.075 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e4adf7-667f-40a8-a5df-d7c48bd886a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.076 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap550cf3a2-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.077 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.077 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap550cf3a2-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:10 np0005603622 NetworkManager[49080]: <info>  [1769848990.0797] manager: (tap550cf3a2-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 31 03:43:10 np0005603622 kernel: tap550cf3a2-60: entered promiscuous mode
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.079 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.082 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap550cf3a2-60, col_values=(('external_ids', {'iface-id': '9f1ac82b-bf6c-400f-a03c-b15ad5392890'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:10Z|00700|binding|INFO|Releasing lport 9f1ac82b-bf6c-400f-a03c-b15ad5392890 from this chassis (sb_readonly=0)
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.091 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.092 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[83aeab14-04e7-47f5-b82b-f3584c9bda41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.093 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-550cf3a2-62ab-424d-afc0-3148a4a687ee
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 550cf3a2-62ab-424d-afc0-3148a4a687ee
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:43:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:10.094 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'env', 'PROCESS_TAG=haproxy-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/550cf3a2-62ab-424d-afc0-3148a4a687ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.177 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848990.1763928, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.177 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Started (Lifecycle Event)#033[00m
Jan 31 03:43:10 np0005603622 podman[281366]: 2026-01-31 08:43:10.437475287 +0000 UTC m=+0.059391586 container create 2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 03:43:10 np0005603622 systemd[1]: Started libpod-conmon-2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd.scope.
Jan 31 03:43:10 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:43:10 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ea23925dc00fccc1e49d108c12cb3c14d1dca54d68fd49935f12b2635bca06c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:43:10 np0005603622 podman[281366]: 2026-01-31 08:43:10.407455726 +0000 UTC m=+0.029372015 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:43:10 np0005603622 podman[281366]: 2026-01-31 08:43:10.51443928 +0000 UTC m=+0.136355579 container init 2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:43:10 np0005603622 podman[281366]: 2026-01-31 08:43:10.518065157 +0000 UTC m=+0.139981426 container start 2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.523 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.527 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848990.1765947, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.528 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:43:10 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281381]: [NOTICE]   (281385) : New worker (281387) forked
Jan 31 03:43:10 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281381]: [NOTICE]   (281385) : Loading success.
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.692 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:10 np0005603622 nova_compute[221301]: 2026-01-31 08:43:10.695 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:10.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:11 np0005603622 nova_compute[221301]: 2026-01-31 08:43:11.054 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:43:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:11.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:11 np0005603622 nova_compute[221301]: 2026-01-31 08:43:11.536 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.174 221324 DEBUG nova.compute.manager [req-a35fb3cd-2bd1-4012-8af4-ccbe93724069 req-fded51c9-ed80-4d8c-8505-168c9e1afda4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.175 221324 DEBUG oslo_concurrency.lockutils [req-a35fb3cd-2bd1-4012-8af4-ccbe93724069 req-fded51c9-ed80-4d8c-8505-168c9e1afda4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.175 221324 DEBUG oslo_concurrency.lockutils [req-a35fb3cd-2bd1-4012-8af4-ccbe93724069 req-fded51c9-ed80-4d8c-8505-168c9e1afda4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.175 221324 DEBUG oslo_concurrency.lockutils [req-a35fb3cd-2bd1-4012-8af4-ccbe93724069 req-fded51c9-ed80-4d8c-8505-168c9e1afda4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.176 221324 DEBUG nova.compute.manager [req-a35fb3cd-2bd1-4012-8af4-ccbe93724069 req-fded51c9-ed80-4d8c-8505-168c9e1afda4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Processing event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.177 221324 DEBUG nova.compute.manager [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.181 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769848992.1814559, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.182 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.184 221324 DEBUG nova.virt.libvirt.driver [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.187 221324 INFO nova.virt.libvirt.driver [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance spawned successfully.#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.244 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.248 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:12 np0005603622 nova_compute[221301]: 2026-01-31 08:43:12.297 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:43:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:12.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:13.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:13 np0005603622 nova_compute[221301]: 2026-01-31 08:43:13.948 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:14 np0005603622 nova_compute[221301]: 2026-01-31 08:43:14.591 221324 DEBUG nova.compute.manager [req-b94b05e2-ccac-4be7-9c4a-fbccd4b93b27 req-86e79cdd-a9dd-43f0-9a09-413d9583e019 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:14 np0005603622 nova_compute[221301]: 2026-01-31 08:43:14.591 221324 DEBUG oslo_concurrency.lockutils [req-b94b05e2-ccac-4be7-9c4a-fbccd4b93b27 req-86e79cdd-a9dd-43f0-9a09-413d9583e019 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:14 np0005603622 nova_compute[221301]: 2026-01-31 08:43:14.592 221324 DEBUG oslo_concurrency.lockutils [req-b94b05e2-ccac-4be7-9c4a-fbccd4b93b27 req-86e79cdd-a9dd-43f0-9a09-413d9583e019 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:14 np0005603622 nova_compute[221301]: 2026-01-31 08:43:14.592 221324 DEBUG oslo_concurrency.lockutils [req-b94b05e2-ccac-4be7-9c4a-fbccd4b93b27 req-86e79cdd-a9dd-43f0-9a09-413d9583e019 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:14 np0005603622 nova_compute[221301]: 2026-01-31 08:43:14.592 221324 DEBUG nova.compute.manager [req-b94b05e2-ccac-4be7-9c4a-fbccd4b93b27 req-86e79cdd-a9dd-43f0-9a09-413d9583e019 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:14 np0005603622 nova_compute[221301]: 2026-01-31 08:43:14.592 221324 WARNING nova.compute.manager [req-b94b05e2-ccac-4be7-9c4a-fbccd4b93b27 req-86e79cdd-a9dd-43f0-9a09-413d9583e019 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Jan 31 03:43:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:14.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e359 e359: 3 total, 3 up, 3 in
Jan 31 03:43:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:43:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:15.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:43:15 np0005603622 nova_compute[221301]: 2026-01-31 08:43:15.725 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:15 np0005603622 nova_compute[221301]: 2026-01-31 08:43:15.966 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:15 np0005603622 nova_compute[221301]: 2026-01-31 08:43:15.967 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:43:15 np0005603622 nova_compute[221301]: 2026-01-31 08:43:15.968 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:15 np0005603622 nova_compute[221301]: 2026-01-31 08:43:15.969 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:15 np0005603622 nova_compute[221301]: 2026-01-31 08:43:15.969 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.061 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.062 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.062 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.062 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.063 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:43:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3708569498' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.539 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.541 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:16.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.804 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.805 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.991 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.992 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4215MB free_disk=20.897178649902344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.992 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:16 np0005603622 nova_compute[221301]: 2026-01-31 08:43:16.993 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:17.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:17 np0005603622 nova_compute[221301]: 2026-01-31 08:43:17.802 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:43:17 np0005603622 nova_compute[221301]: 2026-01-31 08:43:17.803 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:43:17 np0005603622 nova_compute[221301]: 2026-01-31 08:43:17.803 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.200 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.224 221324 DEBUG nova.compute.manager [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:43:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1301523886' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.650 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.657 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.676 221324 DEBUG oslo_concurrency.lockutils [None req-5c53c47f-93f8-477c-9e2c-166d72d1b93f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 32.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:18.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.741 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.747 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.747 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:18 np0005603622 nova_compute[221301]: 2026-01-31 08:43:18.952 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:43:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:19.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:43:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:20.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:21.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 e360: 3 total, 3 up, 3 in
Jan 31 03:43:21 np0005603622 nova_compute[221301]: 2026-01-31 08:43:21.545 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:22.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:23.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:23 np0005603622 podman[281442]: 2026-01-31 08:43:23.296836771 +0000 UTC m=+0.221729228 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:43:23 np0005603622 podman[281441]: 2026-01-31 08:43:23.322131876 +0000 UTC m=+0.249631883 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:43:23 np0005603622 nova_compute[221301]: 2026-01-31 08:43:23.953 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:24.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:25.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:25 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:25Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:83:50 10.100.0.11
Jan 31 03:43:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:43:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:43:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:43:26 np0005603622 nova_compute[221301]: 2026-01-31 08:43:26.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:26.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:27.009 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:27 np0005603622 nova_compute[221301]: 2026-01-31 08:43:27.010 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:27.011 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:43:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:27.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:43:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:28.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:43:28 np0005603622 nova_compute[221301]: 2026-01-31 08:43:28.988 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:29.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:30.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:30.900 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:30.901 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:30.901 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:31.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:31 np0005603622 radosgw[84338]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 31 03:43:31 np0005603622 nova_compute[221301]: 2026-01-31 08:43:31.571 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:31 np0005603622 nova_compute[221301]: 2026-01-31 08:43:31.945 221324 DEBUG nova.objects.instance [None req-41e8ea0e-b035-4448-a8d6-9406519be43f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'pci_devices' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:32 np0005603622 nova_compute[221301]: 2026-01-31 08:43:32.238 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849012.2381866, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:32 np0005603622 nova_compute[221301]: 2026-01-31 08:43:32.239 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:43:32 np0005603622 nova_compute[221301]: 2026-01-31 08:43:32.509 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:32 np0005603622 nova_compute[221301]: 2026-01-31 08:43:32.516 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:32.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:32 np0005603622 kernel: tapb122a11a-5b (unregistering): left promiscuous mode
Jan 31 03:43:32 np0005603622 NetworkManager[49080]: <info>  [1769849012.9909] device (tapb122a11a-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:43:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:32Z|00701|binding|INFO|Releasing lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae from this chassis (sb_readonly=0)
Jan 31 03:43:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:32Z|00702|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae down in Southbound
Jan 31 03:43:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:32Z|00703|binding|INFO|Removing iface tapb122a11a-5b ovn-installed in OVS
Jan 31 03:43:32 np0005603622 nova_compute[221301]: 2026-01-31 08:43:32.994 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.006 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:33 np0005603622 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000094.scope: Deactivated successfully.
Jan 31 03:43:33 np0005603622 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000094.scope: Consumed 13.325s CPU time.
Jan 31 03:43:33 np0005603622 systemd-machined[190406]: Machine qemu-71-instance-00000094 terminated.
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.086 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.102 221324 DEBUG nova.compute.manager [None req-41e8ea0e-b035-4448-a8d6-9406519be43f 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:33.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.256 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.547 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:83:50 10.100.0.11'], port_security=['fa:16:3e:c1:83:50 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '621c17d53cba46d386de8efb560a988e', 'neutron:revision_number': '9', 'neutron:security_group_ids': '1c8dcf47-c169-4871-843e-ae38c0fc69f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bda2ce92-ce79-4f8b-b120-fd83adc645ef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b122a11a-5b9d-4b27-a9c3-8327cb8162ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.549 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b122a11a-5b9d-4b27-a9c3-8327cb8162ae in datapath 550cf3a2-62ab-424d-afc0-3148a4a687ee unbound from our chassis#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.552 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 550cf3a2-62ab-424d-afc0-3148a4a687ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.554 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e40e8b4e-45a8-4bbb-bdc9-83bfbeded801]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.555 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee namespace which is not needed anymore#033[00m
Jan 31 03:43:33 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281381]: [NOTICE]   (281385) : haproxy version is 2.8.14-c23fe91
Jan 31 03:43:33 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281381]: [NOTICE]   (281385) : path to executable is /usr/sbin/haproxy
Jan 31 03:43:33 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281381]: [WARNING]  (281385) : Exiting Master process...
Jan 31 03:43:33 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281381]: [ALERT]    (281385) : Current worker (281387) exited with code 143 (Terminated)
Jan 31 03:43:33 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281381]: [WARNING]  (281385) : All workers exited. Exiting... (0)
Jan 31 03:43:33 np0005603622 systemd[1]: libpod-2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd.scope: Deactivated successfully.
Jan 31 03:43:33 np0005603622 podman[281651]: 2026-01-31 08:43:33.707266877 +0000 UTC m=+0.057699120 container died 2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:43:33 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd-userdata-shm.mount: Deactivated successfully.
Jan 31 03:43:33 np0005603622 systemd[1]: var-lib-containers-storage-overlay-6ea23925dc00fccc1e49d108c12cb3c14d1dca54d68fd49935f12b2635bca06c-merged.mount: Deactivated successfully.
Jan 31 03:43:33 np0005603622 podman[281651]: 2026-01-31 08:43:33.796874919 +0000 UTC m=+0.147307152 container cleanup 2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:43:33 np0005603622 systemd[1]: libpod-conmon-2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd.scope: Deactivated successfully.
Jan 31 03:43:33 np0005603622 podman[281733]: 2026-01-31 08:43:33.87187488 +0000 UTC m=+0.055465111 container remove 2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.875 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f63bb00c-c1ee-4240-aa7a-a066a654bbd5]: (4, ('Sat Jan 31 08:43:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee (2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd)\n2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd\nSat Jan 31 08:43:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee (2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd)\n2dafdaa82baef22b928fa1f48ada65ef8fdbb1bb56cdbc5aab8c036d851ca8cd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.877 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5691efcb-ea01-4ef3-baa3-c07646bb7ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.878 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap550cf3a2-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.881 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:33 np0005603622 kernel: tap550cf3a2-60: left promiscuous mode
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.892 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc82f89-f628-4870-a19d-4c6f0766d2b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.913 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[44726868-5706-41b7-8d6e-b5df6c0e8d54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.914 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[77b67308-afd7-47aa-8dda-70c5ef81f086]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.925 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[005a3d3b-74c0-4c00-84f4-6aaecd714595]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 824125, 'reachable_time': 29154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281752, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 systemd[1]: run-netns-ovnmeta\x2d550cf3a2\x2d62ab\x2d424d\x2dafc0\x2d3148a4a687ee.mount: Deactivated successfully.
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.930 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:43:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:33.930 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae13eab-9c71-4c87-ac27-4a0ca33cdbd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:33 np0005603622 nova_compute[221301]: 2026-01-31 08:43:33.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:43:34 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:43:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:34.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:35.013 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:35.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:35 np0005603622 nova_compute[221301]: 2026-01-31 08:43:35.745 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:36 np0005603622 nova_compute[221301]: 2026-01-31 08:43:36.574 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:36.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:37.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:38.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:38 np0005603622 nova_compute[221301]: 2026-01-31 08:43:38.992 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:39.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.369 221324 DEBUG nova.compute.manager [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.370 221324 DEBUG oslo_concurrency.lockutils [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.370 221324 DEBUG oslo_concurrency.lockutils [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.370 221324 DEBUG oslo_concurrency.lockutils [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.370 221324 DEBUG nova.compute.manager [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.371 221324 WARNING nova.compute.manager [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.371 221324 DEBUG nova.compute.manager [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.371 221324 DEBUG oslo_concurrency.lockutils [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.371 221324 DEBUG oslo_concurrency.lockutils [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.372 221324 DEBUG oslo_concurrency.lockutils [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.372 221324 DEBUG nova.compute.manager [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:39 np0005603622 nova_compute[221301]: 2026-01-31 08:43:39.372 221324 WARNING nova.compute.manager [req-d94fe3d5-f8ee-4fe0-af98-d0ff3e813269 req-e3324f5d-9db7-4af7-b764-e57fb7757827 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:43:40 np0005603622 nova_compute[221301]: 2026-01-31 08:43:40.489 221324 INFO nova.compute.manager [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Resuming#033[00m
Jan 31 03:43:40 np0005603622 nova_compute[221301]: 2026-01-31 08:43:40.490 221324 DEBUG nova.objects.instance [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'flavor' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:40 np0005603622 nova_compute[221301]: 2026-01-31 08:43:40.604 221324 DEBUG oslo_concurrency.lockutils [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:43:40 np0005603622 nova_compute[221301]: 2026-01-31 08:43:40.605 221324 DEBUG oslo_concurrency.lockutils [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:43:40 np0005603622 nova_compute[221301]: 2026-01-31 08:43:40.605 221324 DEBUG nova.network.neutron [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:43:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:40.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:41.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:41 np0005603622 nova_compute[221301]: 2026-01-31 08:43:41.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:42.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:43.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:43 np0005603622 nova_compute[221301]: 2026-01-31 08:43:43.996 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:44.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:45.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.421 221324 DEBUG nova.network.neutron [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.672 221324 DEBUG oslo_concurrency.lockutils [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.676 221324 DEBUG nova.virt.libvirt.vif [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:38:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1679208816',display_name='tempest-ServersNegativeTestJSON-server-1679208816',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1679208816',id=148,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:43:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='621c17d53cba46d386de8efb560a988e',ramdisk_id='',reservation_id='r-kclqkoza',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-183161027',owner_user_name='tempest-ServersNegativeTestJSON-183161027-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:43:34Z,user_data=None,user_id='516e093a00a44667ba1308900be70d8d',uuid=bd87e542-0f7b-453e-b8d1-643ad6fb64f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.677 221324 DEBUG nova.network.os_vif_util [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converting VIF {"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.677 221324 DEBUG nova.network.os_vif_util [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.678 221324 DEBUG os_vif [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.678 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.678 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.679 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.681 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.681 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb122a11a-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.682 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb122a11a-5b, col_values=(('external_ids', {'iface-id': 'b122a11a-5b9d-4b27-a9c3-8327cb8162ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:83:50', 'vm-uuid': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.682 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.682 221324 INFO os_vif [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b')#033[00m
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.699 221324 DEBUG nova.objects.instance [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'numa_topology' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:45 np0005603622 kernel: tapb122a11a-5b: entered promiscuous mode
Jan 31 03:43:45 np0005603622 NetworkManager[49080]: <info>  [1769849025.9325] manager: (tapb122a11a-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/313)
Jan 31 03:43:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:45Z|00704|binding|INFO|Claiming lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae for this chassis.
Jan 31 03:43:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:45Z|00705|binding|INFO|b122a11a-5b9d-4b27-a9c3-8327cb8162ae: Claiming fa:16:3e:c1:83:50 10.100.0.11
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.936 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:45Z|00706|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae ovn-installed in OVS
Jan 31 03:43:45 np0005603622 nova_compute[221301]: 2026-01-31 08:43:45.942 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:45 np0005603622 systemd-machined[190406]: New machine qemu-72-instance-00000094.
Jan 31 03:43:45 np0005603622 systemd-udevd[281767]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:43:45 np0005603622 NetworkManager[49080]: <info>  [1769849025.9686] device (tapb122a11a-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:43:45 np0005603622 NetworkManager[49080]: <info>  [1769849025.9692] device (tapb122a11a-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:43:45 np0005603622 systemd[1]: Started Virtual Machine qemu-72-instance-00000094.
Jan 31 03:43:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:45Z|00707|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae up in Southbound
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:45.999 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:83:50 10.100.0.11'], port_security=['fa:16:3e:c1:83:50 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '621c17d53cba46d386de8efb560a988e', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1c8dcf47-c169-4871-843e-ae38c0fc69f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bda2ce92-ce79-4f8b-b120-fd83adc645ef, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b122a11a-5b9d-4b27-a9c3-8327cb8162ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.001 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b122a11a-5b9d-4b27-a9c3-8327cb8162ae in datapath 550cf3a2-62ab-424d-afc0-3148a4a687ee bound to our chassis#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.002 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 550cf3a2-62ab-424d-afc0-3148a4a687ee#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.010 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cca0b6dc-fe25-44df-b9c4-15770368cec6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.011 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap550cf3a2-61 in ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.013 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap550cf3a2-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.013 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e233cf-efff-49bd-b694-7546ac5d1a9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.014 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ba78e0b9-2f62-42ee-9166-151dc757027d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.024 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9a82e7c7-36e5-42a0-aa3f-19613782a726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.037 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8260555f-6243-4b37-aa8d-72a9882355ab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.058 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b9909a5e-3b5b-41d0-923f-4a301a5bc466]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.063 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a60e9910-e157-4b63-ad3c-3402604262a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 NetworkManager[49080]: <info>  [1769849026.0658] manager: (tap550cf3a2-60): new Veth device (/org/freedesktop/NetworkManager/Devices/314)
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.082 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[594fb922-746b-471d-a9d0-8a984a57e599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.086 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1fac5e-8ce4-433f-898d-08d8d03179b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 NetworkManager[49080]: <info>  [1769849026.1025] device (tap550cf3a2-60): carrier: link connected
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.105 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[797feed5-4b64-4b0e-9984-00e102437d77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.117 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1823fc-10ee-498c-b1c0-60bad0a43211]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap550cf3a2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:fc:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827744, 'reachable_time': 22507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281806, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.130 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f995bde9-137e-4af8-b213-cb5cbeac283a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:fc48'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 827744, 'tstamp': 827744}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281816, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.141 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cf5ccce3-0360-4fe2-9220-69b2d3911086]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap550cf3a2-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:fc:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827744, 'reachable_time': 22507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281820, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.165 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9af98354-06ca-4808-b1c1-522643c926e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.205 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f7eb69ad-563e-475a-8531-83a864dc8eca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.206 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap550cf3a2-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.207 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.207 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap550cf3a2-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.209 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:46 np0005603622 NetworkManager[49080]: <info>  [1769849026.2099] manager: (tap550cf3a2-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 31 03:43:46 np0005603622 kernel: tap550cf3a2-60: entered promiscuous mode
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.212 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.214 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap550cf3a2-60, col_values=(('external_ids', {'iface-id': '9f1ac82b-bf6c-400f-a03c-b15ad5392890'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.215 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:43:46Z|00708|binding|INFO|Releasing lport 9f1ac82b-bf6c-400f-a03c-b15ad5392890 from this chassis (sb_readonly=0)
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.216 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.218 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.219 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[51dfe884-4dc0-47df-8405-60cbc7db0e1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.219 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-550cf3a2-62ab-424d-afc0-3148a4a687ee
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/550cf3a2-62ab-424d-afc0-3148a4a687ee.pid.haproxy
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 550cf3a2-62ab-424d-afc0-3148a4a687ee
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:43:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:43:46.220 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'env', 'PROCESS_TAG=haproxy-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/550cf3a2-62ab-424d-afc0-3148a4a687ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.221 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.399 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for bd87e542-0f7b-453e-b8d1-643ad6fb64f0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.399 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849026.39887, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.400 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Started (Lifecycle Event)#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.426 221324 DEBUG nova.compute.manager [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.427 221324 DEBUG nova.objects.instance [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'pci_devices' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:43:46 np0005603622 podman[281876]: 2026-01-31 08:43:46.556458109 +0000 UTC m=+0.057991848 container create a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.580 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:46 np0005603622 systemd[1]: Started libpod-conmon-a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149.scope.
Jan 31 03:43:46 np0005603622 podman[281876]: 2026-01-31 08:43:46.529802768 +0000 UTC m=+0.031336557 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:43:46 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:43:46 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ac51549db5d6a8a90e92ef6bd81423fccbaf752e19836b86a66480d56e80c0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:43:46 np0005603622 podman[281876]: 2026-01-31 08:43:46.662275193 +0000 UTC m=+0.163809032 container init a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:43:46 np0005603622 podman[281876]: 2026-01-31 08:43:46.668564471 +0000 UTC m=+0.170098230 container start a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:43:46 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281891]: [NOTICE]   (281895) : New worker (281897) forked
Jan 31 03:43:46 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281891]: [NOTICE]   (281895) : Loading success.
Jan 31 03:43:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:46.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.814 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:46 np0005603622 nova_compute[221301]: 2026-01-31 08:43:46.819 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.027 221324 INFO nova.virt.libvirt.driver [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance running successfully.#033[00m
Jan 31 03:43:47 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.029 221324 DEBUG nova.virt.libvirt.guest [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.029 221324 DEBUG nova.compute.manager [None req-c84f7b4b-240e-4ee4-992f-b8a540834ad8 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:47.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.203 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.204 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849026.402585, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.204 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.592 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:43:47 np0005603622 nova_compute[221301]: 2026-01-31 08:43:47.596 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:43:48 np0005603622 nova_compute[221301]: 2026-01-31 08:43:48.305 221324 DEBUG nova.compute.manager [req-00f8f7c0-9112-451b-b62d-a07b3f8359f3 req-8dd87277-e976-4f66-8f2e-ca115c7ff9ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:48 np0005603622 nova_compute[221301]: 2026-01-31 08:43:48.306 221324 DEBUG oslo_concurrency.lockutils [req-00f8f7c0-9112-451b-b62d-a07b3f8359f3 req-8dd87277-e976-4f66-8f2e-ca115c7ff9ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:48 np0005603622 nova_compute[221301]: 2026-01-31 08:43:48.306 221324 DEBUG oslo_concurrency.lockutils [req-00f8f7c0-9112-451b-b62d-a07b3f8359f3 req-8dd87277-e976-4f66-8f2e-ca115c7ff9ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:48 np0005603622 nova_compute[221301]: 2026-01-31 08:43:48.306 221324 DEBUG oslo_concurrency.lockutils [req-00f8f7c0-9112-451b-b62d-a07b3f8359f3 req-8dd87277-e976-4f66-8f2e-ca115c7ff9ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:48 np0005603622 nova_compute[221301]: 2026-01-31 08:43:48.307 221324 DEBUG nova.compute.manager [req-00f8f7c0-9112-451b-b62d-a07b3f8359f3 req-8dd87277-e976-4f66-8f2e-ca115c7ff9ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:48 np0005603622 nova_compute[221301]: 2026-01-31 08:43:48.307 221324 WARNING nova.compute.manager [req-00f8f7c0-9112-451b-b62d-a07b3f8359f3 req-8dd87277-e976-4f66-8f2e-ca115c7ff9ec fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state active and task_state None.#033[00m
Jan 31 03:43:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:48.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:49 np0005603622 nova_compute[221301]: 2026-01-31 08:43:48.999 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:49.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:50.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:51.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:51 np0005603622 nova_compute[221301]: 2026-01-31 08:43:51.287 221324 DEBUG nova.compute.manager [req-74ee8bd9-a933-438d-8250-eaed54755462 req-0c66a364-0544-477e-b89b-3279182c6ce0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:43:51 np0005603622 nova_compute[221301]: 2026-01-31 08:43:51.288 221324 DEBUG oslo_concurrency.lockutils [req-74ee8bd9-a933-438d-8250-eaed54755462 req-0c66a364-0544-477e-b89b-3279182c6ce0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:43:51 np0005603622 nova_compute[221301]: 2026-01-31 08:43:51.289 221324 DEBUG oslo_concurrency.lockutils [req-74ee8bd9-a933-438d-8250-eaed54755462 req-0c66a364-0544-477e-b89b-3279182c6ce0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:43:51 np0005603622 nova_compute[221301]: 2026-01-31 08:43:51.289 221324 DEBUG oslo_concurrency.lockutils [req-74ee8bd9-a933-438d-8250-eaed54755462 req-0c66a364-0544-477e-b89b-3279182c6ce0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:43:51 np0005603622 nova_compute[221301]: 2026-01-31 08:43:51.289 221324 DEBUG nova.compute.manager [req-74ee8bd9-a933-438d-8250-eaed54755462 req-0c66a364-0544-477e-b89b-3279182c6ce0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:43:51 np0005603622 nova_compute[221301]: 2026-01-31 08:43:51.289 221324 WARNING nova.compute.manager [req-74ee8bd9-a933-438d-8250-eaed54755462 req-0c66a364-0544-477e-b89b-3279182c6ce0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state active and task_state None.#033[00m
Jan 31 03:43:51 np0005603622 nova_compute[221301]: 2026-01-31 08:43:51.582 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:52.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:53.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:54 np0005603622 nova_compute[221301]: 2026-01-31 08:43:54.000 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:54 np0005603622 podman[281907]: 2026-01-31 08:43:54.151744725 +0000 UTC m=+0.072172197 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:43:54 np0005603622 podman[281906]: 2026-01-31 08:43:54.156141813 +0000 UTC m=+0.082020260 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:43:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:54.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:55.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:55 np0005603622 nova_compute[221301]: 2026-01-31 08:43:55.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:56 np0005603622 nova_compute[221301]: 2026-01-31 08:43:56.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:56 np0005603622 nova_compute[221301]: 2026-01-31 08:43:56.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:43:56 np0005603622 nova_compute[221301]: 2026-01-31 08:43:56.584 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:56.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:57.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:57 np0005603622 nova_compute[221301]: 2026-01-31 08:43:57.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:58 np0005603622 nova_compute[221301]: 2026-01-31 08:43:58.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:43:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:43:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:43:58.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:43:59 np0005603622 nova_compute[221301]: 2026-01-31 08:43:59.003 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:43:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:43:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:43:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:43:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:43:59.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:43:59 np0005603622 nova_compute[221301]: 2026-01-31 08:43:59.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:00 np0005603622 nova_compute[221301]: 2026-01-31 08:44:00.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:00.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:00 np0005603622 nova_compute[221301]: 2026-01-31 08:44:00.890 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:00 np0005603622 nova_compute[221301]: 2026-01-31 08:44:00.891 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:00 np0005603622 nova_compute[221301]: 2026-01-31 08:44:00.892 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:00 np0005603622 nova_compute[221301]: 2026-01-31 08:44:00.892 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:44:00 np0005603622 nova_compute[221301]: 2026-01-31 08:44:00.893 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:44:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:01.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:44:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1091343465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:44:01 np0005603622 nova_compute[221301]: 2026-01-31 08:44:01.610 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:01 np0005603622 nova_compute[221301]: 2026-01-31 08:44:01.631 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.738s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:44:02 np0005603622 nova_compute[221301]: 2026-01-31 08:44:02.302 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:44:02 np0005603622 nova_compute[221301]: 2026-01-31 08:44:02.302 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-00000094 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:44:02 np0005603622 nova_compute[221301]: 2026-01-31 08:44:02.454 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:44:02 np0005603622 nova_compute[221301]: 2026-01-31 08:44:02.455 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4177MB free_disk=20.92180633544922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:44:02 np0005603622 nova_compute[221301]: 2026-01-31 08:44:02.456 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:02 np0005603622 nova_compute[221301]: 2026-01-31 08:44:02.456 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:02.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:03.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:03 np0005603622 nova_compute[221301]: 2026-01-31 08:44:03.400 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:44:03 np0005603622 nova_compute[221301]: 2026-01-31 08:44:03.401 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:44:03 np0005603622 nova_compute[221301]: 2026-01-31 08:44:03.401 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:44:03 np0005603622 nova_compute[221301]: 2026-01-31 08:44:03.468 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:44:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:44:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1422198728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:44:03 np0005603622 nova_compute[221301]: 2026-01-31 08:44:03.866 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:44:03 np0005603622 nova_compute[221301]: 2026-01-31 08:44:03.872 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:44:04 np0005603622 nova_compute[221301]: 2026-01-31 08:44:04.006 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:04 np0005603622 nova_compute[221301]: 2026-01-31 08:44:04.180 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:44:04 np0005603622 nova_compute[221301]: 2026-01-31 08:44:04.377 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:44:04 np0005603622 nova_compute[221301]: 2026-01-31 08:44:04.377 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:04.650 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:44:04 np0005603622 nova_compute[221301]: 2026-01-31 08:44:04.650 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:04.652 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:44:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:04.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:44:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:05.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.378 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.379 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.379 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.611 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:06.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.844 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.845 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.845 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:44:06 np0005603622 nova_compute[221301]: 2026-01-31 08:44:06.845 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:44:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:07.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:08.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:09 np0005603622 nova_compute[221301]: 2026-01-31 08:44:09.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:09.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:10.654 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:44:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:10.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:11.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:11 np0005603622 nova_compute[221301]: 2026-01-31 08:44:11.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:11 np0005603622 nova_compute[221301]: 2026-01-31 08:44:11.861 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [{"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:44:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:12.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:13.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:13 np0005603622 nova_compute[221301]: 2026-01-31 08:44:13.376 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-bd87e542-0f7b-453e-b8d1-643ad6fb64f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:44:13 np0005603622 nova_compute[221301]: 2026-01-31 08:44:13.377 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:44:13 np0005603622 nova_compute[221301]: 2026-01-31 08:44:13.378 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:13 np0005603622 nova_compute[221301]: 2026-01-31 08:44:13.378 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:14 np0005603622 nova_compute[221301]: 2026-01-31 08:44:14.010 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:14.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:44:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:15.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:44:16 np0005603622 nova_compute[221301]: 2026-01-31 08:44:16.615 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:16.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:17.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:44:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:18.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:44:19 np0005603622 nova_compute[221301]: 2026-01-31 08:44:19.013 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:19.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:20.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:21.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:21 np0005603622 nova_compute[221301]: 2026-01-31 08:44:21.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:22.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:23.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:24 np0005603622 nova_compute[221301]: 2026-01-31 08:44:24.013 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:24.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:25 np0005603622 podman[282000]: 2026-01-31 08:44:25.137878198 +0000 UTC m=+0.053998292 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:44:25 np0005603622 podman[281999]: 2026-01-31 08:44:25.177085844 +0000 UTC m=+0.101398916 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 31 03:44:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:25.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:26 np0005603622 nova_compute[221301]: 2026-01-31 08:44:26.666 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:26.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:27.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:28.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:29 np0005603622 nova_compute[221301]: 2026-01-31 08:44:29.015 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:29.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:30.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:30.901 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:30.901 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:30.902 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:31.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:31 np0005603622 nova_compute[221301]: 2026-01-31 08:44:31.669 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:32.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:33.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:44:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/470525319' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:44:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:44:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/470525319' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:44:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:34 np0005603622 nova_compute[221301]: 2026-01-31 08:44:34.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:34.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:44:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:44:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:44:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:35.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.502 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.504 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.504 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.505 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.505 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.508 221324 INFO nova.compute.manager [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Terminating instance#033[00m
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.510 221324 DEBUG nova.compute.manager [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:44:36 np0005603622 nova_compute[221301]: 2026-01-31 08:44:36.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:36.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:37 np0005603622 kernel: tapb122a11a-5b (unregistering): left promiscuous mode
Jan 31 03:44:37 np0005603622 NetworkManager[49080]: <info>  [1769849077.1149] device (tapb122a11a-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:44:37 np0005603622 nova_compute[221301]: 2026-01-31 08:44:37.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:37 np0005603622 nova_compute[221301]: 2026-01-31 08:44:37.218 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:37 np0005603622 nova_compute[221301]: 2026-01-31 08:44:37.220 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:44:37Z|00709|binding|INFO|Releasing lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae from this chassis (sb_readonly=0)
Jan 31 03:44:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:44:37Z|00710|binding|INFO|Setting lport b122a11a-5b9d-4b27-a9c3-8327cb8162ae down in Southbound
Jan 31 03:44:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:44:37Z|00711|binding|INFO|Removing iface tapb122a11a-5b ovn-installed in OVS
Jan 31 03:44:37 np0005603622 nova_compute[221301]: 2026-01-31 08:44:37.224 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:37.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:37 np0005603622 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000094.scope: Deactivated successfully.
Jan 31 03:44:37 np0005603622 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000094.scope: Consumed 2.476s CPU time.
Jan 31 03:44:37 np0005603622 systemd-machined[190406]: Machine qemu-72-instance-00000094 terminated.
Jan 31 03:44:37 np0005603622 nova_compute[221301]: 2026-01-31 08:44:37.354 221324 INFO nova.virt.libvirt.driver [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Instance destroyed successfully.#033[00m
Jan 31 03:44:37 np0005603622 nova_compute[221301]: 2026-01-31 08:44:37.355 221324 DEBUG nova.objects.instance [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lazy-loading 'resources' on Instance uuid bd87e542-0f7b-453e-b8d1-643ad6fb64f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:44:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:38.428 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:83:50 10.100.0.11'], port_security=['fa:16:3e:c1:83:50 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bd87e542-0f7b-453e-b8d1-643ad6fb64f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '621c17d53cba46d386de8efb560a988e', 'neutron:revision_number': '11', 'neutron:security_group_ids': '1c8dcf47-c169-4871-843e-ae38c0fc69f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bda2ce92-ce79-4f8b-b120-fd83adc645ef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=b122a11a-5b9d-4b27-a9c3-8327cb8162ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:44:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:38.430 139793 INFO neutron.agent.ovn.metadata.agent [-] Port b122a11a-5b9d-4b27-a9c3-8327cb8162ae in datapath 550cf3a2-62ab-424d-afc0-3148a4a687ee unbound from our chassis#033[00m
Jan 31 03:44:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:38.431 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 550cf3a2-62ab-424d-afc0-3148a4a687ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:44:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:38.433 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ee5022-0742-4df1-93dc-ebdbedaee6c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:38.433 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee namespace which is not needed anymore#033[00m
Jan 31 03:44:38 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281891]: [NOTICE]   (281895) : haproxy version is 2.8.14-c23fe91
Jan 31 03:44:38 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281891]: [NOTICE]   (281895) : path to executable is /usr/sbin/haproxy
Jan 31 03:44:38 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281891]: [WARNING]  (281895) : Exiting Master process...
Jan 31 03:44:38 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281891]: [ALERT]    (281895) : Current worker (281897) exited with code 143 (Terminated)
Jan 31 03:44:38 np0005603622 neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee[281891]: [WARNING]  (281895) : All workers exited. Exiting... (0)
Jan 31 03:44:38 np0005603622 systemd[1]: libpod-a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149.scope: Deactivated successfully.
Jan 31 03:44:38 np0005603622 podman[282209]: 2026-01-31 08:44:38.778970151 +0000 UTC m=+0.253184717 container died a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.788 221324 DEBUG nova.virt.libvirt.vif [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-31T08:38:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1679208816',display_name='tempest-ServersNegativeTestJSON-server-1679208816',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1679208816',id=148,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:43:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='621c17d53cba46d386de8efb560a988e',ramdisk_id='',reservation_id='r-kclqkoza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-183161027',owner_user_name='tempest-ServersNegativeTestJSON-183161027-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:43:47Z,user_data=None,user_id='516e093a00a44667ba1308900be70d8d',uuid=bd87e542-0f7b-453e-b8d1-643ad6fb64f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.789 221324 DEBUG nova.network.os_vif_util [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converting VIF {"id": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "address": "fa:16:3e:c1:83:50", "network": {"id": "550cf3a2-62ab-424d-afc0-3148a4a687ee", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1062247136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "621c17d53cba46d386de8efb560a988e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb122a11a-5b", "ovs_interfaceid": "b122a11a-5b9d-4b27-a9c3-8327cb8162ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.790 221324 DEBUG nova.network.os_vif_util [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.791 221324 DEBUG os_vif [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.793 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.794 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb122a11a-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.797 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.798 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.799 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:38 np0005603622 nova_compute[221301]: 2026-01-31 08:44:38.803 221324 INFO os_vif [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:83:50,bridge_name='br-int',has_traffic_filtering=True,id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae,network=Network(550cf3a2-62ab-424d-afc0-3148a4a687ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb122a11a-5b')#033[00m
Jan 31 03:44:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:38.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:39 np0005603622 nova_compute[221301]: 2026-01-31 08:44:39.045 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:39 np0005603622 systemd[1]: var-lib-containers-storage-overlay-4ac51549db5d6a8a90e92ef6bd81423fccbaf752e19836b86a66480d56e80c0e-merged.mount: Deactivated successfully.
Jan 31 03:44:39 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149-userdata-shm.mount: Deactivated successfully.
Jan 31 03:44:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:39.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:39 np0005603622 podman[282209]: 2026-01-31 08:44:39.45948496 +0000 UTC m=+0.933699536 container cleanup a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:44:39 np0005603622 systemd[1]: libpod-conmon-a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149.scope: Deactivated successfully.
Jan 31 03:44:39 np0005603622 podman[282257]: 2026-01-31 08:44:39.74691932 +0000 UTC m=+0.263886532 container remove a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.753 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[468b010a-d64a-4440-89dd-9da73c12a807]: (4, ('Sat Jan 31 08:44:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee (a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149)\na0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149\nSat Jan 31 08:44:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee (a0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149)\na0216c3a6ea2b02bd71704d40e902778be11db4bb5b7ac3d5660e0d8fefaa149\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.755 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[360b02ca-1b24-4540-92b7-3aefc2c0a41f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.756 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap550cf3a2-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:44:39 np0005603622 nova_compute[221301]: 2026-01-31 08:44:39.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:39 np0005603622 kernel: tap550cf3a2-60: left promiscuous mode
Jan 31 03:44:39 np0005603622 nova_compute[221301]: 2026-01-31 08:44:39.765 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.770 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d8493443-e921-45af-a030-dda2804d48a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.788 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[590c391d-4e5e-465c-9490-dcb00d961ddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.789 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a91456c5-aed5-4f4f-9ae9-420e88a77b86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.805 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[86fa4d9c-a5b9-4d7f-bb49-e9ccb085f9c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827739, 'reachable_time': 44738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282273, 'error': None, 'target': 'ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:39 np0005603622 systemd[1]: run-netns-ovnmeta\x2d550cf3a2\x2d62ab\x2d424d\x2dafc0\x2d3148a4a687ee.mount: Deactivated successfully.
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.808 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-550cf3a2-62ab-424d-afc0-3148a4a687ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:44:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:39.808 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[65729e7c-7d64-4161-8d84-26832ce598dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:44:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:40.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:44:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:41.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:44:42 np0005603622 nova_compute[221301]: 2026-01-31 08:44:42.666 221324 DEBUG nova.compute.manager [req-8038f9bf-f24c-4ec8-83ad-5efdfc5f15e2 req-3e215f40-d441-461a-b29d-982afedbe2ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:44:42 np0005603622 nova_compute[221301]: 2026-01-31 08:44:42.667 221324 DEBUG oslo_concurrency.lockutils [req-8038f9bf-f24c-4ec8-83ad-5efdfc5f15e2 req-3e215f40-d441-461a-b29d-982afedbe2ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:42 np0005603622 nova_compute[221301]: 2026-01-31 08:44:42.667 221324 DEBUG oslo_concurrency.lockutils [req-8038f9bf-f24c-4ec8-83ad-5efdfc5f15e2 req-3e215f40-d441-461a-b29d-982afedbe2ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:42 np0005603622 nova_compute[221301]: 2026-01-31 08:44:42.667 221324 DEBUG oslo_concurrency.lockutils [req-8038f9bf-f24c-4ec8-83ad-5efdfc5f15e2 req-3e215f40-d441-461a-b29d-982afedbe2ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:42 np0005603622 nova_compute[221301]: 2026-01-31 08:44:42.668 221324 DEBUG nova.compute.manager [req-8038f9bf-f24c-4ec8-83ad-5efdfc5f15e2 req-3e215f40-d441-461a-b29d-982afedbe2ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:44:42 np0005603622 nova_compute[221301]: 2026-01-31 08:44:42.668 221324 DEBUG nova.compute.manager [req-8038f9bf-f24c-4ec8-83ad-5efdfc5f15e2 req-3e215f40-d441-461a-b29d-982afedbe2ac fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-unplugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:44:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:42.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:43.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:43 np0005603622 nova_compute[221301]: 2026-01-31 08:44:43.799 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:44 np0005603622 nova_compute[221301]: 2026-01-31 08:44:44.097 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:44.579 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:44:44 np0005603622 nova_compute[221301]: 2026-01-31 08:44:44.580 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:44 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:44.581 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:44:44 np0005603622 nova_compute[221301]: 2026-01-31 08:44:44.755 221324 INFO nova.virt.libvirt.driver [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Deleting instance files /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_del#033[00m
Jan 31 03:44:44 np0005603622 nova_compute[221301]: 2026-01-31 08:44:44.756 221324 INFO nova.virt.libvirt.driver [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Deletion of /var/lib/nova/instances/bd87e542-0f7b-453e-b8d1-643ad6fb64f0_del complete#033[00m
Jan 31 03:44:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:44.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:44:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.212 221324 DEBUG nova.compute.manager [req-8bc59449-703e-493b-88e9-ce6052f6ecb5 req-267681cd-7c75-462e-ab6c-3c859a325c2e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.213 221324 DEBUG oslo_concurrency.lockutils [req-8bc59449-703e-493b-88e9-ce6052f6ecb5 req-267681cd-7c75-462e-ab6c-3c859a325c2e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.213 221324 DEBUG oslo_concurrency.lockutils [req-8bc59449-703e-493b-88e9-ce6052f6ecb5 req-267681cd-7c75-462e-ab6c-3c859a325c2e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.213 221324 DEBUG oslo_concurrency.lockutils [req-8bc59449-703e-493b-88e9-ce6052f6ecb5 req-267681cd-7c75-462e-ab6c-3c859a325c2e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.214 221324 DEBUG nova.compute.manager [req-8bc59449-703e-493b-88e9-ce6052f6ecb5 req-267681cd-7c75-462e-ab6c-3c859a325c2e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] No waiting events found dispatching network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.214 221324 WARNING nova.compute.manager [req-8bc59449-703e-493b-88e9-ce6052f6ecb5 req-267681cd-7c75-462e-ab6c-3c859a325c2e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received unexpected event network-vif-plugged-b122a11a-5b9d-4b27-a9c3-8327cb8162ae for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:44:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:44:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:45.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.929 221324 INFO nova.compute.manager [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Took 9.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.929 221324 DEBUG oslo.service.loopingcall [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.930 221324 DEBUG nova.compute.manager [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:44:45 np0005603622 nova_compute[221301]: 2026-01-31 08:44:45.930 221324 DEBUG nova.network.neutron [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:44:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:46.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:47.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:44:48.583 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:44:48 np0005603622 nova_compute[221301]: 2026-01-31 08:44:48.803 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:48.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:49 np0005603622 nova_compute[221301]: 2026-01-31 08:44:49.099 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:49.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:50.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:50 np0005603622 nova_compute[221301]: 2026-01-31 08:44:50.916 221324 DEBUG nova.network.neutron [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:44:50 np0005603622 nova_compute[221301]: 2026-01-31 08:44:50.944 221324 DEBUG nova.compute.manager [req-7aedac4b-89f0-4a8c-be07-213d92d16512 req-db747b62-b841-481a-a8f3-afc7bd47bc5d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Received event network-vif-deleted-b122a11a-5b9d-4b27-a9c3-8327cb8162ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:44:50 np0005603622 nova_compute[221301]: 2026-01-31 08:44:50.945 221324 INFO nova.compute.manager [req-7aedac4b-89f0-4a8c-be07-213d92d16512 req-db747b62-b841-481a-a8f3-afc7bd47bc5d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Neutron deleted interface b122a11a-5b9d-4b27-a9c3-8327cb8162ae; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:44:50 np0005603622 nova_compute[221301]: 2026-01-31 08:44:50.945 221324 DEBUG nova.network.neutron [req-7aedac4b-89f0-4a8c-be07-213d92d16512 req-db747b62-b841-481a-a8f3-afc7bd47bc5d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:44:51 np0005603622 nova_compute[221301]: 2026-01-31 08:44:51.102 221324 INFO nova.compute.manager [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Took 5.17 seconds to deallocate network for instance.#033[00m
Jan 31 03:44:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:44:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:51.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:44:51 np0005603622 nova_compute[221301]: 2026-01-31 08:44:51.270 221324 DEBUG nova.compute.manager [req-7aedac4b-89f0-4a8c-be07-213d92d16512 req-db747b62-b841-481a-a8f3-afc7bd47bc5d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Detach interface failed, port_id=b122a11a-5b9d-4b27-a9c3-8327cb8162ae, reason: Instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:44:51 np0005603622 nova_compute[221301]: 2026-01-31 08:44:51.441 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:44:51 np0005603622 nova_compute[221301]: 2026-01-31 08:44:51.442 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:44:51 np0005603622 nova_compute[221301]: 2026-01-31 08:44:51.638 221324 DEBUG oslo_concurrency.processutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:44:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:44:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2359509805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:44:52 np0005603622 nova_compute[221301]: 2026-01-31 08:44:52.175 221324 DEBUG oslo_concurrency.processutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:44:52 np0005603622 nova_compute[221301]: 2026-01-31 08:44:52.180 221324 DEBUG nova.compute.provider_tree [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:44:52 np0005603622 nova_compute[221301]: 2026-01-31 08:44:52.351 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849077.3502662, bd87e542-0f7b-453e-b8d1-643ad6fb64f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:44:52 np0005603622 nova_compute[221301]: 2026-01-31 08:44:52.352 221324 INFO nova.compute.manager [-] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:44:52 np0005603622 nova_compute[221301]: 2026-01-31 08:44:52.488 221324 DEBUG nova.scheduler.client.report [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:44:52 np0005603622 nova_compute[221301]: 2026-01-31 08:44:52.613 221324 DEBUG nova.compute.manager [None req-05823824-d72a-48a8-b8d1-fade10d43dca - - - - - -] [instance: bd87e542-0f7b-453e-b8d1-643ad6fb64f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:44:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:52.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:53 np0005603622 nova_compute[221301]: 2026-01-31 08:44:53.155 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:44:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:53.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:44:53 np0005603622 nova_compute[221301]: 2026-01-31 08:44:53.378 221324 INFO nova.scheduler.client.report [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Deleted allocations for instance bd87e542-0f7b-453e-b8d1-643ad6fb64f0#033[00m
Jan 31 03:44:53 np0005603622 nova_compute[221301]: 2026-01-31 08:44:53.807 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:54 np0005603622 nova_compute[221301]: 2026-01-31 08:44:54.146 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:54 np0005603622 nova_compute[221301]: 2026-01-31 08:44:54.383 221324 DEBUG oslo_concurrency.lockutils [None req-1ef0340c-3ae7-4f83-a934-6f140d65f914 516e093a00a44667ba1308900be70d8d 621c17d53cba46d386de8efb560a988e - - default default] Lock "bd87e542-0f7b-453e-b8d1-643ad6fb64f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 17.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:44:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:44:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:54.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:44:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:55.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:55 np0005603622 nova_compute[221301]: 2026-01-31 08:44:55.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:56 np0005603622 podman[282348]: 2026-01-31 08:44:56.169781434 +0000 UTC m=+0.087022184 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 03:44:56 np0005603622 podman[282347]: 2026-01-31 08:44:56.189907321 +0000 UTC m=+0.102814305 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:44:56 np0005603622 nova_compute[221301]: 2026-01-31 08:44:56.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:44:56 np0005603622 nova_compute[221301]: 2026-01-31 08:44:56.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:44:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:56.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:57.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:58 np0005603622 nova_compute[221301]: 2026-01-31 08:44:58.812 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:44:58.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:44:59 np0005603622 nova_compute[221301]: 2026-01-31 08:44:59.191 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:44:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:44:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:44:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:44:59.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:44:59 np0005603622 nova_compute[221301]: 2026-01-31 08:44:59.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:00 np0005603622 nova_compute[221301]: 2026-01-31 08:45:00.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:00 np0005603622 nova_compute[221301]: 2026-01-31 08:45:00.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:00 np0005603622 nova_compute[221301]: 2026-01-31 08:45:00.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:00.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:01.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:01 np0005603622 nova_compute[221301]: 2026-01-31 08:45:01.891 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:01 np0005603622 nova_compute[221301]: 2026-01-31 08:45:01.892 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:01 np0005603622 nova_compute[221301]: 2026-01-31 08:45:01.892 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:01 np0005603622 nova_compute[221301]: 2026-01-31 08:45:01.893 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:45:01 np0005603622 nova_compute[221301]: 2026-01-31 08:45:01.893 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:45:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1096761827' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:45:02 np0005603622 nova_compute[221301]: 2026-01-31 08:45:02.377 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:02 np0005603622 nova_compute[221301]: 2026-01-31 08:45:02.517 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:45:02 np0005603622 nova_compute[221301]: 2026-01-31 08:45:02.519 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4378MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:45:02 np0005603622 nova_compute[221301]: 2026-01-31 08:45:02.519 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:02 np0005603622 nova_compute[221301]: 2026-01-31 08:45:02.520 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:02.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:03.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:03 np0005603622 nova_compute[221301]: 2026-01-31 08:45:03.815 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:04 np0005603622 nova_compute[221301]: 2026-01-31 08:45:04.193 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:04 np0005603622 nova_compute[221301]: 2026-01-31 08:45:04.790 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:45:04 np0005603622 nova_compute[221301]: 2026-01-31 08:45:04.791 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:45:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:04.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:05 np0005603622 nova_compute[221301]: 2026-01-31 08:45:05.212 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:45:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:05.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:45:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:45:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1528469096' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:45:05 np0005603622 nova_compute[221301]: 2026-01-31 08:45:05.674 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:05 np0005603622 nova_compute[221301]: 2026-01-31 08:45:05.682 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:45:05 np0005603622 nova_compute[221301]: 2026-01-31 08:45:05.783 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:45:06 np0005603622 nova_compute[221301]: 2026-01-31 08:45:06.135 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:45:06 np0005603622 nova_compute[221301]: 2026-01-31 08:45:06.135 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:06.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:07.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:08 np0005603622 nova_compute[221301]: 2026-01-31 08:45:08.136 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:08 np0005603622 nova_compute[221301]: 2026-01-31 08:45:08.137 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:45:08 np0005603622 nova_compute[221301]: 2026-01-31 08:45:08.137 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:45:08 np0005603622 nova_compute[221301]: 2026-01-31 08:45:08.295 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:45:08 np0005603622 nova_compute[221301]: 2026-01-31 08:45:08.295 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:08 np0005603622 nova_compute[221301]: 2026-01-31 08:45:08.648 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:08 np0005603622 nova_compute[221301]: 2026-01-31 08:45:08.853 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:08.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:09 np0005603622 nova_compute[221301]: 2026-01-31 08:45:09.195 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:09.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:10 np0005603622 nova_compute[221301]: 2026-01-31 08:45:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:10.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:11.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:12.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:13.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:13 np0005603622 nova_compute[221301]: 2026-01-31 08:45:13.855 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:14 np0005603622 nova_compute[221301]: 2026-01-31 08:45:14.197 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:45:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2076473435' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:45:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:45:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2076473435' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:45:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:14.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:15.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:16.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:17.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:18 np0005603622 nova_compute[221301]: 2026-01-31 08:45:18.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:45:18 np0005603622 nova_compute[221301]: 2026-01-31 08:45:18.859 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:45:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:18.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:45:19 np0005603622 nova_compute[221301]: 2026-01-31 08:45:19.200 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:19.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:20.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:21.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:22.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:23.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:23 np0005603622 nova_compute[221301]: 2026-01-31 08:45:23.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:24 np0005603622 nova_compute[221301]: 2026-01-31 08:45:24.202 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:24.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:25.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:26.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:27 np0005603622 podman[282440]: 2026-01-31 08:45:27.129041634 +0000 UTC m=+0.051366872 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:45:27 np0005603622 podman[282439]: 2026-01-31 08:45:27.183224529 +0000 UTC m=+0.106994336 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:45:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:27.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:28 np0005603622 nova_compute[221301]: 2026-01-31 08:45:28.864 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:28.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:29 np0005603622 nova_compute[221301]: 2026-01-31 08:45:29.204 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:29.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:29 np0005603622 nova_compute[221301]: 2026-01-31 08:45:29.456 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:29 np0005603622 nova_compute[221301]: 2026-01-31 08:45:29.457 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:29 np0005603622 nova_compute[221301]: 2026-01-31 08:45:29.607 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:45:30 np0005603622 nova_compute[221301]: 2026-01-31 08:45:30.554 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:30 np0005603622 nova_compute[221301]: 2026-01-31 08:45:30.555 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:30 np0005603622 nova_compute[221301]: 2026-01-31 08:45:30.571 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:45:30 np0005603622 nova_compute[221301]: 2026-01-31 08:45:30.572 221324 INFO nova.compute.claims [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:30.902 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:30.903 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:30.903 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:30.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:31 np0005603622 nova_compute[221301]: 2026-01-31 08:45:31.090 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:31.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:45:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4254094698' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:45:31 np0005603622 nova_compute[221301]: 2026-01-31 08:45:31.589 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:31 np0005603622 nova_compute[221301]: 2026-01-31 08:45:31.595 221324 DEBUG nova.compute.provider_tree [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:45:31 np0005603622 nova_compute[221301]: 2026-01-31 08:45:31.715 221324 DEBUG nova.scheduler.client.report [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:45:32 np0005603622 nova_compute[221301]: 2026-01-31 08:45:32.230 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:32 np0005603622 nova_compute[221301]: 2026-01-31 08:45:32.230 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:45:32 np0005603622 nova_compute[221301]: 2026-01-31 08:45:32.454 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:45:32 np0005603622 nova_compute[221301]: 2026-01-31 08:45:32.455 221324 DEBUG nova.network.neutron [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:45:32 np0005603622 nova_compute[221301]: 2026-01-31 08:45:32.567 221324 INFO nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:45:32 np0005603622 nova_compute[221301]: 2026-01-31 08:45:32.779 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:45:32 np0005603622 nova_compute[221301]: 2026-01-31 08:45:32.922 221324 DEBUG nova.policy [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e5e1edb815004883bd699ac3ac702cdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f45314d9274542ac86156e752065d617', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:45:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:32.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:33.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.358 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.360 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.360 221324 INFO nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Creating image(s)#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.394 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.428 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.463 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.468 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.535 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.536 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.537 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.537 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.572 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.578 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.869 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:33 np0005603622 nova_compute[221301]: 2026-01-31 08:45:33.952 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.375s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.039 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] resizing rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.142 221324 DEBUG nova.objects.instance [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lazy-loading 'migration_context' on Instance uuid 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.206 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.638 221324 DEBUG nova.network.neutron [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Successfully created port: 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.649 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.650 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Ensure instance console log exists: /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.650 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.651 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:34 np0005603622 nova_compute[221301]: 2026-01-31 08:45:34.651 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:34.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:35.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:36.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:37 np0005603622 nova_compute[221301]: 2026-01-31 08:45:37.145 221324 DEBUG nova.network.neutron [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Successfully updated port: 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:45:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:45:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:37.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:45:37 np0005603622 nova_compute[221301]: 2026-01-31 08:45:37.550 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "refresh_cache-4f01e554-c0c2-4645-9cd1-b01fec7a9be2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:45:37 np0005603622 nova_compute[221301]: 2026-01-31 08:45:37.550 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquired lock "refresh_cache-4f01e554-c0c2-4645-9cd1-b01fec7a9be2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:45:37 np0005603622 nova_compute[221301]: 2026-01-31 08:45:37.550 221324 DEBUG nova.network.neutron [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:45:37 np0005603622 nova_compute[221301]: 2026-01-31 08:45:37.565 221324 DEBUG nova.compute.manager [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received event network-changed-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:45:37 np0005603622 nova_compute[221301]: 2026-01-31 08:45:37.565 221324 DEBUG nova.compute.manager [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Refreshing instance network info cache due to event network-changed-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:45:37 np0005603622 nova_compute[221301]: 2026-01-31 08:45:37.566 221324 DEBUG oslo_concurrency.lockutils [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4f01e554-c0c2-4645-9cd1-b01fec7a9be2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:45:38 np0005603622 nova_compute[221301]: 2026-01-31 08:45:38.000 221324 DEBUG nova.network.neutron [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:45:38 np0005603622 nova_compute[221301]: 2026-01-31 08:45:38.873 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:38.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:39 np0005603622 nova_compute[221301]: 2026-01-31 08:45:39.207 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:39.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:39 np0005603622 nova_compute[221301]: 2026-01-31 08:45:39.801 221324 DEBUG nova.network.neutron [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Updating instance_info_cache with network_info: [{"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.277 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Releasing lock "refresh_cache-4f01e554-c0c2-4645-9cd1-b01fec7a9be2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.278 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Instance network_info: |[{"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.279 221324 DEBUG oslo_concurrency.lockutils [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4f01e554-c0c2-4645-9cd1-b01fec7a9be2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.279 221324 DEBUG nova.network.neutron [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Refreshing network info cache for port 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.286 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Start _get_guest_xml network_info=[{"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.293 221324 WARNING nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.299 221324 DEBUG nova.virt.libvirt.host [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.299 221324 DEBUG nova.virt.libvirt.host [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.304 221324 DEBUG nova.virt.libvirt.host [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.304 221324 DEBUG nova.virt.libvirt.host [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.307 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.308 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.309 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.309 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.310 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.310 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.311 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.311 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.312 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.312 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.313 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.313 221324 DEBUG nova.virt.hardware [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:45:40 np0005603622 nova_compute[221301]: 2026-01-31 08:45:40.318 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:45:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1928065442' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:45:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:40.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.024 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.706s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.048 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.053 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:41.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:45:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3882702301' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.462 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.465 221324 DEBUG nova.virt.libvirt.vif [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:45:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-362125567',display_name='tempest-ServerMetadataTestJSON-server-362125567',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-362125567',id=155,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f45314d9274542ac86156e752065d617',ramdisk_id='',reservation_id='r-z6pteacm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1419266361',owner_user_name='tempest-ServerMetadataTestJSON-1419266361-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:45:32Z,user_data=None,user_id='e5e1edb815004883bd699ac3ac702cdb',uuid=4f01e554-c0c2-4645-9cd1-b01fec7a9be2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.466 221324 DEBUG nova.network.os_vif_util [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Converting VIF {"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.468 221324 DEBUG nova.network.os_vif_util [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2a:0f,bridge_name='br-int',has_traffic_filtering=True,id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b,network=Network(93884f74-bf8f-4a8d-8d2e-f855a21eaddc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fddbfa7-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.470 221324 DEBUG nova.objects.instance [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.591 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <uuid>4f01e554-c0c2-4645-9cd1-b01fec7a9be2</uuid>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <name>instance-0000009b</name>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerMetadataTestJSON-server-362125567</nova:name>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:45:40</nova:creationTime>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:user uuid="e5e1edb815004883bd699ac3ac702cdb">tempest-ServerMetadataTestJSON-1419266361-project-member</nova:user>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:project uuid="f45314d9274542ac86156e752065d617">tempest-ServerMetadataTestJSON-1419266361</nova:project>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <nova:port uuid="0fddbfa7-25a8-4db3-972f-a0ad7dc8830b">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <entry name="serial">4f01e554-c0c2-4645-9cd1-b01fec7a9be2</entry>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <entry name="uuid">4f01e554-c0c2-4645-9cd1-b01fec7a9be2</entry>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk.config">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:e3:2a:0f"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <target dev="tap0fddbfa7-25"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/console.log" append="off"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:45:41 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:45:41 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:45:41 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:45:41 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.592 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Preparing to wait for external event network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.593 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.594 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.595 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.597 221324 DEBUG nova.virt.libvirt.vif [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:45:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-362125567',display_name='tempest-ServerMetadataTestJSON-server-362125567',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-362125567',id=155,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f45314d9274542ac86156e752065d617',ramdisk_id='',reservation_id='r-z6pteacm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1419266361',owner_user_name='tempest-ServerM
etadataTestJSON-1419266361-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:45:32Z,user_data=None,user_id='e5e1edb815004883bd699ac3ac702cdb',uuid=4f01e554-c0c2-4645-9cd1-b01fec7a9be2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.597 221324 DEBUG nova.network.os_vif_util [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Converting VIF {"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.599 221324 DEBUG nova.network.os_vif_util [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2a:0f,bridge_name='br-int',has_traffic_filtering=True,id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b,network=Network(93884f74-bf8f-4a8d-8d2e-f855a21eaddc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fddbfa7-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.600 221324 DEBUG os_vif [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2a:0f,bridge_name='br-int',has_traffic_filtering=True,id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b,network=Network(93884f74-bf8f-4a8d-8d2e-f855a21eaddc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fddbfa7-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.601 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.602 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.603 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.614 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0fddbfa7-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.618 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0fddbfa7-25, col_values=(('external_ids', {'iface-id': '0fddbfa7-25a8-4db3-972f-a0ad7dc8830b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:2a:0f', 'vm-uuid': '4f01e554-c0c2-4645-9cd1-b01fec7a9be2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:41 np0005603622 NetworkManager[49080]: <info>  [1769849141.6228] manager: (tap0fddbfa7-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.624 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:41 np0005603622 nova_compute[221301]: 2026-01-31 08:45:41.630 221324 INFO os_vif [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2a:0f,bridge_name='br-int',has_traffic_filtering=True,id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b,network=Network(93884f74-bf8f-4a8d-8d2e-f855a21eaddc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fddbfa7-25')#033[00m
Jan 31 03:45:42 np0005603622 nova_compute[221301]: 2026-01-31 08:45:42.145 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:45:42 np0005603622 nova_compute[221301]: 2026-01-31 08:45:42.147 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:45:42 np0005603622 nova_compute[221301]: 2026-01-31 08:45:42.147 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] No VIF found with MAC fa:16:3e:e3:2a:0f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:45:42 np0005603622 nova_compute[221301]: 2026-01-31 08:45:42.148 221324 INFO nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Using config drive#033[00m
Jan 31 03:45:42 np0005603622 nova_compute[221301]: 2026-01-31 08:45:42.194 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:45:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:42.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:43.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:44 np0005603622 nova_compute[221301]: 2026-01-31 08:45:44.210 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:44 np0005603622 nova_compute[221301]: 2026-01-31 08:45:44.620 221324 INFO nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Creating config drive at /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/disk.config#033[00m
Jan 31 03:45:44 np0005603622 nova_compute[221301]: 2026-01-31 08:45:44.624 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3pbgwd1k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:44 np0005603622 nova_compute[221301]: 2026-01-31 08:45:44.751 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3pbgwd1k" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:44.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.022 221324 DEBUG nova.storage.rbd_utils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] rbd image 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.026 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/disk.config 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.044 221324 DEBUG nova.network.neutron [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Updated VIF entry in instance network info cache for port 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.045 221324 DEBUG nova.network.neutron [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Updating instance_info_cache with network_info: [{"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.088 221324 DEBUG oslo_concurrency.lockutils [req-07f40bb6-693d-4a7b-b21c-a96320df3aad req-ffec3f52-392e-4c05-97c3-fb1bdef5b729 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4f01e554-c0c2-4645-9cd1-b01fec7a9be2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.302 221324 DEBUG oslo_concurrency.processutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/disk.config 4f01e554-c0c2-4645-9cd1-b01fec7a9be2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.276s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.302 221324 INFO nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Deleting local config drive /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2/disk.config because it was imported into RBD.#033[00m
Jan 31 03:45:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:45.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:45 np0005603622 kernel: tap0fddbfa7-25: entered promiscuous mode
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.362 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:45:45Z|00712|binding|INFO|Claiming lport 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b for this chassis.
Jan 31 03:45:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:45:45Z|00713|binding|INFO|0fddbfa7-25a8-4db3-972f-a0ad7dc8830b: Claiming fa:16:3e:e3:2a:0f 10.100.0.9
Jan 31 03:45:45 np0005603622 NetworkManager[49080]: <info>  [1769849145.3709] manager: (tap0fddbfa7-25): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 31 03:45:45 np0005603622 systemd-udevd[283001]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.389 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:45:45Z|00714|binding|INFO|Setting lport 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b ovn-installed in OVS
Jan 31 03:45:45 np0005603622 nova_compute[221301]: 2026-01-31 08:45:45.395 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:45 np0005603622 systemd-machined[190406]: New machine qemu-73-instance-0000009b.
Jan 31 03:45:45 np0005603622 NetworkManager[49080]: <info>  [1769849145.4023] device (tap0fddbfa7-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:45:45 np0005603622 systemd[1]: Started Virtual Machine qemu-73-instance-0000009b.
Jan 31 03:45:45 np0005603622 NetworkManager[49080]: <info>  [1769849145.4044] device (tap0fddbfa7-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:45:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:45:45Z|00715|binding|INFO|Setting lport 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b up in Southbound
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.759 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:2a:0f 10.100.0.9'], port_security=['fa:16:3e:e3:2a:0f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4f01e554-c0c2-4645-9cd1-b01fec7a9be2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f45314d9274542ac86156e752065d617', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7caa50e7-37e2-4568-b5f0-e171e05077ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b1a0fbe-8d23-4175-99a7-7808bf8ad5d5, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.761 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b in datapath 93884f74-bf8f-4a8d-8d2e-f855a21eaddc bound to our chassis#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.763 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 93884f74-bf8f-4a8d-8d2e-f855a21eaddc#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.772 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[45aa88a6-247a-43ea-8534-8e731cd97b64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.773 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap93884f74-b1 in ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.776 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap93884f74-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.776 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[46c83790-be11-44ae-a31a-5ae6fe7c4c8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.777 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6ffe2811-d874-4097-b60b-03ee32e24e4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.785 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[edc10a24-0135-421e-a50f-b8f90997ea8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.794 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4e852eb4-ec2e-4870-bc07-1a39b169adc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.815 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[47fd97ad-0e31-4c0b-a83a-1f3ca94c151f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.820 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[91408528-3885-4c31-9e2a-ffaf5e1ac0d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 NetworkManager[49080]: <info>  [1769849145.8219] manager: (tap93884f74-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/318)
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.847 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[33579593-c258-449c-9427-d898f33aefc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.850 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e11217ab-6e8e-4fb0-b697-68a742ed6f87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 NetworkManager[49080]: <info>  [1769849145.8664] device (tap93884f74-b0): carrier: link connected
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.869 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a87c8dc2-4276-467b-baea-5faf136662a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.880 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f63327f8-b7c9-4362-8d44-5a614cb79420]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93884f74-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:5c:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 839720, 'reachable_time': 23376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283091, 'error': None, 'target': 'ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.890 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[87d103f7-5d7a-4540-9cb1-e79695e5d11c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:5cfc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 839720, 'tstamp': 839720}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283092, 'error': None, 'target': 'ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.901 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a507469d-f2dd-44e1-aa5c-5afaec86d58e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93884f74-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:5c:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 839720, 'reachable_time': 23376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283093, 'error': None, 'target': 'ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.921 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[04a35304-6dc1-49b5-9f63-9c55f7915ba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.980 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e04d1859-aaf4-434b-8c90-f7988a4c9815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.983 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93884f74-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.983 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:45:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:45.984 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93884f74-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:45:46 np0005603622 kernel: tap93884f74-b0: entered promiscuous mode
Jan 31 03:45:46 np0005603622 NetworkManager[49080]: <info>  [1769849146.0182] manager: (tap93884f74-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.020 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:46.020 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap93884f74-b0, col_values=(('external_ids', {'iface-id': 'ae08d78c-2eff-41cb-894b-1f7248151c28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.022 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:45:46Z|00716|binding|INFO|Releasing lport ae08d78c-2eff-41cb-894b-1f7248151c28 from this chassis (sb_readonly=0)
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:46.030 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/93884f74-bf8f-4a8d-8d2e-f855a21eaddc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/93884f74-bf8f-4a8d-8d2e-f855a21eaddc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.028 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:46.031 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[58be392a-46aa-4ab9-82de-ac35bb40e809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:46.032 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-93884f74-bf8f-4a8d-8d2e-f855a21eaddc
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/93884f74-bf8f-4a8d-8d2e-f855a21eaddc.pid.haproxy
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 93884f74-bf8f-4a8d-8d2e-f855a21eaddc
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:45:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:46.033 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'env', 'PROCESS_TAG=haproxy-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/93884f74-bf8f-4a8d-8d2e-f855a21eaddc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.168 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849146.167777, 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.168 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] VM Started (Lifecycle Event)#033[00m
Jan 31 03:45:46 np0005603622 podman[283167]: 2026-01-31 08:45:46.390382571 +0000 UTC m=+0.081372603 container create 066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.409 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.414 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849146.167964, 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.414 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:45:46 np0005603622 podman[283167]: 2026-01-31 08:45:46.332692891 +0000 UTC m=+0.023682953 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:45:46 np0005603622 systemd[1]: Started libpod-conmon-066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7.scope.
Jan 31 03:45:46 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:45:46 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12913e13aa5c303693f58ae000af15c4c5b4bed3c093ebd0ec5ad4a4ad80a419/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:45:46 np0005603622 podman[283167]: 2026-01-31 08:45:46.511753019 +0000 UTC m=+0.202743061 container init 066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:45:46 np0005603622 podman[283167]: 2026-01-31 08:45:46.516984299 +0000 UTC m=+0.207974341 container start 066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:45:46 np0005603622 neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc[283182]: [NOTICE]   (283186) : New worker (283188) forked
Jan 31 03:45:46 np0005603622 neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc[283182]: [NOTICE]   (283186) : Loading success.
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.592 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.596 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.621 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:45:46 np0005603622 nova_compute[221301]: 2026-01-31 08:45:46.751 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:45:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:46.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:47.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:47 np0005603622 ceph-osd[79107]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 31 03:45:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:48.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:49 np0005603622 nova_compute[221301]: 2026-01-31 08:45:49.212 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:45:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:49.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.194 221324 DEBUG nova.compute.manager [req-2c816e20-753a-41e4-8cc8-756dc73d0a7d req-8b5ffbb9-5a9f-43ce-99a3-a7360dd26e8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received event network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.194 221324 DEBUG oslo_concurrency.lockutils [req-2c816e20-753a-41e4-8cc8-756dc73d0a7d req-8b5ffbb9-5a9f-43ce-99a3-a7360dd26e8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.195 221324 DEBUG oslo_concurrency.lockutils [req-2c816e20-753a-41e4-8cc8-756dc73d0a7d req-8b5ffbb9-5a9f-43ce-99a3-a7360dd26e8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.195 221324 DEBUG oslo_concurrency.lockutils [req-2c816e20-753a-41e4-8cc8-756dc73d0a7d req-8b5ffbb9-5a9f-43ce-99a3-a7360dd26e8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.195 221324 DEBUG nova.compute.manager [req-2c816e20-753a-41e4-8cc8-756dc73d0a7d req-8b5ffbb9-5a9f-43ce-99a3-a7360dd26e8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Processing event network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.196 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.200 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849150.199986, 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.200 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] VM Resumed (Lifecycle Event)
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.202 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.206 221324 INFO nova.virt.libvirt.driver [-] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Instance spawned successfully.
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.206 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.312 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.320 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.324 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.324 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.325 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.325 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.325 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.326 221324 DEBUG nova.virt.libvirt.driver [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.478 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.703 221324 INFO nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Took 17.34 seconds to spawn the instance on the hypervisor.
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.703 221324 DEBUG nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:45:50 np0005603622 nova_compute[221301]: 2026-01-31 08:45:50.931 221324 INFO nova.compute.manager [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Took 20.90 seconds to build instance.
Jan 31 03:45:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:50.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:51.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:51 np0005603622 nova_compute[221301]: 2026-01-31 08:45:51.466 221324 DEBUG oslo_concurrency.lockutils [None req-fdcd984a-f880-4ac1-b981-5ddab85b3b5b e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:45:51 np0005603622 nova_compute[221301]: 2026-01-31 08:45:51.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:45:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:45:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:52.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:53.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:54.057 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:45:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:54.058 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 03:45:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:45:54.059 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.214 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.223 221324 DEBUG nova.compute.manager [req-b9c2873e-c805-4d06-9939-415a97d04ba0 req-78acc941-f427-4f7a-8905-742b97509bbf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received event network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.223 221324 DEBUG oslo_concurrency.lockutils [req-b9c2873e-c805-4d06-9939-415a97d04ba0 req-78acc941-f427-4f7a-8905-742b97509bbf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.224 221324 DEBUG oslo_concurrency.lockutils [req-b9c2873e-c805-4d06-9939-415a97d04ba0 req-78acc941-f427-4f7a-8905-742b97509bbf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.224 221324 DEBUG oslo_concurrency.lockutils [req-b9c2873e-c805-4d06-9939-415a97d04ba0 req-78acc941-f427-4f7a-8905-742b97509bbf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.224 221324 DEBUG nova.compute.manager [req-b9c2873e-c805-4d06-9939-415a97d04ba0 req-78acc941-f427-4f7a-8905-742b97509bbf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] No waiting events found dispatching network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:45:54 np0005603622 nova_compute[221301]: 2026-01-31 08:45:54.224 221324 WARNING nova.compute.manager [req-b9c2873e-c805-4d06-9939-415a97d04ba0 req-78acc941-f427-4f7a-8905-742b97509bbf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received unexpected event network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b for instance with vm_state active and task_state None.
Jan 31 03:45:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:54.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:55.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:56 np0005603622 nova_compute[221301]: 2026-01-31 08:45:56.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:45:56 np0005603622 nova_compute[221301]: 2026-01-31 08:45:56.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:45:56 np0005603622 nova_compute[221301]: 2026-01-31 08:45:56.625 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:45:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:56.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:45:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:57.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:45:57 np0005603622 nova_compute[221301]: 2026-01-31 08:45:57.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:45:58 np0005603622 podman[283249]: 2026-01-31 08:45:58.1496543 +0000 UTC m=+0.067919183 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:45:58 np0005603622 podman[283248]: 2026-01-31 08:45:58.154963482 +0000 UTC m=+0.079062511 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:45:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:45:58.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:45:59 np0005603622 nova_compute[221301]: 2026-01-31 08:45:59.216 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:45:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:45:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:45:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:45:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:45:59.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:00 np0005603622 nova_compute[221301]: 2026-01-31 08:46:00.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:46:00 np0005603622 nova_compute[221301]: 2026-01-31 08:46:00.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:46:00 np0005603622 nova_compute[221301]: 2026-01-31 08:46:00.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:46:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:00.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:01.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.539 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.539 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.580 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.581 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.582 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.582 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.583 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:01 np0005603622 nova_compute[221301]: 2026-01-31 08:46:01.627 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:46:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3765239010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.048 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.224 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.225 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-0000009b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.395 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.396 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4209MB free_disk=20.92184066772461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.396 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.396 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.842 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.844 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.844 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:46:02 np0005603622 nova_compute[221301]: 2026-01-31 08:46:02.962 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:02.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:46:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2134027346' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:46:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:03.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:46:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1891323133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:46:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:03Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e3:2a:0f 10.100.0.9
Jan 31 03:46:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:03Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e3:2a:0f 10.100.0.9
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.739 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.777s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.743 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.938 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.939 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.940 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.940 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.940 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.942 221324 INFO nova.compute.manager [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Terminating instance#033[00m
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.943 221324 DEBUG nova.compute.manager [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:46:03 np0005603622 kernel: tap0fddbfa7-25 (unregistering): left promiscuous mode
Jan 31 03:46:03 np0005603622 NetworkManager[49080]: <info>  [1769849163.9911] device (tap0fddbfa7-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:46:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:03Z|00717|binding|INFO|Releasing lport 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b from this chassis (sb_readonly=0)
Jan 31 03:46:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:03Z|00718|binding|INFO|Setting lport 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b down in Southbound
Jan 31 03:46:03 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.997 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:03Z|00719|binding|INFO|Removing iface tap0fddbfa7-25 ovn-installed in OVS
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:03.999 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 31 03:46:04 np0005603622 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009b.scope: Consumed 12.887s CPU time.
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.040 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:46:04 np0005603622 systemd-machined[190406]: Machine qemu-73-instance-0000009b terminated.
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.156 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:2a:0f 10.100.0.9'], port_security=['fa:16:3e:e3:2a:0f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4f01e554-c0c2-4645-9cd1-b01fec7a9be2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f45314d9274542ac86156e752065d617', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7caa50e7-37e2-4568-b5f0-e171e05077ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b1a0fbe-8d23-4175-99a7-7808bf8ad5d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.157 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b in datapath 93884f74-bf8f-4a8d-8d2e-f855a21eaddc unbound from our chassis#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.158 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 93884f74-bf8f-4a8d-8d2e-f855a21eaddc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.159 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f8701fc0-02e5-42dd-b59a-386e03d3d670]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.160 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc namespace which is not needed anymore#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.182 221324 INFO nova.virt.libvirt.driver [-] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Instance destroyed successfully.#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.183 221324 DEBUG nova.objects.instance [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lazy-loading 'resources' on Instance uuid 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.217 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:04 np0005603622 neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc[283182]: [NOTICE]   (283186) : haproxy version is 2.8.14-c23fe91
Jan 31 03:46:04 np0005603622 neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc[283182]: [NOTICE]   (283186) : path to executable is /usr/sbin/haproxy
Jan 31 03:46:04 np0005603622 neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc[283182]: [WARNING]  (283186) : Exiting Master process...
Jan 31 03:46:04 np0005603622 neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc[283182]: [ALERT]    (283186) : Current worker (283188) exited with code 143 (Terminated)
Jan 31 03:46:04 np0005603622 neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc[283182]: [WARNING]  (283186) : All workers exited. Exiting... (0)
Jan 31 03:46:04 np0005603622 systemd[1]: libpod-066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7.scope: Deactivated successfully.
Jan 31 03:46:04 np0005603622 podman[283372]: 2026-01-31 08:46:04.281902945 +0000 UTC m=+0.041969011 container died 066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:46:04 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7-userdata-shm.mount: Deactivated successfully.
Jan 31 03:46:04 np0005603622 systemd[1]: var-lib-containers-storage-overlay-12913e13aa5c303693f58ae000af15c4c5b4bed3c093ebd0ec5ad4a4ad80a419-merged.mount: Deactivated successfully.
Jan 31 03:46:04 np0005603622 podman[283372]: 2026-01-31 08:46:04.322479048 +0000 UTC m=+0.082545124 container cleanup 066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:46:04 np0005603622 systemd[1]: libpod-conmon-066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7.scope: Deactivated successfully.
Jan 31 03:46:04 np0005603622 podman[283401]: 2026-01-31 08:46:04.388683245 +0000 UTC m=+0.050453758 container remove 066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.393 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[560488d8-bfe3-4e4e-ad1d-0924ad74608f]: (4, ('Sat Jan 31 08:46:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc (066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7)\n066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7\nSat Jan 31 08:46:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc (066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7)\n066d56c1d173c9a63ae9806ecedae6a25285e5b79dc38a24f00a3b3d377b70c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.395 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c184272b-8e07-431e-85eb-e1e1b83ac83a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.395 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93884f74-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.397 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 kernel: tap93884f74-b0: left promiscuous mode
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.405 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.406 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.409 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9c71cda3-5914-4dd4-b08f-18592f6cee3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.435 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c59f973d-975e-4b8c-abb8-c61c550ddee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.436 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b7b56058-789e-4d37-b038-770c8bc97ee2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.452 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1a2fa5b9-9a18-432c-ae89-3165b496400c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 839715, 'reachable_time': 15602, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283421, 'error': None, 'target': 'ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 systemd[1]: run-netns-ovnmeta\x2d93884f74\x2dbf8f\x2d4a8d\x2d8d2e\x2df855a21eaddc.mount: Deactivated successfully.
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.455 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-93884f74-bf8f-4a8d-8d2e-f855a21eaddc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:46:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:04.456 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[0e9e168e-bf33-4f0e-8cd8-26ea01624525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.754 221324 DEBUG nova.virt.libvirt.vif [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:45:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-362125567',display_name='tempest-ServerMetadataTestJSON-server-362125567',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-362125567',id=155,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:45:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f45314d9274542ac86156e752065d617',ramdisk_id='',reservation_id='r-z6pteacm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='vi
rtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-1419266361',owner_user_name='tempest-ServerMetadataTestJSON-1419266361-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:46:02Z,user_data=None,user_id='e5e1edb815004883bd699ac3ac702cdb',uuid=4f01e554-c0c2-4645-9cd1-b01fec7a9be2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.755 221324 DEBUG nova.network.os_vif_util [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Converting VIF {"id": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "address": "fa:16:3e:e3:2a:0f", "network": {"id": "93884f74-bf8f-4a8d-8d2e-f855a21eaddc", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-874117325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f45314d9274542ac86156e752065d617", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fddbfa7-25", "ovs_interfaceid": "0fddbfa7-25a8-4db3-972f-a0ad7dc8830b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.756 221324 DEBUG nova.network.os_vif_util [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2a:0f,bridge_name='br-int',has_traffic_filtering=True,id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b,network=Network(93884f74-bf8f-4a8d-8d2e-f855a21eaddc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fddbfa7-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.756 221324 DEBUG os_vif [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2a:0f,bridge_name='br-int',has_traffic_filtering=True,id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b,network=Network(93884f74-bf8f-4a8d-8d2e-f855a21eaddc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fddbfa7-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.757 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.758 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fddbfa7-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.759 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.761 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.763 221324 INFO os_vif [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2a:0f,bridge_name='br-int',has_traffic_filtering=True,id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b,network=Network(93884f74-bf8f-4a8d-8d2e-f855a21eaddc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fddbfa7-25')#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.842 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:46:04 np0005603622 nova_compute[221301]: 2026-01-31 08:46:04.843 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:05.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:05.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.483 221324 INFO nova.virt.libvirt.driver [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Deleting instance files /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2_del#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.484 221324 INFO nova.virt.libvirt.driver [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Deletion of /var/lib/nova/instances/4f01e554-c0c2-4645-9cd1-b01fec7a9be2_del complete#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.700 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.701 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.701 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.899 221324 DEBUG nova.compute.manager [req-f650be62-1d61-4df3-91c2-dca5342ff4dc req-9bfe55cc-fad8-4568-92c4-1f6837905fcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received event network-vif-unplugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.900 221324 DEBUG oslo_concurrency.lockutils [req-f650be62-1d61-4df3-91c2-dca5342ff4dc req-9bfe55cc-fad8-4568-92c4-1f6837905fcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.900 221324 DEBUG oslo_concurrency.lockutils [req-f650be62-1d61-4df3-91c2-dca5342ff4dc req-9bfe55cc-fad8-4568-92c4-1f6837905fcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.900 221324 DEBUG oslo_concurrency.lockutils [req-f650be62-1d61-4df3-91c2-dca5342ff4dc req-9bfe55cc-fad8-4568-92c4-1f6837905fcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.901 221324 DEBUG nova.compute.manager [req-f650be62-1d61-4df3-91c2-dca5342ff4dc req-9bfe55cc-fad8-4568-92c4-1f6837905fcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] No waiting events found dispatching network-vif-unplugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.901 221324 DEBUG nova.compute.manager [req-f650be62-1d61-4df3-91c2-dca5342ff4dc req-9bfe55cc-fad8-4568-92c4-1f6837905fcd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received event network-vif-unplugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.916 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 03:46:05 np0005603622 nova_compute[221301]: 2026-01-31 08:46:05.917 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:46:06 np0005603622 nova_compute[221301]: 2026-01-31 08:46:06.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:06 np0005603622 nova_compute[221301]: 2026-01-31 08:46:06.398 221324 INFO nova.compute.manager [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Took 2.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:46:06 np0005603622 nova_compute[221301]: 2026-01-31 08:46:06.399 221324 DEBUG oslo.service.loopingcall [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:46:06 np0005603622 nova_compute[221301]: 2026-01-31 08:46:06.400 221324 DEBUG nova.compute.manager [-] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:46:06 np0005603622 nova_compute[221301]: 2026-01-31 08:46:06.400 221324 DEBUG nova.network.neutron [-] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.688739) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166689406, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2380, "num_deletes": 252, "total_data_size": 5700241, "memory_usage": 5782192, "flush_reason": "Manual Compaction"}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166721822, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3726365, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 66519, "largest_seqno": 68894, "table_properties": {"data_size": 3716782, "index_size": 6012, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19999, "raw_average_key_size": 20, "raw_value_size": 3697597, "raw_average_value_size": 3792, "num_data_blocks": 262, "num_entries": 975, "num_filter_entries": 975, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769848956, "oldest_key_time": 1769848956, "file_creation_time": 1769849166, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 33156 microseconds, and 10424 cpu microseconds.
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.721896) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3726365 bytes OK
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.721921) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.723680) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.723705) EVENT_LOG_v1 {"time_micros": 1769849166723697, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.723730) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5689869, prev total WAL file size 5689869, number of live WAL files 2.
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.725364) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3639KB)], [135(11MB)]
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166725467, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 15343486, "oldest_snapshot_seqno": -1}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 9110 keys, 13361354 bytes, temperature: kUnknown
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166831114, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 13361354, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13300297, "index_size": 37200, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22789, "raw_key_size": 238865, "raw_average_key_size": 26, "raw_value_size": 13138363, "raw_average_value_size": 1442, "num_data_blocks": 1432, "num_entries": 9110, "num_filter_entries": 9110, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849166, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.831338) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 13361354 bytes
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.832898) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.2 rd, 126.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 11.1 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 9635, records dropped: 525 output_compression: NoCompression
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.832914) EVENT_LOG_v1 {"time_micros": 1769849166832906, "job": 86, "event": "compaction_finished", "compaction_time_micros": 105698, "compaction_time_cpu_micros": 47483, "output_level": 6, "num_output_files": 1, "total_output_size": 13361354, "num_input_records": 9635, "num_output_records": 9110, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166833259, "job": 86, "event": "table_file_deletion", "file_number": 137}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849166834299, "job": 86, "event": "table_file_deletion", "file_number": 135}
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.725086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.834427) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.834435) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.834440) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.834444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:06 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:06.834448) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:07.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:07.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:09.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.152 221324 DEBUG nova.compute.manager [req-605187dd-7948-4151-af67-93cdcaadac8d req-2f27e1a6-e9a3-4b9f-89fa-603bbc924202 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received event network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.153 221324 DEBUG oslo_concurrency.lockutils [req-605187dd-7948-4151-af67-93cdcaadac8d req-2f27e1a6-e9a3-4b9f-89fa-603bbc924202 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.153 221324 DEBUG oslo_concurrency.lockutils [req-605187dd-7948-4151-af67-93cdcaadac8d req-2f27e1a6-e9a3-4b9f-89fa-603bbc924202 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.154 221324 DEBUG oslo_concurrency.lockutils [req-605187dd-7948-4151-af67-93cdcaadac8d req-2f27e1a6-e9a3-4b9f-89fa-603bbc924202 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.154 221324 DEBUG nova.compute.manager [req-605187dd-7948-4151-af67-93cdcaadac8d req-2f27e1a6-e9a3-4b9f-89fa-603bbc924202 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] No waiting events found dispatching network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.155 221324 WARNING nova.compute.manager [req-605187dd-7948-4151-af67-93cdcaadac8d req-2f27e1a6-e9a3-4b9f-89fa-603bbc924202 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received unexpected event network-vif-plugged-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b for instance with vm_state active and task_state deleting.
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.221 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:09.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:09 np0005603622 nova_compute[221301]: 2026-01-31 08:46:09.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:10 np0005603622 nova_compute[221301]: 2026-01-31 08:46:10.801 221324 DEBUG nova.network.neutron [-] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:46:10 np0005603622 nova_compute[221301]: 2026-01-31 08:46:10.961 221324 DEBUG nova.compute.manager [req-6265d191-010f-4340-aa20-c408ed8586a7 req-45c70399-9bdf-4d8d-b649-cdd5cfa1f7f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Received event network-vif-deleted-0fddbfa7-25a8-4db3-972f-a0ad7dc8830b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:46:10 np0005603622 nova_compute[221301]: 2026-01-31 08:46:10.962 221324 INFO nova.compute.manager [req-6265d191-010f-4340-aa20-c408ed8586a7 req-45c70399-9bdf-4d8d-b649-cdd5cfa1f7f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Neutron deleted interface 0fddbfa7-25a8-4db3-972f-a0ad7dc8830b; detaching it from the instance and deleting it from the info cache
Jan 31 03:46:10 np0005603622 nova_compute[221301]: 2026-01-31 08:46:10.962 221324 DEBUG nova.network.neutron [req-6265d191-010f-4340-aa20-c408ed8586a7 req-45c70399-9bdf-4d8d-b649-cdd5cfa1f7f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:46:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:11.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.094 221324 INFO nova.compute.manager [-] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Took 4.69 seconds to deallocate network for instance.
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.103 221324 DEBUG nova.compute.manager [req-6265d191-010f-4340-aa20-c408ed8586a7 req-45c70399-9bdf-4d8d-b649-cdd5cfa1f7f1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Detach interface failed, port_id=0fddbfa7-25a8-4db3-972f-a0ad7dc8830b, reason: Instance 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.293 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.294 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.370 221324 DEBUG oslo_concurrency.processutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:46:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:11.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:46:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:46:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2476190541' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.799 221324 DEBUG oslo_concurrency.processutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.806 221324 DEBUG nova.compute.provider_tree [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.852 221324 DEBUG nova.scheduler.client.report [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:46:11 np0005603622 nova_compute[221301]: 2026-01-31 08:46:11.956 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:46:12 np0005603622 nova_compute[221301]: 2026-01-31 08:46:12.013 221324 INFO nova.scheduler.client.report [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Deleted allocations for instance 4f01e554-c0c2-4645-9cd1-b01fec7a9be2
Jan 31 03:46:12 np0005603622 nova_compute[221301]: 2026-01-31 08:46:12.276 221324 DEBUG oslo_concurrency.lockutils [None req-93f7b63b-13e4-4aec-9fd7-1f07791180d1 e5e1edb815004883bd699ac3ac702cdb f45314d9274542ac86156e752065d617 - - default default] Lock "4f01e554-c0c2-4645-9cd1-b01fec7a9be2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:46:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:13.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:13.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:14 np0005603622 nova_compute[221301]: 2026-01-31 08:46:14.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:14 np0005603622 nova_compute[221301]: 2026-01-31 08:46:14.774 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:15.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:15.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:17.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:17.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:19.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:19 np0005603622 nova_compute[221301]: 2026-01-31 08:46:19.182 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849164.180653, 4f01e554-c0c2-4645-9cd1-b01fec7a9be2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:46:19 np0005603622 nova_compute[221301]: 2026-01-31 08:46:19.182 221324 INFO nova.compute.manager [-] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] VM Stopped (Lifecycle Event)
Jan 31 03:46:19 np0005603622 nova_compute[221301]: 2026-01-31 08:46:19.224 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:19 np0005603622 nova_compute[221301]: 2026-01-31 08:46:19.240 221324 DEBUG nova.compute.manager [None req-defb02c5-8229-4373-bf10-03105c8c23e0 - - - - - -] [instance: 4f01e554-c0c2-4645-9cd1-b01fec7a9be2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:46:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:19.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:19 np0005603622 nova_compute[221301]: 2026-01-31 08:46:19.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:21.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:21.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:21.903354) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849181903467, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 401, "num_deletes": 251, "total_data_size": 375834, "memory_usage": 383280, "flush_reason": "Manual Compaction"}
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849181973707, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 231490, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 68900, "largest_seqno": 69295, "table_properties": {"data_size": 229197, "index_size": 392, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6275, "raw_average_key_size": 20, "raw_value_size": 224646, "raw_average_value_size": 729, "num_data_blocks": 18, "num_entries": 308, "num_filter_entries": 308, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849167, "oldest_key_time": 1769849167, "file_creation_time": 1769849181, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 70428 microseconds, and 2847 cpu microseconds.
Jan 31 03:46:21 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:21.973785) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 231490 bytes OK
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:21.973822) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.057825) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.057880) EVENT_LOG_v1 {"time_micros": 1769849182057868, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.057909) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 373244, prev total WAL file size 373244, number of live WAL files 2.
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.058704) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323533' seq:72057594037927935, type:22 .. '6D6772737461740032353035' seq:0, type:0; will stop at (end)
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(226KB)], [138(12MB)]
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182058775, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 13592844, "oldest_snapshot_seqno": -1}
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 8909 keys, 9756410 bytes, temperature: kUnknown
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182163898, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 9756410, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9701404, "index_size": 31597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22341, "raw_key_size": 234900, "raw_average_key_size": 26, "raw_value_size": 9547713, "raw_average_value_size": 1071, "num_data_blocks": 1198, "num_entries": 8909, "num_filter_entries": 8909, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849182, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.164178) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 9756410 bytes
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.171574) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.2 rd, 92.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 12.7 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(100.9) write-amplify(42.1) OK, records in: 9418, records dropped: 509 output_compression: NoCompression
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.171615) EVENT_LOG_v1 {"time_micros": 1769849182171597, "job": 88, "event": "compaction_finished", "compaction_time_micros": 105219, "compaction_time_cpu_micros": 34606, "output_level": 6, "num_output_files": 1, "total_output_size": 9756410, "num_input_records": 9418, "num_output_records": 8909, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182171893, "job": 88, "event": "table_file_deletion", "file_number": 140}
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849182173182, "job": 88, "event": "table_file_deletion", "file_number": 138}
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.058605) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.173383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.173399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.173402) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.173404) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:46:22.173406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:46:22 np0005603622 nova_compute[221301]: 2026-01-31 08:46:22.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:46:22 np0005603622 nova_compute[221301]: 2026-01-31 08:46:22.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 31 03:46:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:23.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:23.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:24 np0005603622 nova_compute[221301]: 2026-01-31 08:46:24.226 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:24 np0005603622 nova_compute[221301]: 2026-01-31 08:46:24.777 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:25.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:27.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:27.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:27 np0005603622 nova_compute[221301]: 2026-01-31 08:46:27.435 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:46:28 np0005603622 nova_compute[221301]: 2026-01-31 08:46:28.272 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "802cab69-67de-4c6f-b952-e9947bc4e48c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:46:28 np0005603622 nova_compute[221301]: 2026-01-31 08:46:28.273 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:46:28 np0005603622 nova_compute[221301]: 2026-01-31 08:46:28.488 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:46:28 np0005603622 nova_compute[221301]: 2026-01-31 08:46:28.973 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:46:28 np0005603622 nova_compute[221301]: 2026-01-31 08:46:28.974 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:46:28 np0005603622 nova_compute[221301]: 2026-01-31 08:46:28.989 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:46:28 np0005603622 nova_compute[221301]: 2026-01-31 08:46:28.989 221324 INFO nova.compute.claims [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:46:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:29.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:29 np0005603622 podman[283464]: 2026-01-31 08:46:29.159022216 +0000 UTC m=+0.076648176 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:46:29 np0005603622 podman[283463]: 2026-01-31 08:46:29.194997146 +0000 UTC m=+0.109075772 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 03:46:29 np0005603622 nova_compute[221301]: 2026-01-31 08:46:29.228 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:29.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:29 np0005603622 nova_compute[221301]: 2026-01-31 08:46:29.571 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:29 np0005603622 nova_compute[221301]: 2026-01-31 08:46:29.827 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:46:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3344877734' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.078 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.088 221324 DEBUG nova.compute.provider_tree [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.157 221324 DEBUG nova.scheduler.client.report [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.385 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.386 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.634 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.635 221324 DEBUG nova.network.neutron [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:46:30 np0005603622 nova_compute[221301]: 2026-01-31 08:46:30.706 221324 INFO nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:46:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:30.904 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:30.905 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:30.905 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:31.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.064 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:46:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:31.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.439 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.442 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.443 221324 INFO nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Creating image(s)#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.487 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.529 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.558 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.562 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.635 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.636 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.636 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.637 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.662 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:31 np0005603622 nova_compute[221301]: 2026-01-31 08:46:31.665 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 802cab69-67de-4c6f-b952-e9947bc4e48c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.051 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 802cab69-67de-4c6f-b952-e9947bc4e48c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.137 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] resizing rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.260 221324 DEBUG nova.objects.instance [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lazy-loading 'migration_context' on Instance uuid 802cab69-67de-4c6f-b952-e9947bc4e48c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.369 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.370 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Ensure instance console log exists: /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.371 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.372 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.372 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:32 np0005603622 nova_compute[221301]: 2026-01-31 08:46:32.411 221324 DEBUG nova.policy [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd169542a81014a91ac7ff995f44e45a5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd8d191d40d924e418ffde5e8b5483cf4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:46:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:33.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:33.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:34 np0005603622 nova_compute[221301]: 2026-01-31 08:46:34.268 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:34 np0005603622 nova_compute[221301]: 2026-01-31 08:46:34.829 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:35.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:35.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:35 np0005603622 nova_compute[221301]: 2026-01-31 08:46:35.890 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:35 np0005603622 nova_compute[221301]: 2026-01-31 08:46:35.890 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:46:36 np0005603622 nova_compute[221301]: 2026-01-31 08:46:36.063 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:46:36 np0005603622 nova_compute[221301]: 2026-01-31 08:46:36.083 221324 DEBUG nova.network.neutron [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Successfully created port: 960d9122-17be-47d7-b9ba-d2d9dea04b65 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:46:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:37.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:37.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.201 221324 DEBUG nova.network.neutron [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Successfully updated port: 960d9122-17be-47d7-b9ba-d2d9dea04b65 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.270 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:39.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.542 221324 DEBUG nova.compute.manager [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Received event network-changed-960d9122-17be-47d7-b9ba-d2d9dea04b65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.542 221324 DEBUG nova.compute.manager [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Refreshing instance network info cache due to event network-changed-960d9122-17be-47d7-b9ba-d2d9dea04b65. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.543 221324 DEBUG oslo_concurrency.lockutils [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-802cab69-67de-4c6f-b952-e9947bc4e48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.543 221324 DEBUG oslo_concurrency.lockutils [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-802cab69-67de-4c6f-b952-e9947bc4e48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.543 221324 DEBUG nova.network.neutron [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Refreshing network info cache for port 960d9122-17be-47d7-b9ba-d2d9dea04b65 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.546 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "refresh_cache-802cab69-67de-4c6f-b952-e9947bc4e48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:46:39 np0005603622 nova_compute[221301]: 2026-01-31 08:46:39.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:40 np0005603622 nova_compute[221301]: 2026-01-31 08:46:40.029 221324 DEBUG nova.network.neutron [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:46:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:41.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:41 np0005603622 nova_compute[221301]: 2026-01-31 08:46:41.233 221324 DEBUG nova.network.neutron [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:46:41 np0005603622 nova_compute[221301]: 2026-01-31 08:46:41.272 221324 DEBUG oslo_concurrency.lockutils [req-8e571931-5e49-4367-ab09-798f2da3dd1c req-2e8427c2-aaca-4533-ac79-a7c84c914e0f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-802cab69-67de-4c6f-b952-e9947bc4e48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:46:41 np0005603622 nova_compute[221301]: 2026-01-31 08:46:41.273 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquired lock "refresh_cache-802cab69-67de-4c6f-b952-e9947bc4e48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:46:41 np0005603622 nova_compute[221301]: 2026-01-31 08:46:41.273 221324 DEBUG nova.network.neutron [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:46:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:41.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:42 np0005603622 nova_compute[221301]: 2026-01-31 08:46:42.072 221324 DEBUG nova.network.neutron [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:46:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:43.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:43.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:44 np0005603622 nova_compute[221301]: 2026-01-31 08:46:44.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:44 np0005603622 nova_compute[221301]: 2026-01-31 08:46:44.567 221324 DEBUG nova.network.neutron [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Updating instance_info_cache with network_info: [{"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:46:44 np0005603622 nova_compute[221301]: 2026-01-31 08:46:44.865 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:45.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.166 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Releasing lock "refresh_cache-802cab69-67de-4c6f-b952-e9947bc4e48c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.166 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Instance network_info: |[{"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.169 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Start _get_guest_xml network_info=[{"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.176 221324 WARNING nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.187 221324 DEBUG nova.virt.libvirt.host [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.188 221324 DEBUG nova.virt.libvirt.host [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.208 221324 DEBUG nova.virt.libvirt.host [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.209 221324 DEBUG nova.virt.libvirt.host [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.210 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.211 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.211 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.211 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.211 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.212 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.212 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.212 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.212 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.213 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.213 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.213 221324 DEBUG nova.virt.hardware [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.217 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:45.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:46:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3007170401' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.646 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.684 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:45 np0005603622 nova_compute[221301]: 2026-01-31 08:46:45.689 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:46:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4025810976' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.260 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.262 221324 DEBUG nova.virt.libvirt.vif [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:46:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1974979382',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1974979382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1974979382',id=157,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8d191d40d924e418ffde5e8b5483cf4',ramdisk_id='',reservation_id='r-2o6y1jgu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-289
523751',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-289523751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:46:31Z,user_data=None,user_id='d169542a81014a91ac7ff995f44e45a5',uuid=802cab69-67de-4c6f-b952-e9947bc4e48c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.263 221324 DEBUG nova.network.os_vif_util [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Converting VIF {"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.265 221324 DEBUG nova.network.os_vif_util [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:23,bridge_name='br-int',has_traffic_filtering=True,id=960d9122-17be-47d7-b9ba-d2d9dea04b65,network=Network(ef6b632f-b593-411e-97a8-ffb8767a0eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960d9122-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.267 221324 DEBUG nova.objects.instance [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 802cab69-67de-4c6f-b952-e9947bc4e48c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.348 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <uuid>802cab69-67de-4c6f-b952-e9947bc4e48c</uuid>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <name>instance-0000009d</name>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-1974979382</nova:name>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:46:45</nova:creationTime>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:user uuid="d169542a81014a91ac7ff995f44e45a5">tempest-ServersNegativeTestMultiTenantJSON-289523751-project-member</nova:user>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:project uuid="d8d191d40d924e418ffde5e8b5483cf4">tempest-ServersNegativeTestMultiTenantJSON-289523751</nova:project>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <nova:port uuid="960d9122-17be-47d7-b9ba-d2d9dea04b65">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <entry name="serial">802cab69-67de-4c6f-b952-e9947bc4e48c</entry>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <entry name="uuid">802cab69-67de-4c6f-b952-e9947bc4e48c</entry>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/802cab69-67de-4c6f-b952-e9947bc4e48c_disk">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/802cab69-67de-4c6f-b952-e9947bc4e48c_disk.config">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:78:d6:23"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <target dev="tap960d9122-17"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/console.log" append="off"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:46:46 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:46:46 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:46:46 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:46:46 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.349 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Preparing to wait for external event network-vif-plugged-960d9122-17be-47d7-b9ba-d2d9dea04b65 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.349 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.349 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.350 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.350 221324 DEBUG nova.virt.libvirt.vif [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:46:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1974979382',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1974979382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1974979382',id=157,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8d191d40d924e418ffde5e8b5483cf4',ramdisk_id='',reservation_id='r-2o6y1jgu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-289523751',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-289523751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:46:31Z,user_data=None,user_id='d169542a81014a91ac7ff995f44e45a5',uuid=802cab69-67de-4c6f-b952-e9947bc4e48c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.351 221324 DEBUG nova.network.os_vif_util [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Converting VIF {"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.351 221324 DEBUG nova.network.os_vif_util [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:23,bridge_name='br-int',has_traffic_filtering=True,id=960d9122-17be-47d7-b9ba-d2d9dea04b65,network=Network(ef6b632f-b593-411e-97a8-ffb8767a0eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960d9122-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.352 221324 DEBUG os_vif [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:23,bridge_name='br-int',has_traffic_filtering=True,id=960d9122-17be-47d7-b9ba-d2d9dea04b65,network=Network(ef6b632f-b593-411e-97a8-ffb8767a0eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960d9122-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.352 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.353 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.353 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.357 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.358 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap960d9122-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.358 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap960d9122-17, col_values=(('external_ids', {'iface-id': '960d9122-17be-47d7-b9ba-d2d9dea04b65', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:d6:23', 'vm-uuid': '802cab69-67de-4c6f-b952-e9947bc4e48c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.361 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:46:46 np0005603622 NetworkManager[49080]: <info>  [1769849206.3619] manager: (tap960d9122-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.364 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.365 221324 INFO os_vif [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:23,bridge_name='br-int',has_traffic_filtering=True,id=960d9122-17be-47d7-b9ba-d2d9dea04b65,network=Network(ef6b632f-b593-411e-97a8-ffb8767a0eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960d9122-17')#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.649 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.649 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.650 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] No VIF found with MAC fa:16:3e:78:d6:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.650 221324 INFO nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Using config drive#033[00m
Jan 31 03:46:46 np0005603622 nova_compute[221301]: 2026-01-31 08:46:46.674 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:47.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:47.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:47 np0005603622 nova_compute[221301]: 2026-01-31 08:46:47.455 221324 INFO nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Creating config drive at /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/disk.config#033[00m
Jan 31 03:46:47 np0005603622 nova_compute[221301]: 2026-01-31 08:46:47.459 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7ll_kvin execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:47 np0005603622 nova_compute[221301]: 2026-01-31 08:46:47.582 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7ll_kvin" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:47 np0005603622 nova_compute[221301]: 2026-01-31 08:46:47.740 221324 DEBUG nova.storage.rbd_utils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] rbd image 802cab69-67de-4c6f-b952-e9947bc4e48c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:46:47 np0005603622 nova_compute[221301]: 2026-01-31 08:46:47.744 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/disk.config 802cab69-67de-4c6f-b952-e9947bc4e48c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:46:48 np0005603622 nova_compute[221301]: 2026-01-31 08:46:48.648 221324 DEBUG oslo_concurrency.processutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/disk.config 802cab69-67de-4c6f-b952-e9947bc4e48c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.904s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:46:48 np0005603622 nova_compute[221301]: 2026-01-31 08:46:48.649 221324 INFO nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Deleting local config drive /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c/disk.config because it was imported into RBD.#033[00m
Jan 31 03:46:48 np0005603622 kernel: tap960d9122-17: entered promiscuous mode
Jan 31 03:46:48 np0005603622 NetworkManager[49080]: <info>  [1769849208.6955] manager: (tap960d9122-17): new Tun device (/org/freedesktop/NetworkManager/Devices/321)
Jan 31 03:46:48 np0005603622 nova_compute[221301]: 2026-01-31 08:46:48.739 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:48Z|00720|binding|INFO|Claiming lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 for this chassis.
Jan 31 03:46:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:48Z|00721|binding|INFO|960d9122-17be-47d7-b9ba-d2d9dea04b65: Claiming fa:16:3e:78:d6:23 10.100.0.9
Jan 31 03:46:48 np0005603622 nova_compute[221301]: 2026-01-31 08:46:48.744 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:48 np0005603622 nova_compute[221301]: 2026-01-31 08:46:48.747 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:48Z|00722|binding|INFO|Setting lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 ovn-installed in OVS
Jan 31 03:46:48 np0005603622 nova_compute[221301]: 2026-01-31 08:46:48.765 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:48 np0005603622 systemd-machined[190406]: New machine qemu-74-instance-0000009d.
Jan 31 03:46:48 np0005603622 systemd-udevd[283832]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:46:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:48Z|00723|binding|INFO|Setting lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 up in Southbound
Jan 31 03:46:48 np0005603622 systemd[1]: Started Virtual Machine qemu-74-instance-0000009d.
Jan 31 03:46:48 np0005603622 NetworkManager[49080]: <info>  [1769849208.7885] device (tap960d9122-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.788 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:d6:23 10.100.0.9'], port_security=['fa:16:3e:78:d6:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '802cab69-67de-4c6f-b952-e9947bc4e48c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8d191d40d924e418ffde5e8b5483cf4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69d7985a-09cc-4059-a9d3-12844670a26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad10d70f-c31d-4925-940e-6cf6169568b4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=960d9122-17be-47d7-b9ba-d2d9dea04b65) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:48 np0005603622 NetworkManager[49080]: <info>  [1769849208.7896] device (tap960d9122-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.789 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 960d9122-17be-47d7-b9ba-d2d9dea04b65 in datapath ef6b632f-b593-411e-97a8-ffb8767a0eb8 bound to our chassis#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.791 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef6b632f-b593-411e-97a8-ffb8767a0eb8#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.804 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5822621b-a25a-4a05-9f7f-a182b2b2c928]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.805 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef6b632f-b1 in ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.808 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef6b632f-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.808 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc3e81f-5c57-47fd-9b5f-d08d8a76898c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.809 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[671694d1-1b5d-4f68-8442-4681dced15d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.817 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc12eae-8d6e-47ed-8264-9a4450589bc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.828 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2dd260-c5d4-4fc5-848a-b77282eb6d00]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.849 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[97a59536-9ed0-4f77-84b8-628e068ef998]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.854 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6a900d-2bf5-4702-8ac8-39030b3174a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 NetworkManager[49080]: <info>  [1769849208.8554] manager: (tapef6b632f-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/322)
Jan 31 03:46:48 np0005603622 systemd-udevd[283834]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.873 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1e5ad7b0-2f8e-4e36-b143-9714378508dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.877 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[42a5fa14-f6c2-46b8-b526-2ee4cf016443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 NetworkManager[49080]: <info>  [1769849208.8938] device (tapef6b632f-b0): carrier: link connected
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.897 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2952e326-ec15-4a25-a295-a91de25c8d37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.911 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea98d18-b60a-44d6-a8ca-1819496c4aa4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef6b632f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:51:36:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 846023, 'reachable_time': 32595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283865, 'error': None, 'target': 'ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.926 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5c2c55-606e-4be7-9776-8f487b2fd76b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe51:36e2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 846023, 'tstamp': 846023}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283866, 'error': None, 'target': 'ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.943 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4cd8e9-1df6-4227-a947-22d43677b2b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef6b632f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:51:36:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 846023, 'reachable_time': 32595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283867, 'error': None, 'target': 'ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:48.965 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fd87fff4-23b2-49dd-b3bd-4c2750ded86a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.008 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a64fb8fc-3f09-4fe0-8c7c-bd6c6d9873d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.009 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef6b632f-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.009 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.010 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef6b632f-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.011 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:49 np0005603622 NetworkManager[49080]: <info>  [1769849209.0118] manager: (tapef6b632f-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Jan 31 03:46:49 np0005603622 kernel: tapef6b632f-b0: entered promiscuous mode
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.014 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef6b632f-b0, col_values=(('external_ids', {'iface-id': '1ed2cb94-c0f2-4cd5-b550-d822e4dc2fe1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:49Z|00724|binding|INFO|Releasing lport 1ed2cb94-c0f2-4cd5-b550-d822e4dc2fe1 from this chassis (sb_readonly=0)
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.023 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.023 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef6b632f-b593-411e-97a8-ffb8767a0eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef6b632f-b593-411e-97a8-ffb8767a0eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.024 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbc0dbe-f1cc-4f82-9890-427e1c33e604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.025 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-ef6b632f-b593-411e-97a8-ffb8767a0eb8
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/ef6b632f-b593-411e-97a8-ffb8767a0eb8.pid.haproxy
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID ef6b632f-b593-411e-97a8-ffb8767a0eb8
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:46:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:49.025 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'env', 'PROCESS_TAG=haproxy-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef6b632f-b593-411e-97a8-ffb8767a0eb8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:46:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:49.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.272 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.370 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849209.3703332, 802cab69-67de-4c6f-b952-e9947bc4e48c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.371 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] VM Started (Lifecycle Event)#033[00m
Jan 31 03:46:49 np0005603622 podman[283936]: 2026-01-31 08:46:49.324608253 +0000 UTC m=+0.026349844 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:46:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:46:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:49.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.459 221324 DEBUG nova.compute.manager [req-2f72c5eb-1368-4e03-b51a-84f8371de934 req-b2e3d132-e395-4fba-906d-90b8db514c8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Received event network-vif-plugged-960d9122-17be-47d7-b9ba-d2d9dea04b65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.460 221324 DEBUG oslo_concurrency.lockutils [req-2f72c5eb-1368-4e03-b51a-84f8371de934 req-b2e3d132-e395-4fba-906d-90b8db514c8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.461 221324 DEBUG oslo_concurrency.lockutils [req-2f72c5eb-1368-4e03-b51a-84f8371de934 req-b2e3d132-e395-4fba-906d-90b8db514c8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.461 221324 DEBUG oslo_concurrency.lockutils [req-2f72c5eb-1368-4e03-b51a-84f8371de934 req-b2e3d132-e395-4fba-906d-90b8db514c8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.462 221324 DEBUG nova.compute.manager [req-2f72c5eb-1368-4e03-b51a-84f8371de934 req-b2e3d132-e395-4fba-906d-90b8db514c8a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Processing event network-vif-plugged-960d9122-17be-47d7-b9ba-d2d9dea04b65 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.463 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.468 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.475 221324 INFO nova.virt.libvirt.driver [-] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Instance spawned successfully.#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.476 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.578 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.582 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.610 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.611 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.611 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.612 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.613 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.613 221324 DEBUG nova.virt.libvirt.driver [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:46:49 np0005603622 podman[283936]: 2026-01-31 08:46:49.822793127 +0000 UTC m=+0.524534758 container create aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:46:49 np0005603622 systemd[1]: Started libpod-conmon-aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102.scope.
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.887 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.889 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849209.3704689, 802cab69-67de-4c6f-b952-e9947bc4e48c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.895 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:46:49 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:46:49 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f11298d912afb19ff550c606ff8f13fbe1285ad54ffb7e46eebc351628d1b096/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.984 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.991 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849209.4671962, 802cab69-67de-4c6f-b952-e9947bc4e48c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:46:49 np0005603622 nova_compute[221301]: 2026-01-31 08:46:49.991 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:46:50 np0005603622 nova_compute[221301]: 2026-01-31 08:46:50.006 221324 INFO nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Took 18.57 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:46:50 np0005603622 nova_compute[221301]: 2026-01-31 08:46:50.006 221324 DEBUG nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:50 np0005603622 podman[283936]: 2026-01-31 08:46:50.086295899 +0000 UTC m=+0.788037520 container init aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:46:50 np0005603622 podman[283936]: 2026-01-31 08:46:50.092891005 +0000 UTC m=+0.794632586 container start aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:46:50 np0005603622 neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8[283956]: [NOTICE]   (283960) : New worker (283962) forked
Jan 31 03:46:50 np0005603622 neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8[283956]: [NOTICE]   (283960) : Loading success.
Jan 31 03:46:50 np0005603622 nova_compute[221301]: 2026-01-31 08:46:50.327 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:46:50 np0005603622 nova_compute[221301]: 2026-01-31 08:46:50.331 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:46:50 np0005603622 nova_compute[221301]: 2026-01-31 08:46:50.492 221324 INFO nova.compute.manager [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Took 21.58 seconds to build instance.#033[00m
Jan 31 03:46:50 np0005603622 nova_compute[221301]: 2026-01-31 08:46:50.976 221324 DEBUG oslo_concurrency.lockutils [None req-7852d98c-631e-4658-b35c-d87f680d0389 d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:51.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:51 np0005603622 nova_compute[221301]: 2026-01-31 08:46:51.361 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:51.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:51 np0005603622 nova_compute[221301]: 2026-01-31 08:46:51.694 221324 DEBUG nova.compute.manager [req-86cf2847-5daa-401e-b901-f6f2069f7a1d req-385bbdeb-5ab1-4c5e-832c-2fb32c13a521 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Received event network-vif-plugged-960d9122-17be-47d7-b9ba-d2d9dea04b65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:46:51 np0005603622 nova_compute[221301]: 2026-01-31 08:46:51.696 221324 DEBUG oslo_concurrency.lockutils [req-86cf2847-5daa-401e-b901-f6f2069f7a1d req-385bbdeb-5ab1-4c5e-832c-2fb32c13a521 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:51 np0005603622 nova_compute[221301]: 2026-01-31 08:46:51.696 221324 DEBUG oslo_concurrency.lockutils [req-86cf2847-5daa-401e-b901-f6f2069f7a1d req-385bbdeb-5ab1-4c5e-832c-2fb32c13a521 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:51 np0005603622 nova_compute[221301]: 2026-01-31 08:46:51.697 221324 DEBUG oslo_concurrency.lockutils [req-86cf2847-5daa-401e-b901-f6f2069f7a1d req-385bbdeb-5ab1-4c5e-832c-2fb32c13a521 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:51 np0005603622 nova_compute[221301]: 2026-01-31 08:46:51.697 221324 DEBUG nova.compute.manager [req-86cf2847-5daa-401e-b901-f6f2069f7a1d req-385bbdeb-5ab1-4c5e-832c-2fb32c13a521 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] No waiting events found dispatching network-vif-plugged-960d9122-17be-47d7-b9ba-d2d9dea04b65 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:46:51 np0005603622 nova_compute[221301]: 2026-01-31 08:46:51.697 221324 WARNING nova.compute.manager [req-86cf2847-5daa-401e-b901-f6f2069f7a1d req-385bbdeb-5ab1-4c5e-832c-2fb32c13a521 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Received unexpected event network-vif-plugged-960d9122-17be-47d7-b9ba-d2d9dea04b65 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:46:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:46:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:46:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:46:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:53.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:53.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:54 np0005603622 nova_compute[221301]: 2026-01-31 08:46:54.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:55.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:55.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:55 np0005603622 nova_compute[221301]: 2026-01-31 08:46:55.737 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:55.739 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:55.741 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:46:56 np0005603622 nova_compute[221301]: 2026-01-31 08:46:56.363 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:57.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:46:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:57.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.570 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.767 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "802cab69-67de-4c6f-b952-e9947bc4e48c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.767 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.768 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.768 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.769 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.770 221324 INFO nova.compute.manager [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Terminating instance#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.772 221324 DEBUG nova.compute.manager [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:46:57 np0005603622 kernel: tap960d9122-17 (unregistering): left promiscuous mode
Jan 31 03:46:57 np0005603622 NetworkManager[49080]: <info>  [1769849217.8254] device (tap960d9122-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:46:57 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:57Z|00725|binding|INFO|Releasing lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 from this chassis (sb_readonly=0)
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.833 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:57 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:57Z|00726|binding|INFO|Setting lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 down in Southbound
Jan 31 03:46:57 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:57Z|00727|binding|INFO|Removing iface tap960d9122-17 ovn-installed in OVS
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.837 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:57 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.842 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:57 np0005603622 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009d.scope: Deactivated successfully.
Jan 31 03:46:57 np0005603622 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009d.scope: Consumed 9.006s CPU time.
Jan 31 03:46:57 np0005603622 systemd-machined[190406]: Machine qemu-74-instance-0000009d terminated.
Jan 31 03:46:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:57.985 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:d6:23 10.100.0.9'], port_security=['fa:16:3e:78:d6:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '802cab69-67de-4c6f-b952-e9947bc4e48c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8d191d40d924e418ffde5e8b5483cf4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69d7985a-09cc-4059-a9d3-12844670a26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad10d70f-c31d-4925-940e-6cf6169568b4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=960d9122-17be-47d7-b9ba-d2d9dea04b65) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:57.987 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 960d9122-17be-47d7-b9ba-d2d9dea04b65 in datapath ef6b632f-b593-411e-97a8-ffb8767a0eb8 unbound from our chassis#033[00m
Jan 31 03:46:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:57.988 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef6b632f-b593-411e-97a8-ffb8767a0eb8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:46:57 np0005603622 kernel: tap960d9122-17: entered promiscuous mode
Jan 31 03:46:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:57.990 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9c056f-8863-4fcd-905b-406b300ddd17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:57.991 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8 namespace which is not needed anymore#033[00m
Jan 31 03:46:57 np0005603622 kernel: tap960d9122-17 (unregistering): left promiscuous mode
Jan 31 03:46:57 np0005603622 NetworkManager[49080]: <info>  [1769849217.9932] manager: (tap960d9122-17): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:57.999 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:57Z|00728|binding|INFO|Claiming lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 for this chassis.
Jan 31 03:46:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:57Z|00729|binding|INFO|960d9122-17be-47d7-b9ba-d2d9dea04b65: Claiming fa:16:3e:78:d6:23 10.100.0.9
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.014 221324 INFO nova.virt.libvirt.driver [-] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Instance destroyed successfully.#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.015 221324 DEBUG nova.objects.instance [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lazy-loading 'resources' on Instance uuid 802cab69-67de-4c6f-b952-e9947bc4e48c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.019 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:58Z|00730|binding|INFO|Setting lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 ovn-installed in OVS
Jan 31 03:46:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:58Z|00731|if_status|INFO|Dropped 4 log messages in last 1608 seconds (most recently, 1608 seconds ago) due to excessive rate
Jan 31 03:46:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:58Z|00732|if_status|INFO|Not setting lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 down as sb is readonly
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.024 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:d6:23 10.100.0.9'], port_security=['fa:16:3e:78:d6:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '802cab69-67de-4c6f-b952-e9947bc4e48c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8d191d40d924e418ffde5e8b5483cf4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69d7985a-09cc-4059-a9d3-12844670a26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad10d70f-c31d-4925-940e-6cf6169568b4, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=960d9122-17be-47d7-b9ba-d2d9dea04b65) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:58 np0005603622 ovn_controller[130109]: 2026-01-31T08:46:58Z|00733|binding|INFO|Releasing lport 960d9122-17be-47d7-b9ba-d2d9dea04b65 from this chassis (sb_readonly=0)
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.032 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.069 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:d6:23 10.100.0.9'], port_security=['fa:16:3e:78:d6:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '802cab69-67de-4c6f-b952-e9947bc4e48c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8d191d40d924e418ffde5e8b5483cf4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69d7985a-09cc-4059-a9d3-12844670a26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad10d70f-c31d-4925-940e-6cf6169568b4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=960d9122-17be-47d7-b9ba-d2d9dea04b65) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.109 221324 DEBUG nova.virt.libvirt.vif [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:46:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1974979382',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1974979382',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1974979382',id=157,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:46:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d8d191d40d924e418ffde5e8b5483cf4',ramdisk_id='',reservation_id='r-2o6y1jgu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-289523751',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-289523751-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:46:50Z,user_data=None,user_id='d169542a81014a91ac7ff995f44e45a5',uuid=802cab69-67de-4c6f-b952-e9947bc4e48c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.109 221324 DEBUG nova.network.os_vif_util [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Converting VIF {"id": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "address": "fa:16:3e:78:d6:23", "network": {"id": "ef6b632f-b593-411e-97a8-ffb8767a0eb8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-308301110-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d8d191d40d924e418ffde5e8b5483cf4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap960d9122-17", "ovs_interfaceid": "960d9122-17be-47d7-b9ba-d2d9dea04b65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.110 221324 DEBUG nova.network.os_vif_util [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:23,bridge_name='br-int',has_traffic_filtering=True,id=960d9122-17be-47d7-b9ba-d2d9dea04b65,network=Network(ef6b632f-b593-411e-97a8-ffb8767a0eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960d9122-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.110 221324 DEBUG os_vif [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:23,bridge_name='br-int',has_traffic_filtering=True,id=960d9122-17be-47d7-b9ba-d2d9dea04b65,network=Network(ef6b632f-b593-411e-97a8-ffb8767a0eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960d9122-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:46:58 np0005603622 neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8[283956]: [NOTICE]   (283960) : haproxy version is 2.8.14-c23fe91
Jan 31 03:46:58 np0005603622 neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8[283956]: [NOTICE]   (283960) : path to executable is /usr/sbin/haproxy
Jan 31 03:46:58 np0005603622 neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8[283956]: [WARNING]  (283960) : Exiting Master process...
Jan 31 03:46:58 np0005603622 neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8[283956]: [ALERT]    (283960) : Current worker (283962) exited with code 143 (Terminated)
Jan 31 03:46:58 np0005603622 neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8[283956]: [WARNING]  (283960) : All workers exited. Exiting... (0)
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.112 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.112 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap960d9122-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:58 np0005603622 systemd[1]: libpod-aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102.scope: Deactivated successfully.
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.116 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.118 221324 INFO os_vif [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:d6:23,bridge_name='br-int',has_traffic_filtering=True,id=960d9122-17be-47d7-b9ba-d2d9dea04b65,network=Network(ef6b632f-b593-411e-97a8-ffb8767a0eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap960d9122-17')#033[00m
Jan 31 03:46:58 np0005603622 podman[284186]: 2026-01-31 08:46:58.122082129 +0000 UTC m=+0.048845874 container died aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:46:58 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102-userdata-shm.mount: Deactivated successfully.
Jan 31 03:46:58 np0005603622 systemd[1]: var-lib-containers-storage-overlay-f11298d912afb19ff550c606ff8f13fbe1285ad54ffb7e46eebc351628d1b096-merged.mount: Deactivated successfully.
Jan 31 03:46:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:46:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:46:58 np0005603622 podman[284186]: 2026-01-31 08:46:58.187551787 +0000 UTC m=+0.114315512 container cleanup aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 03:46:58 np0005603622 systemd[1]: libpod-conmon-aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102.scope: Deactivated successfully.
Jan 31 03:46:58 np0005603622 podman[284234]: 2026-01-31 08:46:58.267476779 +0000 UTC m=+0.062145909 container remove aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.271 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9d158658-531d-41d6-b043-1f2b99f71312]: (4, ('Sat Jan 31 08:46:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8 (aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102)\naa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102\nSat Jan 31 08:46:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8 (aa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102)\naa7270cb8cc702d2943837cde3e88872d085436a201b18408afe64c1ae9ea102\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.272 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6998f9-3d7d-4f8d-a280-8ea1d2a6270b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.274 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef6b632f-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.276 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 kernel: tapef6b632f-b0: left promiscuous mode
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.278 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.280 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c364268b-3466-44e8-9979-df295de6a7bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.283 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.298 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[345562f9-470c-4883-8492-66ddd1b2ef3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.299 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ee67007f-d9de-4355-ac24-0205e8f80325]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.314 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[897e523a-20f9-45a3-a97d-9353062ee22e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 846018, 'reachable_time': 44790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284250, 'error': None, 'target': 'ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.316 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef6b632f-b593-411e-97a8-ffb8767a0eb8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.316 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[8b758d54-a5c1-4f4a-a54c-9180058d2f04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.317 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 960d9122-17be-47d7-b9ba-d2d9dea04b65 in datapath ef6b632f-b593-411e-97a8-ffb8767a0eb8 unbound from our chassis#033[00m
Jan 31 03:46:58 np0005603622 systemd[1]: run-netns-ovnmeta\x2def6b632f\x2db593\x2d411e\x2d97a8\x2dffb8767a0eb8.mount: Deactivated successfully.
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.319 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef6b632f-b593-411e-97a8-ffb8767a0eb8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.320 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a86e5c06-316c-4768-a223-0b4075078aca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.320 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 960d9122-17be-47d7-b9ba-d2d9dea04b65 in datapath ef6b632f-b593-411e-97a8-ffb8767a0eb8 unbound from our chassis#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.322 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef6b632f-b593-411e-97a8-ffb8767a0eb8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:46:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:58.322 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8acf6962-a863-49a0-b03e-94120789e19b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.867 221324 INFO nova.virt.libvirt.driver [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Deleting instance files /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c_del#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.869 221324 INFO nova.virt.libvirt.driver [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Deletion of /var/lib/nova/instances/802cab69-67de-4c6f-b952-e9947bc4e48c_del complete#033[00m
Jan 31 03:46:58 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.999 221324 INFO nova.compute.manager [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Took 1.23 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:46:59 np0005603622 nova_compute[221301]: 2026-01-31 08:46:58.999 221324 DEBUG oslo.service.loopingcall [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:46:59 np0005603622 nova_compute[221301]: 2026-01-31 08:46:59.000 221324 DEBUG nova.compute.manager [-] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:46:59 np0005603622 nova_compute[221301]: 2026-01-31 08:46:59.000 221324 DEBUG nova.network.neutron [-] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:46:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:46:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:46:59.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:46:59 np0005603622 nova_compute[221301]: 2026-01-31 08:46:59.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:46:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:46:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:46:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:46:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:46:59.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:46:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:46:59.742 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:00 np0005603622 podman[284253]: 2026-01-31 08:47:00.147906778 +0000 UTC m=+0.068675123 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:47:00 np0005603622 podman[284252]: 2026-01-31 08:47:00.228515189 +0000 UTC m=+0.148351799 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Jan 31 03:47:00 np0005603622 nova_compute[221301]: 2026-01-31 08:47:00.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.074 221324 DEBUG nova.network.neutron [-] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:01.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.193 221324 DEBUG nova.compute.manager [req-dbc588d8-358b-4ca9-ac45-7cf17fee8c76 req-7b5c6eb0-6027-4558-a3a9-698e1bd45a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Received event network-vif-deleted-960d9122-17be-47d7-b9ba-d2d9dea04b65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.193 221324 INFO nova.compute.manager [req-dbc588d8-358b-4ca9-ac45-7cf17fee8c76 req-7b5c6eb0-6027-4558-a3a9-698e1bd45a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Neutron deleted interface 960d9122-17be-47d7-b9ba-d2d9dea04b65; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.194 221324 DEBUG nova.network.neutron [req-dbc588d8-358b-4ca9-ac45-7cf17fee8c76 req-7b5c6eb0-6027-4558-a3a9-698e1bd45a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.209 221324 INFO nova.compute.manager [-] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Took 2.21 seconds to deallocate network for instance.#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:01.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.455 221324 DEBUG nova.compute.manager [req-dbc588d8-358b-4ca9-ac45-7cf17fee8c76 req-7b5c6eb0-6027-4558-a3a9-698e1bd45a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Detach interface failed, port_id=960d9122-17be-47d7-b9ba-d2d9dea04b65, reason: Instance 802cab69-67de-4c6f-b952-e9947bc4e48c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.516 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.517 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.518 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.518 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.518 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.582 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.583 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.635 221324 DEBUG nova.scheduler.client.report [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.678 221324 DEBUG nova.scheduler.client.report [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.680 221324 DEBUG nova.compute.provider_tree [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.707 221324 DEBUG nova.scheduler.client.report [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.832 221324 DEBUG nova.scheduler.client.report [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.931 221324 DEBUG oslo_concurrency.processutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:01 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3777675159' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:01 np0005603622 nova_compute[221301]: 2026-01-31 08:47:01.957 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:02 np0005603622 nova_compute[221301]: 2026-01-31 08:47:02.118 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:47:02 np0005603622 nova_compute[221301]: 2026-01-31 08:47:02.119 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4388MB free_disk=20.94631576538086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:47:02 np0005603622 nova_compute[221301]: 2026-01-31 08:47:02.120 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/53824139' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:02 np0005603622 nova_compute[221301]: 2026-01-31 08:47:02.385 221324 DEBUG oslo_concurrency.processutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:02 np0005603622 nova_compute[221301]: 2026-01-31 08:47:02.391 221324 DEBUG nova.compute.provider_tree [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:47:02 np0005603622 nova_compute[221301]: 2026-01-31 08:47:02.905 221324 DEBUG nova.scheduler.client.report [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:47:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:47:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:03.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:47:03 np0005603622 nova_compute[221301]: 2026-01-31 08:47:03.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:03.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:03 np0005603622 nova_compute[221301]: 2026-01-31 08:47:03.454 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:03 np0005603622 nova_compute[221301]: 2026-01-31 08:47:03.458 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 1.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:03 np0005603622 nova_compute[221301]: 2026-01-31 08:47:03.780 221324 INFO nova.scheduler.client.report [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Deleted allocations for instance 802cab69-67de-4c6f-b952-e9947bc4e48c#033[00m
Jan 31 03:47:04 np0005603622 nova_compute[221301]: 2026-01-31 08:47:04.326 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:04 np0005603622 nova_compute[221301]: 2026-01-31 08:47:04.617 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 has allocations against this compute host but is not found in the database.#033[00m
Jan 31 03:47:04 np0005603622 nova_compute[221301]: 2026-01-31 08:47:04.618 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:47:04 np0005603622 nova_compute[221301]: 2026-01-31 08:47:04.618 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:47:04 np0005603622 nova_compute[221301]: 2026-01-31 08:47:04.656 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.005 221324 DEBUG oslo_concurrency.lockutils [None req-472793de-ad54-4643-b002-a866d595891d d169542a81014a91ac7ff995f44e45a5 d8d191d40d924e418ffde5e8b5483cf4 - - default default] Lock "802cab69-67de-4c6f-b952-e9947bc4e48c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:05.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2402355646' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.120 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.124 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.190 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.372 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.373 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.374 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.374 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:05.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:05 np0005603622 nova_compute[221301]: 2026-01-31 08:47:05.594 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.071 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.072 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.080 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.080 221324 INFO nova.compute.claims [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.376 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.376 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.376 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.544 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.545 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.545 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.546 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.628 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.899 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:06 np0005603622 nova_compute[221301]: 2026-01-31 08:47:06.900 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.059 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:47:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:07 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/520269733' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:07.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.104 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.109 221324 DEBUG nova.compute.provider_tree [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.193 221324 DEBUG nova.scheduler.client.report [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.406 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.411 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.411 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.413 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.419 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.419 221324 INFO nova.compute.claims [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:47:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:07.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.715 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.716 221324 DEBUG nova.network.neutron [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.831 221324 INFO nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.916 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:47:07 np0005603622 nova_compute[221301]: 2026-01-31 08:47:07.924 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.118 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.231 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.233 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.234 221324 INFO nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Creating image(s)
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.270 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.300 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.329 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.332 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:47:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:47:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1341434340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.366 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.371 221324 DEBUG nova.compute.provider_tree [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.383 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.384 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.384 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.384 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.410 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.414 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.432 221324 DEBUG nova.scheduler.client.report [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.496 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.497 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.877 221324 DEBUG nova.policy [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a498364761ef428b99cac3f92e603385', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8397e0fed04b4dabb57148d0924de2dc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.926 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.927 221324 DEBUG nova.network.neutron [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:47:08 np0005603622 nova_compute[221301]: 2026-01-31 08:47:08.991 221324 INFO nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.063 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:47:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:47:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:09.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.235 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.236 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.237 221324 INFO nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Creating image(s)
Jan 31 03:47:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:47:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:09.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:47:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.541 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.571 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.605 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.611 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.645 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.652 221324 DEBUG nova.policy [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa7f893021af4a84b03d85b476dadfe0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.688 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.690 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.691 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.692 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.726 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.732 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.799 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.386s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:47:09 np0005603622 nova_compute[221301]: 2026-01-31 08:47:09.885 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] resizing rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.001 221324 DEBUG nova.objects.instance [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'migration_context' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.033 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.033 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Ensure instance console log exists: /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.033 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.034 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.034 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.557 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.826s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:47:10 np0005603622 nova_compute[221301]: 2026-01-31 08:47:10.635 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] resizing rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.045 221324 DEBUG nova.objects.instance [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'migration_context' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:47:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:11.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.126 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.126 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Ensure instance console log exists: /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.127 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.127 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.128 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:47:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:11.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:11 np0005603622 nova_compute[221301]: 2026-01-31 08:47:11.757 221324 DEBUG nova.network.neutron [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Successfully created port: 8caf8cff-a513-4677-929d-ade94fee3082 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:47:13 np0005603622 nova_compute[221301]: 2026-01-31 08:47:13.011 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849218.0105846, 802cab69-67de-4c6f-b952-e9947bc4e48c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:47:13 np0005603622 nova_compute[221301]: 2026-01-31 08:47:13.012 221324 INFO nova.compute.manager [-] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] VM Stopped (Lifecycle Event)
Jan 31 03:47:13 np0005603622 nova_compute[221301]: 2026-01-31 08:47:13.070 221324 DEBUG nova.compute.manager [None req-3df38569-e202-4255-b844-f63e3797e5aa - - - - - -] [instance: 802cab69-67de-4c6f-b952-e9947bc4e48c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:47:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:47:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:13.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:47:13 np0005603622 nova_compute[221301]: 2026-01-31 08:47:13.121 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:13 np0005603622 nova_compute[221301]: 2026-01-31 08:47:13.287 221324 DEBUG nova.network.neutron [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Successfully created port: 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:47:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:13.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:14 np0005603622 nova_compute[221301]: 2026-01-31 08:47:14.070 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:14 np0005603622 nova_compute[221301]: 2026-01-31 08:47:14.332 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:15.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:47:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:15.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:47:15 np0005603622 nova_compute[221301]: 2026-01-31 08:47:15.674 221324 DEBUG nova.network.neutron [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Successfully updated port: 8caf8cff-a513-4677-929d-ade94fee3082 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:47:15 np0005603622 nova_compute[221301]: 2026-01-31 08:47:15.952 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:15 np0005603622 nova_compute[221301]: 2026-01-31 08:47:15.953 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:15 np0005603622 nova_compute[221301]: 2026-01-31 08:47:15.953 221324 DEBUG nova.network.neutron [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:47:16 np0005603622 nova_compute[221301]: 2026-01-31 08:47:16.126 221324 DEBUG nova.compute.manager [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-changed-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:16 np0005603622 nova_compute[221301]: 2026-01-31 08:47:16.127 221324 DEBUG nova.compute.manager [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Refreshing instance network info cache due to event network-changed-8caf8cff-a513-4677-929d-ade94fee3082. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:47:16 np0005603622 nova_compute[221301]: 2026-01-31 08:47:16.127 221324 DEBUG oslo_concurrency.lockutils [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:16 np0005603622 nova_compute[221301]: 2026-01-31 08:47:16.467 221324 DEBUG nova.network.neutron [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:47:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:17.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:17.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:17 np0005603622 nova_compute[221301]: 2026-01-31 08:47:17.576 221324 DEBUG nova.network.neutron [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Successfully updated port: 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:47:17 np0005603622 nova_compute[221301]: 2026-01-31 08:47:17.682 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:17 np0005603622 nova_compute[221301]: 2026-01-31 08:47:17.683 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:17 np0005603622 nova_compute[221301]: 2026-01-31 08:47:17.683 221324 DEBUG nova.network.neutron [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.106 221324 DEBUG nova.compute.manager [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-changed-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.106 221324 DEBUG nova.compute.manager [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Refreshing instance network info cache due to event network-changed-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.107 221324 DEBUG oslo_concurrency.lockutils [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.124 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.518 221324 DEBUG nova.network.neutron [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating instance_info_cache with network_info: [{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.535 221324 DEBUG nova.network.neutron [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.703 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.703 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Instance network_info: |[{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.703 221324 DEBUG oslo_concurrency.lockutils [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.704 221324 DEBUG nova.network.neutron [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Refreshing network info cache for port 8caf8cff-a513-4677-929d-ade94fee3082 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.706 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Start _get_guest_xml network_info=[{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.710 221324 WARNING nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.714 221324 DEBUG nova.virt.libvirt.host [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.715 221324 DEBUG nova.virt.libvirt.host [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.718 221324 DEBUG nova.virt.libvirt.host [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.718 221324 DEBUG nova.virt.libvirt.host [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.719 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.720 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.720 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.720 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.720 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.720 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.721 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.721 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.721 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.721 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.721 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.722 221324 DEBUG nova.virt.hardware [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:47:18 np0005603622 nova_compute[221301]: 2026-01-31 08:47:18.724 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:19.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:19 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3764736350' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.178 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.204 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.208 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:19.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:19 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/767621157' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.627 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.628 221324 DEBUG nova.virt.libvirt.vif [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=160,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-fxo6kllb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',netw
ork_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:47:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=6fd9c6e1-ddff-436d-b240-240ec06b2a15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.629 221324 DEBUG nova.network.os_vif_util [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.630 221324 DEBUG nova.network.os_vif_util [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=8caf8cff-a513-4677-929d-ade94fee3082,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8caf8cff-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.631 221324 DEBUG nova.objects.instance [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.661 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <uuid>6fd9c6e1-ddff-436d-b240-240ec06b2a15</uuid>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <name>instance-000000a0</name>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <nova:name>multiattach-server-1</nova:name>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:47:18</nova:creationTime>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:user uuid="a498364761ef428b99cac3f92e603385">tempest-AttachVolumeMultiAttachTest-1931311941-project-member</nova:user>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:project uuid="8397e0fed04b4dabb57148d0924de2dc">tempest-AttachVolumeMultiAttachTest-1931311941</nova:project>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <nova:port uuid="8caf8cff-a513-4677-929d-ade94fee3082">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <entry name="serial">6fd9c6e1-ddff-436d-b240-240ec06b2a15</entry>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <entry name="uuid">6fd9c6e1-ddff-436d-b240-240ec06b2a15</entry>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk.config">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:64:d6:1f"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <target dev="tap8caf8cff-a5"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/console.log" append="off"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:47:19 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:47:19 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:47:19 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:47:19 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.663 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Preparing to wait for external event network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.663 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.663 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.663 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.664 221324 DEBUG nova.virt.libvirt.vif [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=160,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-fxo6kllb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:47:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=6fd9c6e1-ddff-436d-b240-240ec06b2a15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.664 221324 DEBUG nova.network.os_vif_util [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.665 221324 DEBUG nova.network.os_vif_util [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=8caf8cff-a513-4677-929d-ade94fee3082,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8caf8cff-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.665 221324 DEBUG os_vif [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=8caf8cff-a513-4677-929d-ade94fee3082,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8caf8cff-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.666 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.666 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.666 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.668 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.669 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8caf8cff-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.669 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8caf8cff-a5, col_values=(('external_ids', {'iface-id': '8caf8cff-a513-4677-929d-ade94fee3082', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:d6:1f', 'vm-uuid': '6fd9c6e1-ddff-436d-b240-240ec06b2a15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:19 np0005603622 NetworkManager[49080]: <info>  [1769849239.6718] manager: (tap8caf8cff-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.672 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.675 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.676 221324 INFO os_vif [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=8caf8cff-a513-4677-929d-ade94fee3082,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8caf8cff-a5')#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.889 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.890 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.890 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:64:d6:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.890 221324 INFO nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Using config drive#033[00m
Jan 31 03:47:19 np0005603622 nova_compute[221301]: 2026-01-31 08:47:19.918 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.478 221324 DEBUG nova.network.neutron [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updating instance_info_cache with network_info: [{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.583 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.584 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance network_info: |[{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.585 221324 DEBUG oslo_concurrency.lockutils [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.586 221324 DEBUG nova.network.neutron [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Refreshing network info cache for port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.589 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Start _get_guest_xml network_info=[{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.595 221324 WARNING nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.601 221324 DEBUG nova.virt.libvirt.host [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.602 221324 DEBUG nova.virt.libvirt.host [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.607 221324 DEBUG nova.virt.libvirt.host [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.608 221324 DEBUG nova.virt.libvirt.host [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.609 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.610 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.610 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.610 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.611 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.611 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.611 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.611 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.612 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.612 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.612 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.613 221324 DEBUG nova.virt.hardware [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.616 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.957 221324 INFO nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Creating config drive at /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/disk.config#033[00m
Jan 31 03:47:20 np0005603622 nova_compute[221301]: 2026-01-31 08:47:20.963 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvvrkx1r8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2100838201' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.082 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.109 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.113 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:47:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:21.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.133 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvvrkx1r8" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.161 221324 DEBUG nova.storage.rbd_utils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.165 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/disk.config 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:47:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:21.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:47:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:47:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2994035794' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.582 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.583 221324 DEBUG nova.virt.libvirt.vif [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1544301339',display_name='tempest-ServerRescueNegativeTestJSON-server-1544301339',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1544301339',id=161,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-h0qz026s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='
tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:47:09Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.584 221324 DEBUG nova.network.os_vif_util [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.585 221324 DEBUG nova.network.os_vif_util [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.586 221324 DEBUG nova.objects.instance [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'pci_devices' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.615 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <uuid>4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e</uuid>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <name>instance-000000a1</name>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1544301339</nova:name>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:47:20</nova:creationTime>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:user uuid="aa7f893021af4a84b03d85b476dadfe0">tempest-ServerRescueNegativeTestJSON-81297706-project-member</nova:user>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:project uuid="bf1c3d387dbe4191b4d05bdfca5959da">tempest-ServerRescueNegativeTestJSON-81297706</nova:project>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <nova:port uuid="6e3ffb68-e766-4edd-8e73-e44bb1e58fd4">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <entry name="serial">4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e</entry>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <entry name="uuid">4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e</entry>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:b0:16:3c"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <target dev="tap6e3ffb68-e7"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/console.log" append="off"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:47:21 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:47:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:47:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:47:21 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.617 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Preparing to wait for external event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.617 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.617 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.618 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.618 221324 DEBUG nova.virt.libvirt.vif [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1544301339',display_name='tempest-ServerRescueNegativeTestJSON-server-1544301339',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1544301339',id=161,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-h0qz026s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:47:09Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.619 221324 DEBUG nova.network.os_vif_util [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.620 221324 DEBUG nova.network.os_vif_util [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.620 221324 DEBUG os_vif [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.621 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.621 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.622 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.624 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.624 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e3ffb68-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.625 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e3ffb68-e7, col_values=(('external_ids', {'iface-id': '6e3ffb68-e766-4edd-8e73-e44bb1e58fd4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:16:3c', 'vm-uuid': '4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:21 np0005603622 NetworkManager[49080]: <info>  [1769849241.6278] manager: (tap6e3ffb68-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.632 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.633 221324 INFO os_vif [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7')#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.773 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.773 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.774 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No VIF found with MAC fa:16:3e:b0:16:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.775 221324 INFO nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Using config drive#033[00m
Jan 31 03:47:21 np0005603622 nova_compute[221301]: 2026-01-31 08:47:21.806 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.073 221324 DEBUG nova.network.neutron [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updated VIF entry in instance network info cache for port 8caf8cff-a513-4677-929d-ade94fee3082. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.075 221324 DEBUG nova.network.neutron [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating instance_info_cache with network_info: [{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.095 221324 DEBUG oslo_concurrency.processutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/disk.config 6fd9c6e1-ddff-436d-b240-240ec06b2a15_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.930s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.096 221324 INFO nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Deleting local config drive /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15/disk.config because it was imported into RBD.#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.116 221324 DEBUG oslo_concurrency.lockutils [req-f1b472ba-1a97-49ab-8d05-4f1209ebe52c req-0710a890-314c-482c-8245-aafa142f0239 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:47:22 np0005603622 kernel: tap8caf8cff-a5: entered promiscuous mode
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.1504] manager: (tap8caf8cff-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/327)
Jan 31 03:47:22 np0005603622 systemd-udevd[284956]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:47:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:22Z|00734|binding|INFO|Claiming lport 8caf8cff-a513-4677-929d-ade94fee3082 for this chassis.
Jan 31 03:47:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:22Z|00735|binding|INFO|8caf8cff-a513-4677-929d-ade94fee3082: Claiming fa:16:3e:64:d6:1f 10.100.0.10
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.202 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.213 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.2170] device (tap8caf8cff-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.2177] device (tap8caf8cff-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.2222] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.221 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.2228] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Jan 31 03:47:22 np0005603622 systemd-machined[190406]: New machine qemu-75-instance-000000a0.
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.237 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:d6:1f 10.100.0.10'], port_security=['fa:16:3e:64:d6:1f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6fd9c6e1-ddff-436d-b240-240ec06b2a15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=8caf8cff-a513-4677-929d-ade94fee3082) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.238 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 8caf8cff-a513-4677-929d-ade94fee3082 in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 bound to our chassis#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.239 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.248 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[90b1b316-feac-4bdf-beac-2cf3497f9a8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.249 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3afaf607-41 in ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.250 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3afaf607-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.251 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[954e0599-2d82-42e2-a87d-b005edf561c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.251 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[df541b65-34ae-412d-8eda-5586fc3a9365]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.258 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d0003466-0b25-4937-b3cb-5666c98ce120]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 systemd[1]: Started Virtual Machine qemu-75-instance-000000a0.
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.266 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dad9c903-119b-41de-a257-1dabfbd085b7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.278 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.291 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7d57eeb1-0700-428e-8c00-53574ba8d4f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 systemd-udevd[284963]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.296 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1ad7b2-543e-4fa7-8737-cc0485fbe586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.297 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.2991] manager: (tap3afaf607-40): new Veth device (/org/freedesktop/NetworkManager/Devices/330)
Jan 31 03:47:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:22Z|00736|binding|INFO|Setting lport 8caf8cff-a513-4677-929d-ade94fee3082 ovn-installed in OVS
Jan 31 03:47:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:22Z|00737|binding|INFO|Setting lport 8caf8cff-a513-4677-929d-ade94fee3082 up in Southbound
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.300 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.317 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4e549460-5fdc-4422-91ed-4112fb46f4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.320 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[de151603-c5ab-475b-8374-4e69c666d523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.3332] device (tap3afaf607-40): carrier: link connected
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.337 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[219c7a83-47ff-435b-81a5-99345339b122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.349 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[21962e3d-a0a9-4f4f-a317-f49db54aa1a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849367, 'reachable_time': 42362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284998, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.362 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc7792a-35ab-41fa-8105-0b7c823aefc8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:8444'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 849367, 'tstamp': 849367}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284999, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.375 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[99c09534-33f0-47e2-81a8-157ef4c2fd69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849367, 'reachable_time': 42362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285000, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.398 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd42f5f-df6e-4429-9d0d-d9d5fa86df33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.440 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0068a86e-f6b0-47ae-a9a2-b28e787a784f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.441 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.442 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.442 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 NetworkManager[49080]: <info>  [1769849242.4446] manager: (tap3afaf607-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 31 03:47:22 np0005603622 kernel: tap3afaf607-40: entered promiscuous mode
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.448 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:22Z|00738|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.456 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3afaf607-43a1-4d65-95fc-0a22b5c901d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3afaf607-43a1-4d65-95fc-0a22b5c901d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.456 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.457 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[efd92ea1-e079-4d52-aa53-2b07243ee5b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.457 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-3afaf607-43a1-4d65-95fc-0a22b5c901d0
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/3afaf607-43a1-4d65-95fc-0a22b5c901d0.pid.haproxy
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 3afaf607-43a1-4d65-95fc-0a22b5c901d0
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:47:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:22.458 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'env', 'PROCESS_TAG=haproxy-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3afaf607-43a1-4d65-95fc-0a22b5c901d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.517 221324 INFO nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Creating config drive at /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.521 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4geljw2a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.648 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp4geljw2a" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.711 221324 DEBUG nova.storage.rbd_utils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:47:22 np0005603622 nova_compute[221301]: 2026-01-31 08:47:22.715 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:22 np0005603622 podman[285075]: 2026-01-31 08:47:22.780381466 +0000 UTC m=+0.025749578 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:47:22 np0005603622 podman[285075]: 2026-01-31 08:47:22.94053014 +0000 UTC m=+0.185898212 container create d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:47:23 np0005603622 systemd[1]: Started libpod-conmon-d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91.scope.
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.026 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849243.0253086, 6fd9c6e1-ddff-436d-b240-240ec06b2a15 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.027 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] VM Started (Lifecycle Event)#033[00m
Jan 31 03:47:23 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:47:23 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d523dc433e6532bd0fba67fdc4d0394dc72f04077c07ea639f25ab44b30e17a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:47:23 np0005603622 podman[285075]: 2026-01-31 08:47:23.11665924 +0000 UTC m=+0.362027322 container init d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.118 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:23 np0005603622 podman[285075]: 2026-01-31 08:47:23.122602618 +0000 UTC m=+0.367970690 container start d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.122 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849243.0254314, 6fd9c6e1-ddff-436d-b240-240ec06b2a15 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.123 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:47:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:23.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.133 221324 DEBUG oslo_concurrency.processutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.134 221324 INFO nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Deleting local config drive /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config because it was imported into RBD.#033[00m
Jan 31 03:47:23 np0005603622 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[285135]: [NOTICE]   (285139) : New worker (285141) forked
Jan 31 03:47:23 np0005603622 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[285135]: [NOTICE]   (285139) : Loading success.
Jan 31 03:47:23 np0005603622 kernel: tap6e3ffb68-e7: entered promiscuous mode
Jan 31 03:47:23 np0005603622 systemd-udevd[284986]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:47:23 np0005603622 NetworkManager[49080]: <info>  [1769849243.1784] manager: (tap6e3ffb68-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Jan 31 03:47:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:23Z|00739|binding|INFO|Claiming lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for this chassis.
Jan 31 03:47:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:23Z|00740|binding|INFO|6e3ffb68-e766-4edd-8e73-e44bb1e58fd4: Claiming fa:16:3e:b0:16:3c 10.100.0.13
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.178 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:23Z|00741|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 ovn-installed in OVS
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.189 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:23 np0005603622 NetworkManager[49080]: <info>  [1769849243.1913] device (tap6e3ffb68-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:47:23 np0005603622 NetworkManager[49080]: <info>  [1769849243.1921] device (tap6e3ffb68-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:47:23 np0005603622 systemd-machined[190406]: New machine qemu-76-instance-000000a1.
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.214 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.218 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:47:23 np0005603622 systemd[1]: Started Virtual Machine qemu-76-instance-000000a1.
Jan 31 03:47:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:23Z|00742|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 up in Southbound
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.234 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:3c 10.100.0.13'], port_security=['fa:16:3e:b0:16:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.245 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc bound to our chassis#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.246 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.254 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f08fb99b-76cf-4d51-a272-f272453d2418]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.254 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2128154c-01 in ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.256 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2128154c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.256 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eec998e8-089e-4e82-9eab-ecd9069a5d28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.257 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[80d7d457-18ac-4e38-92f6-fd70352bfa2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.268 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[036803c1-d3c8-45a9-b400-a4016b73f52d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.278 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b12c9707-2a4f-473c-8d8e-671101a104f6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.298 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[af2d0e8c-6eff-42da-8542-75615e67b21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.306 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[860c1ea1-8b5c-4641-a648-1a0de4904201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 NetworkManager[49080]: <info>  [1769849243.3071] manager: (tap2128154c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.333 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0a3c74c3-add2-4848-b46e-c5bfb3d50a45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.336 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.337 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c33c3b-83c8-4781-b3ed-6447930cefd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 NetworkManager[49080]: <info>  [1769849243.3601] device (tap2128154c-00): carrier: link connected
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.363 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[60881392-3c22-4f1c-838a-ffb4ad0ee2a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.367 221324 DEBUG nova.compute.manager [req-8ea710cf-4e30-431d-8bb1-2ecba1a68e8c req-1d260bb4-7de2-4fc5-828d-6be7cfe61a72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.367 221324 DEBUG oslo_concurrency.lockutils [req-8ea710cf-4e30-431d-8bb1-2ecba1a68e8c req-1d260bb4-7de2-4fc5-828d-6be7cfe61a72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.368 221324 DEBUG oslo_concurrency.lockutils [req-8ea710cf-4e30-431d-8bb1-2ecba1a68e8c req-1d260bb4-7de2-4fc5-828d-6be7cfe61a72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.368 221324 DEBUG oslo_concurrency.lockutils [req-8ea710cf-4e30-431d-8bb1-2ecba1a68e8c req-1d260bb4-7de2-4fc5-828d-6be7cfe61a72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.368 221324 DEBUG nova.compute.manager [req-8ea710cf-4e30-431d-8bb1-2ecba1a68e8c req-1d260bb4-7de2-4fc5-828d-6be7cfe61a72 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Processing event network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.369 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.373 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849243.373176, 6fd9c6e1-ddff-436d-b240-240ec06b2a15 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.373 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.375 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.379 221324 INFO nova.virt.libvirt.driver [-] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Instance spawned successfully.#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.379 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.380 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[28ae7ef4-4a16-43a4-ab21-b26c9358e99c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849470, 'reachable_time': 32871, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285182, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.423 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea83a40-0db9-4aee-abdb-b5dbf764dc9d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:3208'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 849470, 'tstamp': 849470}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285183, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.441 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[553e0c29-de11-4f98-bd42-b7cbd1642781]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849470, 'reachable_time': 32871, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285184, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.464 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a338fa-a17c-4f67-9199-8d769e9524c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:23.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.510 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4c41bf0c-bfc8-4bbb-befe-97f83db87e58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.512 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.512 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.513 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:23 np0005603622 NetworkManager[49080]: <info>  [1769849243.5160] manager: (tap2128154c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 31 03:47:23 np0005603622 kernel: tap2128154c-00: entered promiscuous mode
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.516 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.520 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:23Z|00743|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.523 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.523 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.523 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.524 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.524 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b6742469-f4ab-4cc2-a20b-082b1465f583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.524 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:47:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:23.525 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'env', 'PROCESS_TAG=haproxy-2128154c-0218-4f66-9509-e0db66eba3fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2128154c-0218-4f66-9509-e0db66eba3fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.524 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.526 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.526 221324 DEBUG nova.virt.libvirt.driver [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.530 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.532 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.535 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.620 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:47:23 np0005603622 podman[285216]: 2026-01-31 08:47:23.829884732 +0000 UTC m=+0.035929530 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.981 221324 DEBUG nova.network.neutron [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updated VIF entry in instance network info cache for port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:47:23 np0005603622 nova_compute[221301]: 2026-01-31 08:47:23.981 221324 DEBUG nova.network.neutron [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updating instance_info_cache with network_info: [{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:23 np0005603622 podman[285216]: 2026-01-31 08:47:23.991571657 +0000 UTC m=+0.197616475 container create 52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.003 221324 INFO nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Took 15.77 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.004 221324 DEBUG nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.008 221324 DEBUG oslo_concurrency.lockutils [req-f223a2ba-8fae-4750-a64b-fe308df50cbc req-0d4a0890-19e9-4269-98eb-e237a25bdc33 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:47:24 np0005603622 systemd[1]: Started libpod-conmon-52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640.scope.
Jan 31 03:47:24 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:47:24 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7a11aa264920b66fde7f19e8a0e25740124107ec257617e2b021531512a30a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:47:24 np0005603622 podman[285216]: 2026-01-31 08:47:24.182184483 +0000 UTC m=+0.388229311 container init 52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:47:24 np0005603622 podman[285216]: 2026-01-31 08:47:24.189755485 +0000 UTC m=+0.395800283 container start 52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:47:24 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [NOTICE]   (285261) : New worker (285272) forked
Jan 31 03:47:24 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [NOTICE]   (285261) : Loading success.
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.356 221324 INFO nova.compute.manager [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Took 18.31 seconds to build instance.#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.377 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.419 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849244.4183307, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.419 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Started (Lifecycle Event)#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.499 221324 DEBUG oslo_concurrency.lockutils [None req-f0d51054-e23d-4e4d-8628-d27c0e3b7e67 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.507 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.511 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849244.4194798, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.512 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:47:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:24Z|00744|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:47:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:24Z|00745|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.563 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.568 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:47:24 np0005603622 nova_compute[221301]: 2026-01-31 08:47:24.609 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:47:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:25.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:25.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.566 221324 DEBUG nova.compute.manager [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.567 221324 DEBUG oslo_concurrency.lockutils [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.567 221324 DEBUG oslo_concurrency.lockutils [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.567 221324 DEBUG oslo_concurrency.lockutils [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.567 221324 DEBUG nova.compute.manager [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] No waiting events found dispatching network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.568 221324 WARNING nova.compute.manager [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received unexpected event network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.568 221324 DEBUG nova.compute.manager [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.568 221324 DEBUG oslo_concurrency.lockutils [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.568 221324 DEBUG oslo_concurrency.lockutils [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.569 221324 DEBUG oslo_concurrency.lockutils [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.569 221324 DEBUG nova.compute.manager [req-6c886c92-5d3d-4e6d-bbd6-d794c70f173e req-847261dd-ebaa-4215-a68e-c37175f34b58 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Processing event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.570 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.574 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849245.573442, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.574 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.576 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.580 221324 INFO nova.virt.libvirt.driver [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance spawned successfully.#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.580 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.606 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.610 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.619 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.619 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.620 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.620 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.621 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.621 221324 DEBUG nova.virt.libvirt.driver [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.659 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.735 221324 INFO nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Took 16.50 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.735 221324 DEBUG nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.866 221324 INFO nova.compute.manager [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Took 18.48 seconds to build instance.#033[00m
Jan 31 03:47:25 np0005603622 nova_compute[221301]: 2026-01-31 08:47:25.951 221324 DEBUG oslo_concurrency.lockutils [None req-f49fc6e2-c744-40b5-a391-7e20ddc011d0 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:26 np0005603622 nova_compute[221301]: 2026-01-31 08:47:26.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:47:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:27.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:47:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:27.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:27 np0005603622 nova_compute[221301]: 2026-01-31 08:47:27.875 221324 DEBUG nova.compute.manager [req-9c4c7c4d-cefd-464c-883f-bb30ec2f7c31 req-b27c7205-e321-4982-9d24-cb94b1768e35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:27 np0005603622 nova_compute[221301]: 2026-01-31 08:47:27.876 221324 DEBUG oslo_concurrency.lockutils [req-9c4c7c4d-cefd-464c-883f-bb30ec2f7c31 req-b27c7205-e321-4982-9d24-cb94b1768e35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:27 np0005603622 nova_compute[221301]: 2026-01-31 08:47:27.876 221324 DEBUG oslo_concurrency.lockutils [req-9c4c7c4d-cefd-464c-883f-bb30ec2f7c31 req-b27c7205-e321-4982-9d24-cb94b1768e35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:27 np0005603622 nova_compute[221301]: 2026-01-31 08:47:27.877 221324 DEBUG oslo_concurrency.lockutils [req-9c4c7c4d-cefd-464c-883f-bb30ec2f7c31 req-b27c7205-e321-4982-9d24-cb94b1768e35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:27 np0005603622 nova_compute[221301]: 2026-01-31 08:47:27.877 221324 DEBUG nova.compute.manager [req-9c4c7c4d-cefd-464c-883f-bb30ec2f7c31 req-b27c7205-e321-4982-9d24-cb94b1768e35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:47:27 np0005603622 nova_compute[221301]: 2026-01-31 08:47:27.877 221324 WARNING nova.compute.manager [req-9c4c7c4d-cefd-464c-883f-bb30ec2f7c31 req-b27c7205-e321-4982-9d24-cb94b1768e35 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:47:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:47:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:29.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:47:29 np0005603622 nova_compute[221301]: 2026-01-31 08:47:29.379 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:29.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:30.904 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:30.905 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:30.906 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:31 np0005603622 podman[285290]: 2026-01-31 08:47:31.139405913 +0000 UTC m=+0.052421850 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:47:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:31.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:31 np0005603622 podman[285289]: 2026-01-31 08:47:31.161987255 +0000 UTC m=+0.075264499 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:47:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:31.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:31 np0005603622 nova_compute[221301]: 2026-01-31 08:47:31.631 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:33.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:33 np0005603622 nova_compute[221301]: 2026-01-31 08:47:33.332 221324 DEBUG nova.compute.manager [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-changed-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:47:33 np0005603622 nova_compute[221301]: 2026-01-31 08:47:33.332 221324 DEBUG nova.compute.manager [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Refreshing instance network info cache due to event network-changed-8caf8cff-a513-4677-929d-ade94fee3082. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:47:33 np0005603622 nova_compute[221301]: 2026-01-31 08:47:33.333 221324 DEBUG oslo_concurrency.lockutils [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:47:33 np0005603622 nova_compute[221301]: 2026-01-31 08:47:33.333 221324 DEBUG oslo_concurrency.lockutils [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:47:33 np0005603622 nova_compute[221301]: 2026-01-31 08:47:33.333 221324 DEBUG nova.network.neutron [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Refreshing network info cache for port 8caf8cff-a513-4677-929d-ade94fee3082 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:47:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:33.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:34 np0005603622 nova_compute[221301]: 2026-01-31 08:47:34.383 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:35.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:35.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:36.149 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:47:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:36.150 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:47:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:47:36.152 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:47:36 np0005603622 nova_compute[221301]: 2026-01-31 08:47:36.180 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:36 np0005603622 nova_compute[221301]: 2026-01-31 08:47:36.633 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:37.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:37Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:64:d6:1f 10.100.0.10
Jan 31 03:47:37 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:37Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:64:d6:1f 10.100.0.10
Jan 31 03:47:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:37.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:37 np0005603622 nova_compute[221301]: 2026-01-31 08:47:37.559 221324 DEBUG nova.network.neutron [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updated VIF entry in instance network info cache for port 8caf8cff-a513-4677-929d-ade94fee3082. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:47:37 np0005603622 nova_compute[221301]: 2026-01-31 08:47:37.560 221324 DEBUG nova.network.neutron [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating instance_info_cache with network_info: [{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:47:37 np0005603622 nova_compute[221301]: 2026-01-31 08:47:37.850 221324 DEBUG oslo_concurrency.lockutils [req-89956c0c-2492-432d-8f0b-b3eb7a7744dc req-78496206-c49c-4128-a81b-ce69bbb04f17 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:47:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:39.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:39 np0005603622 nova_compute[221301]: 2026-01-31 08:47:39.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:39Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:16:3c 10.100.0.13
Jan 31 03:47:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:47:39Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:16:3c 10.100.0.13
Jan 31 03:47:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:39.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:41.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:41 np0005603622 nova_compute[221301]: 2026-01-31 08:47:41.358 221324 DEBUG oslo_concurrency.lockutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:41 np0005603622 nova_compute[221301]: 2026-01-31 08:47:41.358 221324 DEBUG oslo_concurrency.lockutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:41 np0005603622 nova_compute[221301]: 2026-01-31 08:47:41.430 221324 DEBUG nova.objects.instance [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:41.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:41 np0005603622 nova_compute[221301]: 2026-01-31 08:47:41.637 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:41 np0005603622 nova_compute[221301]: 2026-01-31 08:47:41.660 221324 DEBUG oslo_concurrency.lockutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.309 221324 DEBUG oslo_concurrency.lockutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.310 221324 DEBUG oslo_concurrency.lockutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.310 221324 INFO nova.compute.manager [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Attaching volume dad97247-8d79-4c56-b9b0-e61729262e21 to /dev/vdb#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.601 221324 DEBUG os_brick.utils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.604 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.615 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.616 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[85aced31-4b23-4fbf-9e0c-4ad276ddfed0]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.617 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.622 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.623 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[aff40e61-8052-4ce9-9ffe-19a0578915aa]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.624 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.635 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.636 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[c94de38a-d9bf-4691-877b-6195a9eb7800]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.637 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[28cec348-3f08-4337-81cf-28e477efbac4]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.638 221324 DEBUG oslo_concurrency.processutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.658 221324 DEBUG oslo_concurrency.processutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "nvme version" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.660 221324 DEBUG os_brick.initiator.connectors.lightos [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.661 221324 DEBUG os_brick.initiator.connectors.lightos [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.661 221324 DEBUG os_brick.initiator.connectors.lightos [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.661 221324 DEBUG os_brick.utils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] <== get_connector_properties: return (59ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:47:42 np0005603622 nova_compute[221301]: 2026-01-31 08:47:42.662 221324 DEBUG nova.virt.block_device [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating existing volume attachment record: fce70b36-5636-4d96-85e4-6ab4337847a9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:47:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:43.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:43.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:44 np0005603622 nova_compute[221301]: 2026-01-31 08:47:44.424 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:44 np0005603622 nova_compute[221301]: 2026-01-31 08:47:44.481 221324 DEBUG nova.objects.instance [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:47:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:44 np0005603622 nova_compute[221301]: 2026-01-31 08:47:44.752 221324 DEBUG nova.virt.libvirt.driver [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Attempting to attach volume dad97247-8d79-4c56-b9b0-e61729262e21 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:47:44 np0005603622 nova_compute[221301]: 2026-01-31 08:47:44.755 221324 DEBUG nova.virt.libvirt.guest [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-dad97247-8d79-4c56-b9b0-e61729262e21">
Jan 31 03:47:44 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:47:44 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  <serial>dad97247-8d79-4c56-b9b0-e61729262e21</serial>
Jan 31 03:47:44 np0005603622 nova_compute[221301]:  <shareable/>
Jan 31 03:47:44 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:47:44 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:47:44 np0005603622 nova_compute[221301]: 2026-01-31 08:47:44.959 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:45 np0005603622 nova_compute[221301]: 2026-01-31 08:47:45.040 221324 DEBUG nova.virt.libvirt.driver [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:45 np0005603622 nova_compute[221301]: 2026-01-31 08:47:45.041 221324 DEBUG nova.virt.libvirt.driver [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:45 np0005603622 nova_compute[221301]: 2026-01-31 08:47:45.041 221324 DEBUG nova.virt.libvirt.driver [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:47:45 np0005603622 nova_compute[221301]: 2026-01-31 08:47:45.041 221324 DEBUG nova.virt.libvirt.driver [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:64:d6:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:47:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:45.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:47:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:45.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:47:46 np0005603622 nova_compute[221301]: 2026-01-31 08:47:46.160 221324 DEBUG oslo_concurrency.lockutils [None req-606e41db-83f8-4e14-996c-342cb16895cd a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:47:46 np0005603622 nova_compute[221301]: 2026-01-31 08:47:46.640 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:47.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:47.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:49.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:49 np0005603622 nova_compute[221301]: 2026-01-31 08:47:49.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:49.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:51.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:47:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:51.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:47:51 np0005603622 nova_compute[221301]: 2026-01-31 08:47:51.643 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:47:51 np0005603622 nova_compute[221301]: 2026-01-31 08:47:51.816 221324 DEBUG oslo_concurrency.lockutils [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:47:51 np0005603622 nova_compute[221301]: 2026-01-31 08:47:51.817 221324 DEBUG oslo_concurrency.lockutils [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.016 221324 INFO nova.compute.manager [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Detaching volume dad97247-8d79-4c56-b9b0-e61729262e21#033[00m
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.245 221324 INFO nova.virt.block_device [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Attempting to driver detach volume dad97247-8d79-4c56-b9b0-e61729262e21 from mountpoint /dev/vdb#033[00m
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.255 221324 DEBUG nova.virt.libvirt.driver [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Attempting to detach device vdb from instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.256 221324 DEBUG nova.virt.libvirt.guest [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-dad97247-8d79-4c56-b9b0-e61729262e21">
Jan 31 03:47:52 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <serial>dad97247-8d79-4c56-b9b0-e61729262e21</serial>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <shareable/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:47:52 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.279 221324 INFO nova.virt.libvirt.driver [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully detached device vdb from instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 from the persistent domain config.
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.279 221324 DEBUG nova.virt.libvirt.driver [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.280 221324 DEBUG nova.virt.libvirt.guest [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-dad97247-8d79-4c56-b9b0-e61729262e21">
Jan 31 03:47:52 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <serial>dad97247-8d79-4c56-b9b0-e61729262e21</serial>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <shareable/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:47:52 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:47:52 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.385 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769849272.3846886, 6fd9c6e1-ddff-436d-b240-240ec06b2a15 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.386 221324 DEBUG nova.virt.libvirt.driver [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 31 03:47:52 np0005603622 nova_compute[221301]: 2026-01-31 08:47:52.388 221324 INFO nova.virt.libvirt.driver [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully detached device vdb from instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 from the live domain config.
Jan 31 03:47:53 np0005603622 nova_compute[221301]: 2026-01-31 08:47:53.118 221324 DEBUG nova.objects.instance [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:47:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:53.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:53 np0005603622 nova_compute[221301]: 2026-01-31 08:47:53.440 221324 DEBUG oslo_concurrency.lockutils [None req-dd8dd26a-4437-40c8-9537-fb2ef6774f72 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:47:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:53.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:54 np0005603622 nova_compute[221301]: 2026-01-31 08:47:54.427 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:54 np0005603622 nova_compute[221301]: 2026-01-31 08:47:54.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:55.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:55.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:56 np0005603622 nova_compute[221301]: 2026-01-31 08:47:56.645 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.748897) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276748947, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 1210, "num_deletes": 255, "total_data_size": 2438998, "memory_usage": 2473280, "flush_reason": "Manual Compaction"}
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276811151, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 1607580, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 69301, "largest_seqno": 70505, "table_properties": {"data_size": 1602497, "index_size": 2542, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11355, "raw_average_key_size": 19, "raw_value_size": 1592067, "raw_average_value_size": 2730, "num_data_blocks": 112, "num_entries": 583, "num_filter_entries": 583, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849182, "oldest_key_time": 1769849182, "file_creation_time": 1769849276, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 62301 microseconds, and 4776 cpu microseconds.
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.811196) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 1607580 bytes OK
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.811233) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.915005) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.915086) EVENT_LOG_v1 {"time_micros": 1769849276915070, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.915129) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 2433237, prev total WAL file size 2479383, number of live WAL files 2.
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.916505) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353133' seq:72057594037927935, type:22 .. '6C6F676D0032373634' seq:0, type:0; will stop at (end)
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(1569KB)], [141(9527KB)]
Jan 31 03:47:56 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849276916575, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 11363990, "oldest_snapshot_seqno": -1}
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 8969 keys, 11230964 bytes, temperature: kUnknown
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849277145689, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 11230964, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11173838, "index_size": 33589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22469, "raw_key_size": 237110, "raw_average_key_size": 26, "raw_value_size": 11017476, "raw_average_value_size": 1228, "num_data_blocks": 1280, "num_entries": 8969, "num_filter_entries": 8969, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849276, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:47:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:57.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.145977) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 11230964 bytes
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.238569) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 49.6 rd, 49.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 9.3 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(14.1) write-amplify(7.0) OK, records in: 9492, records dropped: 523 output_compression: NoCompression
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.238615) EVENT_LOG_v1 {"time_micros": 1769849277238597, "job": 90, "event": "compaction_finished", "compaction_time_micros": 229195, "compaction_time_cpu_micros": 31489, "output_level": 6, "num_output_files": 1, "total_output_size": 11230964, "num_input_records": 9492, "num_output_records": 8969, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849277239142, "job": 90, "event": "table_file_deletion", "file_number": 143}
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849277240890, "job": 90, "event": "table_file_deletion", "file_number": 141}
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:56.916378) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.241013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.241018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.241019) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.241021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:57 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:47:57.241022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:47:57 np0005603622 nova_compute[221301]: 2026-01-31 08:47:57.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:47:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:57.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:58 np0005603622 nova_compute[221301]: 2026-01-31 08:47:58.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:47:58 np0005603622 nova_compute[221301]: 2026-01-31 08:47:58.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:47:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:47:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:47:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:47:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:47:59.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:47:59 np0005603622 nova_compute[221301]: 2026-01-31 08:47:59.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:47:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:47:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:47:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:47:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:47:59.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:01.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:01.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.602 221324 INFO nova.compute.manager [None req-68cfbc79-4bcb-4c86-b560-386b7236a7cf aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Pausing
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.603 221324 DEBUG nova.objects.instance [None req-68cfbc79-4bcb-4c86-b560-386b7236a7cf aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'flavor' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.651 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.888 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849281.8886364, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.889 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Paused (Lifecycle Event)
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.890 221324 DEBUG nova.compute.manager [None req-68cfbc79-4bcb-4c86-b560-386b7236a7cf aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.963 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:48:01 np0005603622 nova_compute[221301]: 2026-01-31 08:48:01.968 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:48:02 np0005603622 podman[285497]: 2026-01-31 08:48:02.124344895 +0000 UTC m=+0.051282759 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 03:48:02 np0005603622 podman[285496]: 2026-01-31 08:48:02.147089642 +0000 UTC m=+0.074214501 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:48:02 np0005603622 nova_compute[221301]: 2026-01-31 08:48:02.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:48:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:03.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:03 np0005603622 nova_compute[221301]: 2026-01-31 08:48:03.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:48:03 np0005603622 nova_compute[221301]: 2026-01-31 08:48:03.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:48:03 np0005603622 nova_compute[221301]: 2026-01-31 08:48:03.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 03:48:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:03.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:04 np0005603622 nova_compute[221301]: 2026-01-31 08:48:04.431 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:04 np0005603622 nova_compute[221301]: 2026-01-31 08:48:04.527 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:48:04 np0005603622 nova_compute[221301]: 2026-01-31 08:48:04.527 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:48:04 np0005603622 nova_compute[221301]: 2026-01-31 08:48:04.527 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 31 03:48:04 np0005603622 nova_compute[221301]: 2026-01-31 08:48:04.527 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:48:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:05.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:48:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.380 221324 INFO nova.compute.manager [None req-858e5ea8-259c-4d6a-b55b-aa1c4c9144f8 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Unpausing#033[00m
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.381 221324 DEBUG nova.objects.instance [None req-858e5ea8-259c-4d6a-b55b-aa1c4c9144f8 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'flavor' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.447 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849286.4469306, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.447 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:48:06 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.453 221324 DEBUG nova.virt.libvirt.guest [None req-858e5ea8-259c-4d6a-b55b-aa1c4c9144f8 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.454 221324 DEBUG nova.compute.manager [None req-858e5ea8-259c-4d6a-b55b-aa1c4c9144f8 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.562 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.567 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:48:06 np0005603622 nova_compute[221301]: 2026-01-31 08:48:06.653 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:07.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:07.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:09.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:09 np0005603622 nova_compute[221301]: 2026-01-31 08:48:09.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:09.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.753 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating instance_info_cache with network_info: [{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.772 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.772 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.773 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.773 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.773 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.774 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.811 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.812 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.812 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.812 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:48:10 np0005603622 nova_compute[221301]: 2026-01-31 08:48:10.813 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:11.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:48:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1564996416' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.249 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.379 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.379 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.382 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.382 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.546 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.547 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3932MB free_disk=20.718719482421875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.547 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.547 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:11.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.656 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.974 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.974 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.975 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:48:11 np0005603622 nova_compute[221301]: 2026-01-31 08:48:11.975 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:48:12 np0005603622 nova_compute[221301]: 2026-01-31 08:48:12.124 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:48:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1646953215' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:48:12 np0005603622 nova_compute[221301]: 2026-01-31 08:48:12.572 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:12 np0005603622 nova_compute[221301]: 2026-01-31 08:48:12.578 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:48:12 np0005603622 nova_compute[221301]: 2026-01-31 08:48:12.608 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:48:12 np0005603622 nova_compute[221301]: 2026-01-31 08:48:12.668 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:48:12 np0005603622 nova_compute[221301]: 2026-01-31 08:48:12.669 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:13.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:13 np0005603622 nova_compute[221301]: 2026-01-31 08:48:13.291 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:48:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:13.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:14 np0005603622 nova_compute[221301]: 2026-01-31 08:48:14.434 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:15.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:15.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:15 np0005603622 nova_compute[221301]: 2026-01-31 08:48:15.577 221324 INFO nova.compute.manager [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Rescuing#033[00m
Jan 31 03:48:15 np0005603622 nova_compute[221301]: 2026-01-31 08:48:15.578 221324 DEBUG oslo_concurrency.lockutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:48:15 np0005603622 nova_compute[221301]: 2026-01-31 08:48:15.579 221324 DEBUG oslo_concurrency.lockutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:48:15 np0005603622 nova_compute[221301]: 2026-01-31 08:48:15.580 221324 DEBUG nova.network.neutron [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:48:16 np0005603622 nova_compute[221301]: 2026-01-31 08:48:16.658 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:16.988 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:16.990 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:48:16 np0005603622 nova_compute[221301]: 2026-01-31 08:48:16.991 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:17.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:17.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:19 np0005603622 nova_compute[221301]: 2026-01-31 08:48:19.132 221324 DEBUG nova.network.neutron [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updating instance_info_cache with network_info: [{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:48:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:19.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:19 np0005603622 nova_compute[221301]: 2026-01-31 08:48:19.437 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:19.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:20 np0005603622 nova_compute[221301]: 2026-01-31 08:48:20.294 221324 DEBUG oslo_concurrency.lockutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:48:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:21.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:21.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:21 np0005603622 nova_compute[221301]: 2026-01-31 08:48:21.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:21 np0005603622 nova_compute[221301]: 2026-01-31 08:48:21.753 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:48:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:23.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:23.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:24 np0005603622 nova_compute[221301]: 2026-01-31 08:48:24.438 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:24 np0005603622 nova_compute[221301]: 2026-01-31 08:48:24.774 221324 INFO nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:48:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:24 np0005603622 kernel: tap6e3ffb68-e7 (unregistering): left promiscuous mode
Jan 31 03:48:24 np0005603622 NetworkManager[49080]: <info>  [1769849304.9143] device (tap6e3ffb68-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:48:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:24Z|00746|binding|INFO|Releasing lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 from this chassis (sb_readonly=0)
Jan 31 03:48:24 np0005603622 nova_compute[221301]: 2026-01-31 08:48:24.922 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:24Z|00747|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 down in Southbound
Jan 31 03:48:24 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:24Z|00748|binding|INFO|Removing iface tap6e3ffb68-e7 ovn-installed in OVS
Jan 31 03:48:24 np0005603622 nova_compute[221301]: 2026-01-31 08:48:24.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:24 np0005603622 nova_compute[221301]: 2026-01-31 08:48:24.931 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:24.961 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:3c 10.100.0.13'], port_security=['fa:16:3e:b0:16:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:24.963 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis#033[00m
Jan 31 03:48:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:24.965 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2128154c-0218-4f66-9509-e0db66eba3fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:48:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:24.967 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2e5820-aa1f-4e83-bdc7-d94b4cd26840]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:24.968 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace which is not needed anymore#033[00m
Jan 31 03:48:24 np0005603622 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Jan 31 03:48:24 np0005603622 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a1.scope: Consumed 15.845s CPU time.
Jan 31 03:48:24 np0005603622 systemd-machined[190406]: Machine qemu-76-instance-000000a1 terminated.
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.029 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.041 221324 INFO nova.virt.libvirt.driver [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance destroyed successfully.#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.042 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'numa_topology' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.067 221324 INFO nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Attempting rescue#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.069 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.075 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.076 221324 INFO nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Creating image(s)#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.114 221324 DEBUG nova.storage.rbd_utils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.120 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.187 221324 DEBUG nova.storage.rbd_utils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.217 221324 DEBUG nova.storage.rbd_utils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:25.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.223 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:25 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [NOTICE]   (285261) : haproxy version is 2.8.14-c23fe91
Jan 31 03:48:25 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [NOTICE]   (285261) : path to executable is /usr/sbin/haproxy
Jan 31 03:48:25 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [WARNING]  (285261) : Exiting Master process...
Jan 31 03:48:25 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [WARNING]  (285261) : Exiting Master process...
Jan 31 03:48:25 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [ALERT]    (285261) : Current worker (285272) exited with code 143 (Terminated)
Jan 31 03:48:25 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[285232]: [WARNING]  (285261) : All workers exited. Exiting... (0)
Jan 31 03:48:25 np0005603622 systemd[1]: libpod-52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640.scope: Deactivated successfully.
Jan 31 03:48:25 np0005603622 podman[285667]: 2026-01-31 08:48:25.280179735 +0000 UTC m=+0.203635385 container died 52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.288 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.289 221324 DEBUG oslo_concurrency.lockutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.290 221324 DEBUG oslo_concurrency.lockutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.291 221324 DEBUG oslo_concurrency.lockutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.325 221324 DEBUG nova.storage.rbd_utils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.330 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:25 np0005603622 systemd[1]: var-lib-containers-storage-overlay-b7a11aa264920b66fde7f19e8a0e25740124107ec257617e2b021531512a30a5-merged.mount: Deactivated successfully.
Jan 31 03:48:25 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640-userdata-shm.mount: Deactivated successfully.
Jan 31 03:48:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:25.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:25 np0005603622 podman[285667]: 2026-01-31 08:48:25.775122161 +0000 UTC m=+0.698577811 container cleanup 52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:48:25 np0005603622 systemd[1]: libpod-conmon-52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640.scope: Deactivated successfully.
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.836 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.837 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'migration_context' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.931 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.932 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Start _get_guest_xml network_info=[{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:b0:16:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.932 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'resources' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.984 221324 WARNING nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.991 221324 DEBUG nova.virt.libvirt.host [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.991 221324 DEBUG nova.virt.libvirt.host [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.996 221324 DEBUG nova.virt.libvirt.host [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.997 221324 DEBUG nova.virt.libvirt.host [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.998 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.998 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.999 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.999 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:48:25 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.999 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:25.999 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.000 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.000 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.000 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.000 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.000 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.000 221324 DEBUG nova.virt.hardware [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.001 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.040 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:26 np0005603622 podman[285792]: 2026-01-31 08:48:26.221271197 +0000 UTC m=+0.427765016 container remove 52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.226 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3633dacf-db87-4abb-a57b-f64aa93eaf95]: (4, ('Sat Jan 31 08:48:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640)\n52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640\nSat Jan 31 08:48:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640)\n52e247d6d17f9738cf8c021ecee00445168a841256049aae234a6afe9299c640\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.228 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4565c9-9a8e-4d96-b5ad-3afa50e9a641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.229 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.275 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:26 np0005603622 kernel: tap2128154c-00: left promiscuous mode
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.288 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.292 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9a8854-ebbf-4c08-9654-e0ecc5638eac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.314 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca7f4339-1344-4496-8015-442f841de3c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.316 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[04c53ed8-4247-4da3-aca8-3208f14e3ba4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.329 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[48f0b8e3-8630-4930-9137-06453ef88f34]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849463, 'reachable_time': 28178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285833, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:26 np0005603622 systemd[1]: run-netns-ovnmeta\x2d2128154c\x2d0218\x2d4f66\x2d9509\x2de0db66eba3fc.mount: Deactivated successfully.
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.334 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.335 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[86ee1be1-4d62-4749-a0c2-7d4a3eda2331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:48:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/631875737' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.531 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.532 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:26 np0005603622 nova_compute[221301]: 2026-01-31 08:48:26.665 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:26.992 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:48:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/125524095' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.161 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.629s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.163 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:27.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:27.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:48:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3899319236' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.628 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.629 221324 DEBUG nova.virt.libvirt.vif [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1544301339',display_name='tempest-ServerRescueNegativeTestJSON-server-1544301339',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1544301339',id=161,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:47:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-h0qz026s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:48:06Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:b0:16:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.630 221324 DEBUG nova.network.os_vif_util [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:b0:16:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.631 221324 DEBUG nova.network.os_vif_util [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.632 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'pci_devices' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.691 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <uuid>4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e</uuid>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <name>instance-000000a1</name>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1544301339</nova:name>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:48:25</nova:creationTime>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:user uuid="aa7f893021af4a84b03d85b476dadfe0">tempest-ServerRescueNegativeTestJSON-81297706-project-member</nova:user>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:project uuid="bf1c3d387dbe4191b4d05bdfca5959da">tempest-ServerRescueNegativeTestJSON-81297706</nova:project>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <nova:port uuid="6e3ffb68-e766-4edd-8e73-e44bb1e58fd4">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <entry name="serial">4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e</entry>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <entry name="uuid">4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e</entry>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.rescue">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config.rescue">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:b0:16:3c"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <target dev="tap6e3ffb68-e7"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/console.log" append="off"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:48:27 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:48:27 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:48:27 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:48:27 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.698 221324 INFO nova.virt.libvirt.driver [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance destroyed successfully.#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.945 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.945 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.946 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.946 221324 DEBUG nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No VIF found with MAC fa:16:3e:b0:16:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.946 221324 INFO nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Using config drive#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.979 221324 DEBUG nova.storage.rbd_utils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.988 221324 DEBUG nova.compute.manager [req-d073966c-d7d8-4f31-92a8-cc3ecc027fed req-a869125f-8398-4d46-a9fc-e8bec9674ccd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.989 221324 DEBUG oslo_concurrency.lockutils [req-d073966c-d7d8-4f31-92a8-cc3ecc027fed req-a869125f-8398-4d46-a9fc-e8bec9674ccd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.989 221324 DEBUG oslo_concurrency.lockutils [req-d073966c-d7d8-4f31-92a8-cc3ecc027fed req-a869125f-8398-4d46-a9fc-e8bec9674ccd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.990 221324 DEBUG oslo_concurrency.lockutils [req-d073966c-d7d8-4f31-92a8-cc3ecc027fed req-a869125f-8398-4d46-a9fc-e8bec9674ccd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.990 221324 DEBUG nova.compute.manager [req-d073966c-d7d8-4f31-92a8-cc3ecc027fed req-a869125f-8398-4d46-a9fc-e8bec9674ccd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:48:27 np0005603622 nova_compute[221301]: 2026-01-31 08:48:27.990 221324 WARNING nova.compute.manager [req-d073966c-d7d8-4f31-92a8-cc3ecc027fed req-a869125f-8398-4d46-a9fc-e8bec9674ccd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:48:28 np0005603622 nova_compute[221301]: 2026-01-31 08:48:28.194 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:28 np0005603622 nova_compute[221301]: 2026-01-31 08:48:28.282 221324 DEBUG nova.objects.instance [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'keypairs' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:48:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:29.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:48:29 np0005603622 nova_compute[221301]: 2026-01-31 08:48:29.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:29.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.358 221324 DEBUG nova.compute.manager [req-909e9644-9450-4f64-be98-94057491ab8f req-04367b40-380c-4ff3-b108-5ecebe9aae5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.358 221324 DEBUG oslo_concurrency.lockutils [req-909e9644-9450-4f64-be98-94057491ab8f req-04367b40-380c-4ff3-b108-5ecebe9aae5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.359 221324 DEBUG oslo_concurrency.lockutils [req-909e9644-9450-4f64-be98-94057491ab8f req-04367b40-380c-4ff3-b108-5ecebe9aae5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.359 221324 DEBUG oslo_concurrency.lockutils [req-909e9644-9450-4f64-be98-94057491ab8f req-04367b40-380c-4ff3-b108-5ecebe9aae5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.359 221324 DEBUG nova.compute.manager [req-909e9644-9450-4f64-be98-94057491ab8f req-04367b40-380c-4ff3-b108-5ecebe9aae5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.360 221324 WARNING nova.compute.manager [req-909e9644-9450-4f64-be98-94057491ab8f req-04367b40-380c-4ff3-b108-5ecebe9aae5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.710 221324 INFO nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Creating config drive at /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config.rescue#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.715 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcbgxqe24 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.840 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcbgxqe24" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.877 221324 DEBUG nova.storage.rbd_utils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:30 np0005603622 nova_compute[221301]: 2026-01-31 08:48:30.883 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config.rescue 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:30.905 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:30.906 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:30.907 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:31.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.321 221324 DEBUG oslo_concurrency.processutils [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config.rescue 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.323 221324 INFO nova.virt.libvirt.driver [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Deleting local config drive /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e/disk.config.rescue because it was imported into RBD.#033[00m
Jan 31 03:48:31 np0005603622 kernel: tap6e3ffb68-e7: entered promiscuous mode
Jan 31 03:48:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:31Z|00749|binding|INFO|Claiming lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for this chassis.
Jan 31 03:48:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:31Z|00750|binding|INFO|6e3ffb68-e766-4edd-8e73-e44bb1e58fd4: Claiming fa:16:3e:b0:16:3c 10.100.0.13
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.377 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:31 np0005603622 NetworkManager[49080]: <info>  [1769849311.3804] manager: (tap6e3ffb68-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/335)
Jan 31 03:48:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:31Z|00751|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 ovn-installed in OVS
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.383 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:31 np0005603622 systemd-udevd[285950]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:48:31 np0005603622 systemd-machined[190406]: New machine qemu-77-instance-000000a1.
Jan 31 03:48:31 np0005603622 NetworkManager[49080]: <info>  [1769849311.4156] device (tap6e3ffb68-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:48:31 np0005603622 NetworkManager[49080]: <info>  [1769849311.4166] device (tap6e3ffb68-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:48:31 np0005603622 systemd[1]: Started Virtual Machine qemu-77-instance-000000a1.
Jan 31 03:48:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:31Z|00752|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 up in Southbound
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.473 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:3c 10.100.0.13'], port_security=['fa:16:3e:b0:16:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.475 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc bound to our chassis#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.476 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.486 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5846d4ba-0221-4635-975f-28bab804f764]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.486 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2128154c-01 in ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.489 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2128154c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.489 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7357928c-20dd-421d-bf53-7a001a0a2d0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.490 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[92e8fbcc-b072-41db-9424-894a26129893]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.499 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[b939fd42-08cf-460b-8ac6-92ca6c072654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.513 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5f7eb073-cd04-4cd7-85f2-5ceacf0c18f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.547 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a951db5d-4c9a-432c-99d5-d46bd412877c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 NetworkManager[49080]: <info>  [1769849311.5527] manager: (tap2128154c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/336)
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.553 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[276010c4-7f15-4fda-b515-993fe1258b1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.583 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3eae6f6b-5730-4a63-a466-d9274b1ffce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.587 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[473d2dbc-d5f2-4112-8349-a7761cea13d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:31.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:31 np0005603622 NetworkManager[49080]: <info>  [1769849311.6069] device (tap2128154c-00): carrier: link connected
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.611 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3d8bcf-18cd-4c66-bfae-e363c28c2009]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.624 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8853e70-033d-4ce4-9b3b-e54c8a766fab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856294, 'reachable_time': 40242, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285985, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.635 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a9baf1a6-affb-474a-8f45-1fd7b4825fae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:3208'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 856294, 'tstamp': 856294}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285986, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.646 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[49a0a770-7404-4682-a9a3-96cfd96f32ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856294, 'reachable_time': 40242, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285987, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.666 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[be96ba2a-1f06-4615-928c-6e53c76b54ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.708 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5bcbc739-99ef-400e-9fcf-0ecc7cafb160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.709 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.709 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.710 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.711 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:31 np0005603622 NetworkManager[49080]: <info>  [1769849311.7124] manager: (tap2128154c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Jan 31 03:48:31 np0005603622 kernel: tap2128154c-00: entered promiscuous mode
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.722 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.722 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.723 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:31 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:31Z|00753|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=1)
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.728 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:48:31 np0005603622 nova_compute[221301]: 2026-01-31 08:48:31.728 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.729 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a85a4b49-d7be-483f-ab9e-5765404387eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.732 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:48:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:31.732 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'env', 'PROCESS_TAG=haproxy-2128154c-0218-4f66-9509-e0db66eba3fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2128154c-0218-4f66-9509-e0db66eba3fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:48:32 np0005603622 podman[286034]: 2026-01-31 08:48:32.055451728 +0000 UTC m=+0.017122658 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:48:32 np0005603622 nova_compute[221301]: 2026-01-31 08:48:32.384 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:48:32 np0005603622 nova_compute[221301]: 2026-01-31 08:48:32.385 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849312.3845012, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:48:32 np0005603622 nova_compute[221301]: 2026-01-31 08:48:32.385 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:48:32 np0005603622 nova_compute[221301]: 2026-01-31 08:48:32.389 221324 DEBUG nova.compute.manager [None req-bf3a4368-bc10-49ed-a5ef-f80f3943cc4e aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:32 np0005603622 podman[286034]: 2026-01-31 08:48:32.445964129 +0000 UTC m=+0.407635059 container create 9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:48:32 np0005603622 systemd[1]: Started libpod-conmon-9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686.scope.
Jan 31 03:48:32 np0005603622 podman[286092]: 2026-01-31 08:48:32.601935551 +0000 UTC m=+0.109104823 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:48:32 np0005603622 podman[286091]: 2026-01-31 08:48:32.634975453 +0000 UTC m=+0.142524085 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:48:32 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:48:32 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/474511dd53209fab95b8c3a5242a1da75efc69b56e2b5730161e4ddb3fc4d982/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:48:32 np0005603622 podman[286034]: 2026-01-31 08:48:32.898083442 +0000 UTC m=+0.859754402 container init 9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:48:32 np0005603622 podman[286034]: 2026-01-31 08:48:32.902953973 +0000 UTC m=+0.864624913 container start 9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:48:32 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286130]: [NOTICE]   (286143) : New worker (286145) forked
Jan 31 03:48:32 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286130]: [NOTICE]   (286143) : Loading success.
Jan 31 03:48:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:33.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:33.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:34 np0005603622 nova_compute[221301]: 2026-01-31 08:48:34.472 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.105 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.110 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.171 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.172 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849312.3858428, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.173 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Started (Lifecycle Event)#033[00m
Jan 31 03:48:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:35.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.344 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.348 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:48:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:35.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.785 221324 DEBUG nova.compute.manager [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-changed-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.785 221324 DEBUG nova.compute.manager [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Refreshing instance network info cache due to event network-changed-8caf8cff-a513-4677-929d-ade94fee3082. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.786 221324 DEBUG oslo_concurrency.lockutils [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.786 221324 DEBUG oslo_concurrency.lockutils [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:48:35 np0005603622 nova_compute[221301]: 2026-01-31 08:48:35.786 221324 DEBUG nova.network.neutron [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Refreshing network info cache for port 8caf8cff-a513-4677-929d-ade94fee3082 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:48:36 np0005603622 nova_compute[221301]: 2026-01-31 08:48:36.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:48:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:37.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:48:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:37.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:39.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:39 np0005603622 nova_compute[221301]: 2026-01-31 08:48:39.474 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:39 np0005603622 nova_compute[221301]: 2026-01-31 08:48:39.502 221324 DEBUG nova.compute.manager [req-04770145-8bc7-4876-b7c2-ffbf364b4ad5 req-4a06a964-6677-4316-a50e-41f2215751bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:39 np0005603622 nova_compute[221301]: 2026-01-31 08:48:39.503 221324 DEBUG oslo_concurrency.lockutils [req-04770145-8bc7-4876-b7c2-ffbf364b4ad5 req-4a06a964-6677-4316-a50e-41f2215751bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:39 np0005603622 nova_compute[221301]: 2026-01-31 08:48:39.503 221324 DEBUG oslo_concurrency.lockutils [req-04770145-8bc7-4876-b7c2-ffbf364b4ad5 req-4a06a964-6677-4316-a50e-41f2215751bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:39 np0005603622 nova_compute[221301]: 2026-01-31 08:48:39.503 221324 DEBUG oslo_concurrency.lockutils [req-04770145-8bc7-4876-b7c2-ffbf364b4ad5 req-4a06a964-6677-4316-a50e-41f2215751bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:39 np0005603622 nova_compute[221301]: 2026-01-31 08:48:39.503 221324 DEBUG nova.compute.manager [req-04770145-8bc7-4876-b7c2-ffbf364b4ad5 req-4a06a964-6677-4316-a50e-41f2215751bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:48:39 np0005603622 nova_compute[221301]: 2026-01-31 08:48:39.504 221324 WARNING nova.compute.manager [req-04770145-8bc7-4876-b7c2-ffbf364b4ad5 req-4a06a964-6677-4316-a50e-41f2215751bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state rescued and task_state None.#033[00m
Jan 31 03:48:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:39.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.174 221324 INFO nova.compute.manager [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Unrescuing#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.175 221324 DEBUG oslo_concurrency.lockutils [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.176 221324 DEBUG oslo_concurrency.lockutils [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.176 221324 DEBUG nova.network.neutron [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:48:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:41.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:41.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.674 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.969 221324 DEBUG nova.compute.manager [req-518b903a-7c82-4f0b-beb1-b28b37f44822 req-0fb29ed5-98bc-47cb-a746-36df03471228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.970 221324 DEBUG oslo_concurrency.lockutils [req-518b903a-7c82-4f0b-beb1-b28b37f44822 req-0fb29ed5-98bc-47cb-a746-36df03471228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.970 221324 DEBUG oslo_concurrency.lockutils [req-518b903a-7c82-4f0b-beb1-b28b37f44822 req-0fb29ed5-98bc-47cb-a746-36df03471228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.971 221324 DEBUG oslo_concurrency.lockutils [req-518b903a-7c82-4f0b-beb1-b28b37f44822 req-0fb29ed5-98bc-47cb-a746-36df03471228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.972 221324 DEBUG nova.compute.manager [req-518b903a-7c82-4f0b-beb1-b28b37f44822 req-0fb29ed5-98bc-47cb-a746-36df03471228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:48:41 np0005603622 nova_compute[221301]: 2026-01-31 08:48:41.972 221324 WARNING nova.compute.manager [req-518b903a-7c82-4f0b-beb1-b28b37f44822 req-0fb29ed5-98bc-47cb-a746-36df03471228 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:48:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:42Z|00754|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:48:42 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:42Z|00755|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:48:42 np0005603622 nova_compute[221301]: 2026-01-31 08:48:42.730 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:42 np0005603622 nova_compute[221301]: 2026-01-31 08:48:42.905 221324 DEBUG nova.network.neutron [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updated VIF entry in instance network info cache for port 8caf8cff-a513-4677-929d-ade94fee3082. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:48:42 np0005603622 nova_compute[221301]: 2026-01-31 08:48:42.905 221324 DEBUG nova.network.neutron [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating instance_info_cache with network_info: [{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:48:42 np0005603622 nova_compute[221301]: 2026-01-31 08:48:42.991 221324 DEBUG oslo_concurrency.lockutils [req-1e231505-9305-40a0-9c23-2bc67a2a4133 req-77954982-fb25-475d-aae4-3c7ae4d93394 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:48:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:43.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:44 np0005603622 nova_compute[221301]: 2026-01-31 08:48:44.478 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:44 np0005603622 nova_compute[221301]: 2026-01-31 08:48:44.739 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:44 np0005603622 nova_compute[221301]: 2026-01-31 08:48:44.739 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:44 np0005603622 nova_compute[221301]: 2026-01-31 08:48:44.906 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:48:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:45.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:45 np0005603622 nova_compute[221301]: 2026-01-31 08:48:45.295 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:45 np0005603622 nova_compute[221301]: 2026-01-31 08:48:45.295 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:45Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:16:3c 10.100.0.13
Jan 31 03:48:45 np0005603622 nova_compute[221301]: 2026-01-31 08:48:45.309 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:48:45 np0005603622 nova_compute[221301]: 2026-01-31 08:48:45.310 221324 INFO nova.compute.claims [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:48:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:45.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:45 np0005603622 nova_compute[221301]: 2026-01-31 08:48:45.782 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:48:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/718556523' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:48:46 np0005603622 nova_compute[221301]: 2026-01-31 08:48:46.233 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:46 np0005603622 nova_compute[221301]: 2026-01-31 08:48:46.242 221324 DEBUG nova.compute.provider_tree [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:48:46 np0005603622 nova_compute[221301]: 2026-01-31 08:48:46.678 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:46 np0005603622 nova_compute[221301]: 2026-01-31 08:48:46.709 221324 DEBUG nova.network.neutron [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updating instance_info_cache with network_info: [{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:48:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:47.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:47.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.477 221324 DEBUG nova.scheduler.client.report [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.571 221324 DEBUG oslo_concurrency.lockutils [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.573 221324 DEBUG nova.objects.instance [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'flavor' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.606 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.608 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:48:48 np0005603622 kernel: tap6e3ffb68-e7 (unregistering): left promiscuous mode
Jan 31 03:48:48 np0005603622 NetworkManager[49080]: <info>  [1769849328.9147] device (tap6e3ffb68-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.921 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.921 221324 DEBUG nova.network.neutron [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:48:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:48Z|00756|binding|INFO|Releasing lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 from this chassis (sb_readonly=0)
Jan 31 03:48:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:48Z|00757|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 down in Southbound
Jan 31 03:48:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:48Z|00758|binding|INFO|Removing iface tap6e3ffb68-e7 ovn-installed in OVS
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.924 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:48 np0005603622 nova_compute[221301]: 2026-01-31 08:48:48.930 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:48 np0005603622 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Jan 31 03:48:48 np0005603622 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a1.scope: Consumed 13.679s CPU time.
Jan 31 03:48:48 np0005603622 systemd-machined[190406]: Machine qemu-77-instance-000000a1 terminated.
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.061 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:3c 10.100.0.13'], port_security=['fa:16:3e:b0:16:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.063 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.065 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2128154c-0218-4f66-9509-e0db66eba3fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.067 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7c801908-6e11-4ebb-9e54-ae9240c6cdae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.068 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace which is not needed anymore#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.101 221324 INFO nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.120 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.133 221324 INFO nova.virt.libvirt.driver [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance destroyed successfully.#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.134 221324 DEBUG nova.objects.instance [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'numa_topology' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.172 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:48:49 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286130]: [NOTICE]   (286143) : haproxy version is 2.8.14-c23fe91
Jan 31 03:48:49 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286130]: [NOTICE]   (286143) : path to executable is /usr/sbin/haproxy
Jan 31 03:48:49 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286130]: [WARNING]  (286143) : Exiting Master process...
Jan 31 03:48:49 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286130]: [ALERT]    (286143) : Current worker (286145) exited with code 143 (Terminated)
Jan 31 03:48:49 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286130]: [WARNING]  (286143) : All workers exited. Exiting... (0)
Jan 31 03:48:49 np0005603622 systemd[1]: libpod-9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686.scope: Deactivated successfully.
Jan 31 03:48:49 np0005603622 podman[286208]: 2026-01-31 08:48:49.243762925 +0000 UTC m=+0.060264395 container died 9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:48:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:49.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:49 np0005603622 kernel: tap6e3ffb68-e7: entered promiscuous mode
Jan 31 03:48:49 np0005603622 systemd-udevd[286178]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.262 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:49 np0005603622 NetworkManager[49080]: <info>  [1769849329.2635] manager: (tap6e3ffb68-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Jan 31 03:48:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:49Z|00759|binding|INFO|Claiming lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for this chassis.
Jan 31 03:48:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:49Z|00760|binding|INFO|6e3ffb68-e766-4edd-8e73-e44bb1e58fd4: Claiming fa:16:3e:b0:16:3c 10.100.0.13
Jan 31 03:48:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:49Z|00761|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 ovn-installed in OVS
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:49 np0005603622 NetworkManager[49080]: <info>  [1769849329.2762] device (tap6e3ffb68-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:48:49 np0005603622 NetworkManager[49080]: <info>  [1769849329.2770] device (tap6e3ffb68-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:48:49 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:49Z|00762|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 up in Southbound
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.279 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:3c 10.100.0.13'], port_security=['fa:16:3e:b0:16:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:49 np0005603622 systemd-machined[190406]: New machine qemu-78-instance-000000a1.
Jan 31 03:48:49 np0005603622 systemd[1]: Started Virtual Machine qemu-78-instance-000000a1.
Jan 31 03:48:49 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686-userdata-shm.mount: Deactivated successfully.
Jan 31 03:48:49 np0005603622 systemd[1]: var-lib-containers-storage-overlay-474511dd53209fab95b8c3a5242a1da75efc69b56e2b5730161e4ddb3fc4d982-merged.mount: Deactivated successfully.
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.397 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.399 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.399 221324 INFO nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Creating image(s)#033[00m
Jan 31 03:48:49 np0005603622 podman[286208]: 2026-01-31 08:48:49.412364518 +0000 UTC m=+0.228866018 container cleanup 9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:48:49 np0005603622 systemd[1]: libpod-conmon-9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686.scope: Deactivated successfully.
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.437 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.480 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.512 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.516 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.536 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.572 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.573 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.573 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.574 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.600 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.604 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4af4043c-8199-4d0f-acf9-38d029560167_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:48:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:49.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.822 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.824 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849329.8216064, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.824 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:48:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:49 np0005603622 podman[286272]: 2026-01-31 08:48:49.844861887 +0000 UTC m=+0.405591629 container remove 9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.850 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cb5b5fdb-fbb8-407f-8d5a-b1a3141bf32b]: (4, ('Sat Jan 31 08:48:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686)\n9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686\nSat Jan 31 08:48:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686)\n9284848fccaf003f203a282dff5bc1b52ba501fdded1f488004d38a043a0e686\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.852 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[180e88e9-0526-42a9-8a0e-95ec0cfb51fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.853 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:49 np0005603622 kernel: tap2128154c-00: left promiscuous mode
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.865 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[30387488-847b-424e-8a52-b9ad860e4554]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 nova_compute[221301]: 2026-01-31 08:48:49.873 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.875 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[66b7ee92-cc3c-4e30-b5eb-a28df5ff7b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.877 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1d6e41-2ce9-41a6-bd7f-bfd8c1052ae0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.890 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ab849d-c1f4-4a4a-9bb6-7ff6eca1423a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856288, 'reachable_time': 31923, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286422, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 systemd[1]: run-netns-ovnmeta\x2d2128154c\x2d0218\x2d4f66\x2d9509\x2de0db66eba3fc.mount: Deactivated successfully.
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.896 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.896 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a29b4e-14c6-4406-88eb-26ab0ca58cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.896 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.898 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.905 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eae84de6-6b4a-4036-a72e-6fa1fa713c39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.906 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2128154c-01 in ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.909 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2128154c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.909 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[530ae5a2-e2d2-4701-a9cc-505aa78627a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.910 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa89bf2-399e-4cac-a982-b5641b51d4b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.924 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d007318a-104b-4636-89d1-f304468ba314]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.937 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[69b6254d-3770-41d4-83ed-e159e0589a35]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.968 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9f76a236-7a31-4ca5-b596-36c2d544cae3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:49 np0005603622 NetworkManager[49080]: <info>  [1769849329.9774] manager: (tap2128154c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/339)
Jan 31 03:48:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:49.975 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[32e6424c-406f-4feb-9202-00b3a611135c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.001 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0df92d36-f749-47c0-8b36-734cf39e7972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.004 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6c914cf8-e8ea-425d-b3be-20f6701ddc76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 NetworkManager[49080]: <info>  [1769849330.0320] device (tap2128154c-00): carrier: link connected
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.033 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5e9d0c-bf4a-458f-ab79-2caa1f492215]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.046 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f99566-a360-49fb-8cb5-f86480fe20a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 28281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286450, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.062 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8302279d-b99c-4cbb-903e-1b8970d845e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:3208'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858136, 'tstamp': 858136}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286451, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.075 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.083 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.085 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6416a7de-3487-45e3-b9d0-2620c3eeb232]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 28281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286452, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.119 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[872e9320-97db-4eec-82dc-7caadf142b8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.125 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.126 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849329.8232613, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.126 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Started (Lifecycle Event)#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.161 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.165 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.197 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[93043b29-97fc-4b02-ae0c-04fb35fbd0c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.198 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.198 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.199 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.201 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:50 np0005603622 kernel: tap2128154c-00: entered promiscuous mode
Jan 31 03:48:50 np0005603622 NetworkManager[49080]: <info>  [1769849330.2038] manager: (tap2128154c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.204 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.206 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.207 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:48:50Z|00763|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.215 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.216 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.217 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[589ccc3c-9d88-427b-9d4b-fe1b3ae2ffce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.218 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/2128154c-0218-4f66-9509-e0db66eba3fc.pid.haproxy
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:48:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:50.219 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'env', 'PROCESS_TAG=haproxy-2128154c-0218-4f66-9509-e0db66eba3fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2128154c-0218-4f66-9509-e0db66eba3fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.265 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.295 221324 DEBUG nova.policy [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a498364761ef428b99cac3f92e603385', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8397e0fed04b4dabb57148d0924de2dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.564 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 4af4043c-8199-4d0f-acf9-38d029560167_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.960s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:48:50 np0005603622 nova_compute[221301]: 2026-01-31 08:48:50.639 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] resizing rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:48:50 np0005603622 podman[286484]: 2026-01-31 08:48:50.553865147 +0000 UTC m=+0.024323482 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:48:50 np0005603622 podman[286484]: 2026-01-31 08:48:50.742966729 +0000 UTC m=+0.213425044 container create 1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:48:50 np0005603622 systemd[1]: Started libpod-conmon-1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf.scope.
Jan 31 03:48:50 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:48:50 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0092fe2adfa00cb4a7d2b301e55ed44982192678d5bb66c7b0b0e901fa9b3f20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:48:51 np0005603622 podman[286484]: 2026-01-31 08:48:51.01756481 +0000 UTC m=+0.488023195 container init 1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 03:48:51 np0005603622 podman[286484]: 2026-01-31 08:48:51.027027143 +0000 UTC m=+0.497485478 container start 1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:48:51 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286553]: [NOTICE]   (286557) : New worker (286559) forked
Jan 31 03:48:51 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286553]: [NOTICE]   (286557) : Loading success.
Jan 31 03:48:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:51.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.332 221324 DEBUG nova.compute.manager [req-b32cf4f3-3f8c-4bd5-be63-e34559fc15d3 req-ebf147a1-6058-4572-a94f-d7787ed93d67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.333 221324 DEBUG oslo_concurrency.lockutils [req-b32cf4f3-3f8c-4bd5-be63-e34559fc15d3 req-ebf147a1-6058-4572-a94f-d7787ed93d67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.333 221324 DEBUG oslo_concurrency.lockutils [req-b32cf4f3-3f8c-4bd5-be63-e34559fc15d3 req-ebf147a1-6058-4572-a94f-d7787ed93d67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.333 221324 DEBUG oslo_concurrency.lockutils [req-b32cf4f3-3f8c-4bd5-be63-e34559fc15d3 req-ebf147a1-6058-4572-a94f-d7787ed93d67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.333 221324 DEBUG nova.compute.manager [req-b32cf4f3-3f8c-4bd5-be63-e34559fc15d3 req-ebf147a1-6058-4572-a94f-d7787ed93d67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.334 221324 WARNING nova.compute.manager [req-b32cf4f3-3f8c-4bd5-be63-e34559fc15d3 req-ebf147a1-6058-4572-a94f-d7787ed93d67 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.558 221324 DEBUG nova.objects.instance [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'migration_context' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.604 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.605 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Ensure instance console log exists: /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.605 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.606 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.606 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:51.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:51 np0005603622 nova_compute[221301]: 2026-01-31 08:48:51.682 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:52 np0005603622 nova_compute[221301]: 2026-01-31 08:48:52.849 221324 DEBUG nova.compute.manager [None req-325aa9cd-5dd4-4f9c-a2db-cf70e7c10d91 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:48:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:48:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:53.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:48:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:53.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.021 221324 DEBUG nova.compute.manager [req-660d737a-c584-4cf4-9407-58508e3a1fd1 req-98e8e4f2-f2e5-4a4f-ac95-9b087c5a4bf3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.022 221324 DEBUG oslo_concurrency.lockutils [req-660d737a-c584-4cf4-9407-58508e3a1fd1 req-98e8e4f2-f2e5-4a4f-ac95-9b087c5a4bf3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.022 221324 DEBUG oslo_concurrency.lockutils [req-660d737a-c584-4cf4-9407-58508e3a1fd1 req-98e8e4f2-f2e5-4a4f-ac95-9b087c5a4bf3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.022 221324 DEBUG oslo_concurrency.lockutils [req-660d737a-c584-4cf4-9407-58508e3a1fd1 req-98e8e4f2-f2e5-4a4f-ac95-9b087c5a4bf3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.023 221324 DEBUG nova.compute.manager [req-660d737a-c584-4cf4-9407-58508e3a1fd1 req-98e8e4f2-f2e5-4a4f-ac95-9b087c5a4bf3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.023 221324 WARNING nova.compute.manager [req-660d737a-c584-4cf4-9407-58508e3a1fd1 req-98e8e4f2-f2e5-4a4f-ac95-9b087c5a4bf3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:54.261 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.261 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:54.262 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:48:54 np0005603622 nova_compute[221301]: 2026-01-31 08:48:54.481 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:48:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:55.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:55 np0005603622 nova_compute[221301]: 2026-01-31 08:48:55.450 221324 DEBUG nova.network.neutron [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Successfully created port: 3aae5c0f-f2ed-4352-a4e2-017466399641 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:48:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:55.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:48:56.265 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.686 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.741 221324 DEBUG nova.compute.manager [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.741 221324 DEBUG oslo_concurrency.lockutils [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.742 221324 DEBUG oslo_concurrency.lockutils [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.742 221324 DEBUG oslo_concurrency.lockutils [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.742 221324 DEBUG nova.compute.manager [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.743 221324 WARNING nova.compute.manager [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state active and task_state None.
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.743 221324 DEBUG nova.compute.manager [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.744 221324 DEBUG oslo_concurrency.lockutils [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.744 221324 DEBUG oslo_concurrency.lockutils [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.744 221324 DEBUG oslo_concurrency.lockutils [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.745 221324 DEBUG nova.compute.manager [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:48:56 np0005603622 nova_compute[221301]: 2026-01-31 08:48:56.745 221324 WARNING nova.compute.manager [req-7ff36810-5c17-4d6d-bd73-9bd85049c16a req-faec4e5e-6913-4a4e-aec8-7e9da93e4b1e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state active and task_state None.
Jan 31 03:48:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:57.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:48:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:57.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:48:58 np0005603622 nova_compute[221301]: 2026-01-31 08:48:58.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.179 221324 DEBUG nova.network.neutron [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Successfully updated port: 3aae5c0f-f2ed-4352-a4e2-017466399641 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.238 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.238 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.239 221324 DEBUG nova.network.neutron [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:48:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:48:59.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.442 221324 DEBUG nova.compute.manager [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.443 221324 DEBUG nova.compute.manager [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing instance network info cache due to event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.443 221324 DEBUG oslo_concurrency.lockutils [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.483 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:48:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:48:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:48:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:48:59.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:48:59 np0005603622 nova_compute[221301]: 2026-01-31 08:48:59.745 221324 DEBUG nova.network.neutron [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 31 03:48:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:00 np0005603622 nova_compute[221301]: 2026-01-31 08:49:00.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:49:00 np0005603622 nova_compute[221301]: 2026-01-31 08:49:00.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 03:49:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:01.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:01.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:01 np0005603622 nova_compute[221301]: 2026-01-31 08:49:01.688 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:49:02 np0005603622 nova_compute[221301]: 2026-01-31 08:49:02.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:49:02 np0005603622 nova_compute[221301]: 2026-01-31 08:49:02.952 221324 DEBUG nova.network.neutron [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.035 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.036 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance network_info: |[{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.037 221324 DEBUG oslo_concurrency.lockutils [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.037 221324 DEBUG nova.network.neutron [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.043 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Start _get_guest_xml network_info=[{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.051 221324 WARNING nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.070 221324 DEBUG nova.virt.libvirt.host [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.071 221324 DEBUG nova.virt.libvirt.host [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.078 221324 DEBUG nova.virt.libvirt.host [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.078 221324 DEBUG nova.virt.libvirt.host [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.080 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.080 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.081 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.081 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.081 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.081 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.081 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.082 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.082 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.082 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.082 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.082 221324 DEBUG nova.virt.hardware [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.085 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:49:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:03Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:16:3c 10.100.0.13
Jan 31 03:49:03 np0005603622 podman[286589]: 2026-01-31 08:49:03.132139025 +0000 UTC m=+0.048034217 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:49:03 np0005603622 podman[286588]: 2026-01-31 08:49:03.168869909 +0000 UTC m=+0.085509090 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:49:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:03.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 03:49:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1318698675' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.505 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.538 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.544 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:49:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:03.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.893 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.893 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.893 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:49:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4176614108' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.961 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.963 221324 DEBUG nova.virt.libvirt.vif [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=165,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-bd0dktab',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:48:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=4af4043c-8199-4d0f-acf9-38d029560167,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.963 221324 DEBUG nova.network.os_vif_util [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.964 221324 DEBUG nova.network.os_vif_util [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.965 221324 DEBUG nova.objects.instance [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.989 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <uuid>4af4043c-8199-4d0f-acf9-38d029560167</uuid>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <name>instance-000000a5</name>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <nova:name>multiattach-server-1</nova:name>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:49:03</nova:creationTime>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:user uuid="a498364761ef428b99cac3f92e603385">tempest-AttachVolumeMultiAttachTest-1931311941-project-member</nova:user>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:project uuid="8397e0fed04b4dabb57148d0924de2dc">tempest-AttachVolumeMultiAttachTest-1931311941</nova:project>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <nova:port uuid="3aae5c0f-f2ed-4352-a4e2-017466399641">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <entry name="serial">4af4043c-8199-4d0f-acf9-38d029560167</entry>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <entry name="uuid">4af4043c-8199-4d0f-acf9-38d029560167</entry>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4af4043c-8199-4d0f-acf9-38d029560167_disk">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/4af4043c-8199-4d0f-acf9-38d029560167_disk.config">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:69:73:3d"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <target dev="tap3aae5c0f-f2"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/console.log" append="off"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:49:03 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:49:03 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:49:03 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:49:03 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.990 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Preparing to wait for external event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.990 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.990 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.991 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.991 221324 DEBUG nova.virt.libvirt.vif [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=165,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-bd0dktab',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:48:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=4af4043c-8199-4d0f-acf9-38d029560167,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.992 221324 DEBUG nova.network.os_vif_util [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.992 221324 DEBUG nova.network.os_vif_util [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.993 221324 DEBUG os_vif [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.994 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.994 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.998 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.998 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aae5c0f-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:03 np0005603622 nova_compute[221301]: 2026-01-31 08:49:03.999 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3aae5c0f-f2, col_values=(('external_ids', {'iface-id': '3aae5c0f-f2ed-4352-a4e2-017466399641', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:73:3d', 'vm-uuid': '4af4043c-8199-4d0f-acf9-38d029560167'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.044 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:04 np0005603622 NetworkManager[49080]: <info>  [1769849344.0459] manager: (tap3aae5c0f-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.048 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.051 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.051 221324 INFO os_vif [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2')#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.133 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.133 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.134 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:69:73:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.134 221324 INFO nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Using config drive#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.217 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:04 np0005603622 nova_compute[221301]: 2026-01-31 08:49:04.485 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.248 221324 INFO nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Creating config drive at /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/disk.config#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.253 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppthjddyp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:49:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:05.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.379 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppthjddyp" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.406 221324 DEBUG nova.storage.rbd_utils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] rbd image 4af4043c-8199-4d0f-acf9-38d029560167_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.409 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/disk.config 4af4043c-8199-4d0f-acf9-38d029560167_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.564 221324 DEBUG oslo_concurrency.processutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/disk.config 4af4043c-8199-4d0f-acf9-38d029560167_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.565 221324 INFO nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Deleting local config drive /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/disk.config because it was imported into RBD.#033[00m
Jan 31 03:49:05 np0005603622 kernel: tap3aae5c0f-f2: entered promiscuous mode
Jan 31 03:49:05 np0005603622 NetworkManager[49080]: <info>  [1769849345.6062] manager: (tap3aae5c0f-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.608 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:05Z|00764|binding|INFO|Claiming lport 3aae5c0f-f2ed-4352-a4e2-017466399641 for this chassis.
Jan 31 03:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:05Z|00765|binding|INFO|3aae5c0f-f2ed-4352-a4e2-017466399641: Claiming fa:16:3e:69:73:3d 10.100.0.9
Jan 31 03:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:05Z|00766|binding|INFO|Setting lport 3aae5c0f-f2ed-4352-a4e2-017466399641 ovn-installed in OVS
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.617 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.622 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:05Z|00767|binding|INFO|Setting lport 3aae5c0f-f2ed-4352-a4e2-017466399641 up in Southbound
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.627 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:73:3d 10.100.0.9'], port_security=['fa:16:3e:69:73:3d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4af4043c-8199-4d0f-acf9-38d029560167', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3aae5c0f-f2ed-4352-a4e2-017466399641) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.628 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3aae5c0f-f2ed-4352-a4e2-017466399641 in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 bound to our chassis#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.630 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0#033[00m
Jan 31 03:49:05 np0005603622 systemd-udevd[286905]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:49:05 np0005603622 systemd-machined[190406]: New machine qemu-79-instance-000000a5.
Jan 31 03:49:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:05.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:05 np0005603622 NetworkManager[49080]: <info>  [1769849345.6453] device (tap3aae5c0f-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:49:05 np0005603622 NetworkManager[49080]: <info>  [1769849345.6457] device (tap3aae5c0f-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:49:05 np0005603622 systemd[1]: Started Virtual Machine qemu-79-instance-000000a5.
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.748 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[23509567-228e-4d6e-b5a5-d245781b42aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.768 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e992048a-f93c-43b0-bc83-c8615afb8ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.771 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a1bff6c5-fdf9-4a2b-8ed8-282cd3ea53ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.789 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8a649d6a-eb73-4e36-b479-d915daeecd30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.806 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e029e419-904b-4777-b41d-d503023a68c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849367, 'reachable_time': 42362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286969, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.823 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae341b1-56af-415d-a720-6c4d1f810cb4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 849375, 'tstamp': 849375}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286979, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 849377, 'tstamp': 849377}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286979, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.825 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.826 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:05 np0005603622 nova_compute[221301]: 2026-01-31 08:49:05.827 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.830 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.830 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.831 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:05.831 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:05 np0005603622 podman[286948]: 2026-01-31 08:49:05.844036594 +0000 UTC m=+0.072557643 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 03:49:05 np0005603622 podman[286948]: 2026-01-31 08:49:05.950620808 +0000 UTC m=+0.179141847 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.098 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849346.0982392, 4af4043c-8199-4d0f-acf9-38d029560167 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.099 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] VM Started (Lifecycle Event)#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.143 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.147 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849346.1010427, 4af4043c-8199-4d0f-acf9-38d029560167 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.147 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.152 221324 DEBUG nova.compute.manager [req-452d4b6a-f636-4a0c-b62f-9a0b0c20b9cb req-ff3b2f50-2fe5-41b5-a805-ee1969501b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.152 221324 DEBUG oslo_concurrency.lockutils [req-452d4b6a-f636-4a0c-b62f-9a0b0c20b9cb req-ff3b2f50-2fe5-41b5-a805-ee1969501b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.152 221324 DEBUG oslo_concurrency.lockutils [req-452d4b6a-f636-4a0c-b62f-9a0b0c20b9cb req-ff3b2f50-2fe5-41b5-a805-ee1969501b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.153 221324 DEBUG oslo_concurrency.lockutils [req-452d4b6a-f636-4a0c-b62f-9a0b0c20b9cb req-ff3b2f50-2fe5-41b5-a805-ee1969501b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.153 221324 DEBUG nova.compute.manager [req-452d4b6a-f636-4a0c-b62f-9a0b0c20b9cb req-ff3b2f50-2fe5-41b5-a805-ee1969501b25 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Processing event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.154 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.158 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.160 221324 INFO nova.virt.libvirt.driver [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance spawned successfully.#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.160 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.219 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.225 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849346.1572702, 4af4043c-8199-4d0f-acf9-38d029560167 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.225 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.228 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.228 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.229 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.229 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.229 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.230 221324 DEBUG nova.virt.libvirt.driver [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.275 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.279 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.333 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.363 221324 INFO nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Took 16.97 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.364 221324 DEBUG nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.478 221324 INFO nova.compute.manager [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Took 21.35 seconds to build instance.#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.540 221324 DEBUG oslo_concurrency.lockutils [None req-07d739f5-ddb8-4bd7-b703-f7eadc094ece a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.864 221324 DEBUG nova.network.neutron [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updated VIF entry in instance network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.866 221324 DEBUG nova.network.neutron [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:06 np0005603622 nova_compute[221301]: 2026-01-31 08:49:06.897 221324 DEBUG oslo_concurrency.lockutils [req-b487fd27-8a16-4acb-8996-25df679e50ec req-eb75bb8b-1ff2-4bd0-9e06-4416ae4490e3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:07.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:49:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:07 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.634 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updating instance_info_cache with network_info: [{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:07.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.670 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.671 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.672 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.673 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.673 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.720 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.721 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.721 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.722 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:49:07 np0005603622 nova_compute[221301]: 2026-01-31 08:49:07.722 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:49:08 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3273876448' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:49:08 np0005603622 nova_compute[221301]: 2026-01-31 08:49:08.188 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:09 np0005603622 nova_compute[221301]: 2026-01-31 08:49:09.046 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:09.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:09 np0005603622 nova_compute[221301]: 2026-01-31 08:49:09.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:09.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.779 221324 DEBUG nova.compute.manager [req-fe5811da-38b4-4f47-a118-6c2f5bece4a3 req-cf9be635-9707-4fd3-a3f1-77c8df0e6b15 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.780 221324 DEBUG oslo_concurrency.lockutils [req-fe5811da-38b4-4f47-a118-6c2f5bece4a3 req-cf9be635-9707-4fd3-a3f1-77c8df0e6b15 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.780 221324 DEBUG oslo_concurrency.lockutils [req-fe5811da-38b4-4f47-a118-6c2f5bece4a3 req-cf9be635-9707-4fd3-a3f1-77c8df0e6b15 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.780 221324 DEBUG oslo_concurrency.lockutils [req-fe5811da-38b4-4f47-a118-6c2f5bece4a3 req-cf9be635-9707-4fd3-a3f1-77c8df0e6b15 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.780 221324 DEBUG nova.compute.manager [req-fe5811da-38b4-4f47-a118-6c2f5bece4a3 req-cf9be635-9707-4fd3-a3f1-77c8df0e6b15 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.781 221324 WARNING nova.compute.manager [req-fe5811da-38b4-4f47-a118-6c2f5bece4a3 req-cf9be635-9707-4fd3-a3f1-77c8df0e6b15 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.941 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.941 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.946 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.946 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.950 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:10 np0005603622 nova_compute[221301]: 2026-01-31 08:49:10.951 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.122 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.124 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3748MB free_disk=20.673076629638672GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.124 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.124 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:11.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.378 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.379 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.379 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4af4043c-8199-4d0f-acf9-38d029560167 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.381 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.381 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:49:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:49:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:11.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:49:11 np0005603622 nova_compute[221301]: 2026-01-31 08:49:11.871 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:49:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/177821887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:49:12 np0005603622 nova_compute[221301]: 2026-01-31 08:49:12.314 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:12 np0005603622 nova_compute[221301]: 2026-01-31 08:49:12.319 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:49:12 np0005603622 nova_compute[221301]: 2026-01-31 08:49:12.370 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:49:12 np0005603622 nova_compute[221301]: 2026-01-31 08:49:12.468 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:49:12 np0005603622 nova_compute[221301]: 2026-01-31 08:49:12.469 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:13 np0005603622 nova_compute[221301]: 2026-01-31 08:49:13.192 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:13.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:13 np0005603622 nova_compute[221301]: 2026-01-31 08:49:13.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:13.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:14 np0005603622 nova_compute[221301]: 2026-01-31 08:49:14.104 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:14 np0005603622 nova_compute[221301]: 2026-01-31 08:49:14.491 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:49:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:15.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:49:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:15.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.126 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.126 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.262 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:49:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:17.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.393 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.394 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.402 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.402 221324 INFO nova.compute.claims [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:49:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:17.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:17 np0005603622 nova_compute[221301]: 2026-01-31 08:49:17.924 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:49:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4176434898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:49:18 np0005603622 nova_compute[221301]: 2026-01-31 08:49:18.393 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:18 np0005603622 nova_compute[221301]: 2026-01-31 08:49:18.401 221324 DEBUG nova.compute.provider_tree [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:49:18 np0005603622 nova_compute[221301]: 2026-01-31 08:49:18.505 221324 DEBUG nova.scheduler.client.report [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.050 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.051 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.106 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.187 221324 DEBUG nova.compute.manager [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.188 221324 DEBUG nova.compute.manager [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing instance network info cache due to event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.188 221324 DEBUG oslo_concurrency.lockutils [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.189 221324 DEBUG oslo_concurrency.lockutils [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.189 221324 DEBUG nova.network.neutron [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.201 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.201 221324 DEBUG nova.network.neutron [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:49:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:19.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.492 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.544 221324 INFO nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.595 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.637 221324 DEBUG nova.policy [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa7f893021af4a84b03d85b476dadfe0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:49:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:19.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.920 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.922 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.922 221324 INFO nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Creating image(s)#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.955 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:19 np0005603622 nova_compute[221301]: 2026-01-31 08:49:19.995 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.020 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.024 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.096 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.097 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.098 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.099 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.132 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.138 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 84307a3a-98ff-4145-a789-fda9cf8b1905_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/194760373' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:20 np0005603622 ceph-osd[79107]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 31 03:49:20 np0005603622 nova_compute[221301]: 2026-01-31 08:49:20.851 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:21.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:21.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:21 np0005603622 nova_compute[221301]: 2026-01-31 08:49:21.776 221324 DEBUG nova.network.neutron [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Successfully created port: 26ecf32a-6227-4ef9-8e92-55f3262e4031 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:49:22 np0005603622 nova_compute[221301]: 2026-01-31 08:49:22.323 221324 DEBUG nova.network.neutron [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updated VIF entry in instance network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:49:22 np0005603622 nova_compute[221301]: 2026-01-31 08:49:22.323 221324 DEBUG nova.network.neutron [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:22 np0005603622 nova_compute[221301]: 2026-01-31 08:49:22.409 221324 DEBUG oslo_concurrency.lockutils [req-03b915ac-9d7a-494a-96f0-33fc17c4273b req-bc4a1858-c963-4960-99f1-c0d58b49e996 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.284 221324 DEBUG nova.network.neutron [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Successfully updated port: 26ecf32a-6227-4ef9-8e92-55f3262e4031 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:49:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:23.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.360 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.360 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.361 221324 DEBUG nova.network.neutron [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.518 221324 DEBUG nova.compute.manager [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.519 221324 DEBUG nova.compute.manager [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing instance network info cache due to event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.520 221324 DEBUG oslo_concurrency.lockutils [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:23.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:23 np0005603622 nova_compute[221301]: 2026-01-31 08:49:23.899 221324 DEBUG nova.network.neutron [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:49:24 np0005603622 nova_compute[221301]: 2026-01-31 08:49:24.109 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:24 np0005603622 nova_compute[221301]: 2026-01-31 08:49:24.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:49:24 np0005603622 nova_compute[221301]: 2026-01-31 08:49:24.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:25 np0005603622 nova_compute[221301]: 2026-01-31 08:49:25.215 221324 DEBUG nova.network.neutron [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:25.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:25 np0005603622 nova_compute[221301]: 2026-01-31 08:49:25.329 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:25 np0005603622 nova_compute[221301]: 2026-01-31 08:49:25.330 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance network_info: |[{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:49:25 np0005603622 nova_compute[221301]: 2026-01-31 08:49:25.330 221324 DEBUG oslo_concurrency.lockutils [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:25 np0005603622 nova_compute[221301]: 2026-01-31 08:49:25.331 221324 DEBUG nova.network.neutron [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:49:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:25.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:25 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:49:25 np0005603622 nova_compute[221301]: 2026-01-31 08:49:25.748 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 84307a3a-98ff-4145-a789-fda9cf8b1905_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.610s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:25 np0005603622 nova_compute[221301]: 2026-01-31 08:49:25.830 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] resizing rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:49:27 np0005603622 nova_compute[221301]: 2026-01-31 08:49:27.234 221324 DEBUG oslo_concurrency.lockutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:27 np0005603622 nova_compute[221301]: 2026-01-31 08:49:27.234 221324 DEBUG oslo_concurrency.lockutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:27 np0005603622 nova_compute[221301]: 2026-01-31 08:49:27.262 221324 DEBUG nova.objects.instance [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:49:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:27.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:49:27 np0005603622 nova_compute[221301]: 2026-01-31 08:49:27.345 221324 DEBUG oslo_concurrency.lockutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:27.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:27 np0005603622 nova_compute[221301]: 2026-01-31 08:49:27.935 221324 DEBUG oslo_concurrency.lockutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:27 np0005603622 nova_compute[221301]: 2026-01-31 08:49:27.936 221324 DEBUG oslo_concurrency.lockutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:27 np0005603622 nova_compute[221301]: 2026-01-31 08:49:27.937 221324 INFO nova.compute.manager [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Attaching volume 12e9d9b2-8ec9-4b16-b334-60c0f639cb59 to /dev/vdb#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.144 221324 DEBUG os_brick.utils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.146 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.162 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.162 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[43ff2b53-b238-4839-a88e-9f4278ee2e6b]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.164 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.173 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.173 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa63481-b4dd-4333-9c65-c635688b1395]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.175 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.182 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.183 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[b79334e0-db05-41bb-b779-030e851a9cb7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.184 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[a2e63237-cf14-4cd1-9101-40042102ab35]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.185 221324 DEBUG oslo_concurrency.processutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.204 221324 DEBUG oslo_concurrency.processutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "nvme version" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.207 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.208 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.208 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.208 221324 DEBUG os_brick.utils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] <== get_connector_properties: return (63ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.209 221324 DEBUG nova.virt.block_device [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating existing volume attachment record: 57220f93-bfdc-48e9-b764-46da2fb97690 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.718 221324 DEBUG nova.network.neutron [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updated VIF entry in instance network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.719 221324 DEBUG nova.network.neutron [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:28 np0005603622 nova_compute[221301]: 2026-01-31 08:49:28.768 221324 DEBUG nova.objects.instance [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'migration_context' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:29 np0005603622 nova_compute[221301]: 2026-01-31 08:49:29.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:29.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:29 np0005603622 nova_compute[221301]: 2026-01-31 08:49:29.497 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:29.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.368 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.369 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Ensure instance console log exists: /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.369 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.370 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.370 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.373 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Start _get_guest_xml network_info=[{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.374 221324 DEBUG oslo_concurrency.lockutils [req-027898be-bf73-4980-acd9-bef76a087c86 req-23905ef5-4632-409c-87bb-e5e2a0a927fb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.379 221324 WARNING nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.385 221324 DEBUG nova.virt.libvirt.host [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.385 221324 DEBUG nova.virt.libvirt.host [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.390 221324 DEBUG nova.virt.libvirt.host [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.390 221324 DEBUG nova.virt.libvirt.host [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.392 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.392 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.393 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.393 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.393 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.393 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.393 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.394 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.394 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.394 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.394 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.395 221324 DEBUG nova.virt.hardware [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.397 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4204925358' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.806 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.840 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:30 np0005603622 nova_compute[221301]: 2026-01-31 08:49:30.846 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:30.907 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:30.908 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:30.909 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4088878050' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:49:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2662458123' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:49:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:31.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.385 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.389 221324 DEBUG nova.virt.libvirt.vif [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:49:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-462635409',display_name='tempest-ServerRescueNegativeTestJSON-server-462635409',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-462635409',id=166,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPEp8NFnjdKYncckoDcEUrRrf7xo9LmTCD6V4O6WjeTKJAyV8su/7oD1+xGUT3vktBEdPWfQ8rSAH7lPCJKfsLVFbyZh4D52OnOOfJY3x1nxzkz8U0mjKMYyORWKFvG9ew==',key_name='tempest-keypair-1284882307',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-y7tniy9w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:49:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=84307a3a-98ff-4145-a789-fda9cf8b1905,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.390 221324 DEBUG nova.network.os_vif_util [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.391 221324 DEBUG nova.network.os_vif_util [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.393 221324 DEBUG nova.objects.instance [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'pci_devices' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.411 221324 DEBUG nova.objects.instance [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'flavor' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.420 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <uuid>84307a3a-98ff-4145-a789-fda9cf8b1905</uuid>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <name>instance-000000a6</name>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-462635409</nova:name>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:49:30</nova:creationTime>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:user uuid="aa7f893021af4a84b03d85b476dadfe0">tempest-ServerRescueNegativeTestJSON-81297706-project-member</nova:user>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:project uuid="bf1c3d387dbe4191b4d05bdfca5959da">tempest-ServerRescueNegativeTestJSON-81297706</nova:project>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <nova:port uuid="26ecf32a-6227-4ef9-8e92-55f3262e4031">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <entry name="serial">84307a3a-98ff-4145-a789-fda9cf8b1905</entry>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <entry name="uuid">84307a3a-98ff-4145-a789-fda9cf8b1905</entry>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/84307a3a-98ff-4145-a789-fda9cf8b1905_disk">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:35:e0:35"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <target dev="tap26ecf32a-62"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/console.log" append="off"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:49:31 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:49:31 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.420 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Preparing to wait for external event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.420 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.421 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.421 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.422 221324 DEBUG nova.virt.libvirt.vif [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:49:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-462635409',display_name='tempest-ServerRescueNegativeTestJSON-server-462635409',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-462635409',id=166,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPEp8NFnjdKYncckoDcEUrRrf7xo9LmTCD6V4O6WjeTKJAyV8su/7oD1+xGUT3vktBEdPWfQ8rSAH7lPCJKfsLVFbyZh4D52OnOOfJY3x1nxzkz8U0mjKMYyORWKFvG9ew==',key_name='tempest-keypair-1284882307',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-y7tniy9w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:49:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=84307a3a-98ff-4145-a789-fda9cf8b1905,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.422 221324 DEBUG nova.network.os_vif_util [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.422 221324 DEBUG nova.network.os_vif_util [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.423 221324 DEBUG os_vif [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.423 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.423 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.424 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.427 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.427 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26ecf32a-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.428 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap26ecf32a-62, col_values=(('external_ids', {'iface-id': '26ecf32a-6227-4ef9-8e92-55f3262e4031', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:e0:35', 'vm-uuid': '84307a3a-98ff-4145-a789-fda9cf8b1905'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:31 np0005603622 NetworkManager[49080]: <info>  [1769849371.4326] manager: (tap26ecf32a-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.438 221324 INFO os_vif [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62')#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.447 221324 DEBUG nova.virt.libvirt.driver [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Attempting to attach volume 12e9d9b2-8ec9-4b16-b334-60c0f639cb59 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.451 221324 DEBUG nova.virt.libvirt.guest [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-12e9d9b2-8ec9-4b16-b334-60c0f639cb59">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:49:31 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <serial>12e9d9b2-8ec9-4b16-b334-60c0f639cb59</serial>
Jan 31 03:49:31 np0005603622 nova_compute[221301]:  <shareable/>
Jan 31 03:49:31 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:49:31 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.558 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.559 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.560 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No VIF found with MAC fa:16:3e:35:e0:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.561 221324 INFO nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Using config drive#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.601 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:31.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.694 221324 DEBUG nova.virt.libvirt.driver [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.695 221324 DEBUG nova.virt.libvirt.driver [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.695 221324 DEBUG nova.virt.libvirt.driver [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:49:31 np0005603622 nova_compute[221301]: 2026-01-31 08:49:31.695 221324 DEBUG nova.virt.libvirt.driver [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] No VIF found with MAC fa:16:3e:69:73:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.319 221324 INFO nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Creating config drive at /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config#033[00m
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.325 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpk9yxnaix execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:32Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:69:73:3d 10.100.0.9
Jan 31 03:49:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:32Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:69:73:3d 10.100.0.9
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.383 221324 DEBUG oslo_concurrency.lockutils [None req-a0e861ee-6853-484f-8693-f242d3e5d604 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.466 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpk9yxnaix" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.497 221324 DEBUG nova.storage.rbd_utils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.501 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.858 221324 DEBUG oslo_concurrency.processutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.358s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.860 221324 INFO nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Deleting local config drive /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config because it was imported into RBD.#033[00m
Jan 31 03:49:32 np0005603622 kernel: tap26ecf32a-62: entered promiscuous mode
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.920 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:32 np0005603622 NetworkManager[49080]: <info>  [1769849372.9210] manager: (tap26ecf32a-62): new Tun device (/org/freedesktop/NetworkManager/Devices/344)
Jan 31 03:49:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:32Z|00768|binding|INFO|Claiming lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 for this chassis.
Jan 31 03:49:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:32Z|00769|binding|INFO|26ecf32a-6227-4ef9-8e92-55f3262e4031: Claiming fa:16:3e:35:e0:35 10.100.0.7
Jan 31 03:49:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:32Z|00770|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 ovn-installed in OVS
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.929 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:32 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:32Z|00771|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 up in Southbound
Jan 31 03:49:32 np0005603622 nova_compute[221301]: 2026-01-31 08:49:32.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:32.941 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e0:35 10.100.0.7'], port_security=['fa:16:3e:35:e0:35 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84307a3a-98ff-4145-a789-fda9cf8b1905', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '2', 'neutron:security_group_ids': '96c36d72-b9f7-4399-af44-5588b758d3ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=26ecf32a-6227-4ef9-8e92-55f3262e4031) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:49:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:32.943 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 26ecf32a-6227-4ef9-8e92-55f3262e4031 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc bound to our chassis#033[00m
Jan 31 03:49:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:32.945 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:49:32 np0005603622 systemd-udevd[287684]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:49:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:32.961 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c4dc43ee-a1c1-4181-8065-9157f308db0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:32 np0005603622 NetworkManager[49080]: <info>  [1769849372.9663] device (tap26ecf32a-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:49:32 np0005603622 NetworkManager[49080]: <info>  [1769849372.9670] device (tap26ecf32a-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:49:32 np0005603622 systemd-machined[190406]: New machine qemu-80-instance-000000a6.
Jan 31 03:49:32 np0005603622 systemd[1]: Started Virtual Machine qemu-80-instance-000000a6.
Jan 31 03:49:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:32.991 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[71b62b4c-eefe-4637-86c2-9cc89414aab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:32.995 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb0efbf-7483-4f3b-a5e3-8a305cc6a0ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.020 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ef463583-a237-495d-9ed6-244bc55ca748]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.035 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ed00408e-b8e3-41d4-a150-460a40d46218]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 28281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287700, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.048 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ff766e-2f10-4482-849d-99f62c52c709]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858149, 'tstamp': 858149}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287701, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858153, 'tstamp': 858153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287701, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.050 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:33 np0005603622 nova_compute[221301]: 2026-01-31 08:49:33.051 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:33 np0005603622 nova_compute[221301]: 2026-01-31 08:49:33.052 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.053 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.053 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.054 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.055 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:49:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:33.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.401 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:49:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:33.403 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:49:33 np0005603622 nova_compute[221301]: 2026-01-31 08:49:33.451 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:33 np0005603622 nova_compute[221301]: 2026-01-31 08:49:33.630 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849373.6300144, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:33 np0005603622 nova_compute[221301]: 2026-01-31 08:49:33.631 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Started (Lifecycle Event)#033[00m
Jan 31 03:49:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:33.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:33 np0005603622 nova_compute[221301]: 2026-01-31 08:49:33.996 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:34 np0005603622 nova_compute[221301]: 2026-01-31 08:49:34.001 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849373.6303198, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:34 np0005603622 nova_compute[221301]: 2026-01-31 08:49:34.001 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:49:34 np0005603622 nova_compute[221301]: 2026-01-31 08:49:34.058 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:34 np0005603622 nova_compute[221301]: 2026-01-31 08:49:34.061 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:49:34 np0005603622 nova_compute[221301]: 2026-01-31 08:49:34.112 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:49:34 np0005603622 podman[287746]: 2026-01-31 08:49:34.140457264 +0000 UTC m=+0.055370823 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 03:49:34 np0005603622 podman[287745]: 2026-01-31 08:49:34.162972087 +0000 UTC m=+0.080805334 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 31 03:49:34 np0005603622 nova_compute[221301]: 2026-01-31 08:49:34.564 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:35.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:49:35.406 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:49:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:35.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.701 221324 DEBUG nova.compute.manager [req-cb534a86-645a-4212-be8e-1d1decf97211 req-9e8ac7dd-1657-4b83-84d8-932ac24303bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.702 221324 DEBUG oslo_concurrency.lockutils [req-cb534a86-645a-4212-be8e-1d1decf97211 req-9e8ac7dd-1657-4b83-84d8-932ac24303bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.702 221324 DEBUG oslo_concurrency.lockutils [req-cb534a86-645a-4212-be8e-1d1decf97211 req-9e8ac7dd-1657-4b83-84d8-932ac24303bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.703 221324 DEBUG oslo_concurrency.lockutils [req-cb534a86-645a-4212-be8e-1d1decf97211 req-9e8ac7dd-1657-4b83-84d8-932ac24303bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.703 221324 DEBUG nova.compute.manager [req-cb534a86-645a-4212-be8e-1d1decf97211 req-9e8ac7dd-1657-4b83-84d8-932ac24303bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Processing event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.704 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.707 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849375.7073507, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.708 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.711 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.716 221324 INFO nova.virt.libvirt.driver [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance spawned successfully.#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.717 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.874 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.880 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.880 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.881 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.881 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.882 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.882 221324 DEBUG nova.virt.libvirt.driver [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.887 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:49:35 np0005603622 nova_compute[221301]: 2026-01-31 08:49:35.980 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:49:36 np0005603622 nova_compute[221301]: 2026-01-31 08:49:36.111 221324 INFO nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Took 16.19 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:49:36 np0005603622 nova_compute[221301]: 2026-01-31 08:49:36.111 221324 DEBUG nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:49:36 np0005603622 nova_compute[221301]: 2026-01-31 08:49:36.214 221324 INFO nova.compute.manager [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Took 18.86 seconds to build instance.#033[00m
Jan 31 03:49:36 np0005603622 nova_compute[221301]: 2026-01-31 08:49:36.235 221324 DEBUG oslo_concurrency.lockutils [None req-a66d501c-79b9-42a1-b7c8-6dad309812b5 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:36 np0005603622 nova_compute[221301]: 2026-01-31 08:49:36.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:37.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:37.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:37 np0005603622 nova_compute[221301]: 2026-01-31 08:49:37.956 221324 DEBUG nova.compute.manager [req-1db78671-7bfa-4c0a-8a2c-940f9aedecab req-d1147b18-ae1e-4dc3-8224-61d108ed78c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:37 np0005603622 nova_compute[221301]: 2026-01-31 08:49:37.957 221324 DEBUG oslo_concurrency.lockutils [req-1db78671-7bfa-4c0a-8a2c-940f9aedecab req-d1147b18-ae1e-4dc3-8224-61d108ed78c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:37 np0005603622 nova_compute[221301]: 2026-01-31 08:49:37.957 221324 DEBUG oslo_concurrency.lockutils [req-1db78671-7bfa-4c0a-8a2c-940f9aedecab req-d1147b18-ae1e-4dc3-8224-61d108ed78c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:37 np0005603622 nova_compute[221301]: 2026-01-31 08:49:37.957 221324 DEBUG oslo_concurrency.lockutils [req-1db78671-7bfa-4c0a-8a2c-940f9aedecab req-d1147b18-ae1e-4dc3-8224-61d108ed78c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:37 np0005603622 nova_compute[221301]: 2026-01-31 08:49:37.957 221324 DEBUG nova.compute.manager [req-1db78671-7bfa-4c0a-8a2c-940f9aedecab req-d1147b18-ae1e-4dc3-8224-61d108ed78c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:49:37 np0005603622 nova_compute[221301]: 2026-01-31 08:49:37.958 221324 WARNING nova.compute.manager [req-1db78671-7bfa-4c0a-8a2c-940f9aedecab req-d1147b18-ae1e-4dc3-8224-61d108ed78c9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #145. Immutable memtables: 0.
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.358751) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 91] Flushing memtable with next log file: 145
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378358790, "job": 91, "event": "flush_started", "num_memtables": 1, "num_entries": 1275, "num_deletes": 251, "total_data_size": 2737664, "memory_usage": 2777400, "flush_reason": "Manual Compaction"}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 91] Level-0 flush table #146: started
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378427220, "cf_name": "default", "job": 91, "event": "table_file_creation", "file_number": 146, "file_size": 1794342, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70510, "largest_seqno": 71780, "table_properties": {"data_size": 1788844, "index_size": 2893, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12368, "raw_average_key_size": 20, "raw_value_size": 1777646, "raw_average_value_size": 2899, "num_data_blocks": 127, "num_entries": 613, "num_filter_entries": 613, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849276, "oldest_key_time": 1769849276, "file_creation_time": 1769849378, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 146, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 91] Flush lasted 68831 microseconds, and 5170 cpu microseconds.
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.427572) [db/flush_job.cc:967] [default] [JOB 91] Level-0 flush table #146: 1794342 bytes OK
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.427600) [db/memtable_list.cc:519] [default] Level-0 commit table #146 started
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.467436) [db/memtable_list.cc:722] [default] Level-0 commit table #146: memtable #1 done
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.467497) EVENT_LOG_v1 {"time_micros": 1769849378467483, "job": 91, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.467570) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 91] Try to delete WAL files size 2731584, prev total WAL file size 2731584, number of live WAL files 2.
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000142.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.469033) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036303234' seq:72057594037927935, type:22 .. '7061786F730036323736' seq:0, type:0; will stop at (end)
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 92] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 91 Base level 0, inputs: [146(1752KB)], [144(10MB)]
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378469381, "job": 92, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [146], "files_L6": [144], "score": -1, "input_data_size": 13025306, "oldest_snapshot_seqno": -1}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 92] Generated table #147: 9065 keys, 11041054 bytes, temperature: kUnknown
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378555371, "cf_name": "default", "job": 92, "event": "table_file_creation", "file_number": 147, "file_size": 11041054, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10983632, "index_size": 33629, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22725, "raw_key_size": 239908, "raw_average_key_size": 26, "raw_value_size": 10825868, "raw_average_value_size": 1194, "num_data_blocks": 1276, "num_entries": 9065, "num_filter_entries": 9065, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849378, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 147, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.555968) [db/compaction/compaction_job.cc:1663] [default] [JOB 92] Compacted 1@0 + 1@6 files to L6 => 11041054 bytes
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.559817) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.8 rd, 127.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.7 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(13.4) write-amplify(6.2) OK, records in: 9582, records dropped: 517 output_compression: NoCompression
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.559851) EVENT_LOG_v1 {"time_micros": 1769849378559838, "job": 92, "event": "compaction_finished", "compaction_time_micros": 86368, "compaction_time_cpu_micros": 20480, "output_level": 6, "num_output_files": 1, "total_output_size": 11041054, "num_input_records": 9582, "num_output_records": 9065, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000146.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378560153, "job": 92, "event": "table_file_deletion", "file_number": 146}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000144.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849378561394, "job": 92, "event": "table_file_deletion", "file_number": 144}
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.468351) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.561469) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.561474) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.561476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.561478) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:49:38.561480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:49:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:39.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:39 np0005603622 nova_compute[221301]: 2026-01-31 08:49:39.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:39.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:49:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:41.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:49:41 np0005603622 nova_compute[221301]: 2026-01-31 08:49:41.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:41.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:43.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:43 np0005603622 nova_compute[221301]: 2026-01-31 08:49:43.644 221324 DEBUG nova.compute.manager [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:49:43 np0005603622 nova_compute[221301]: 2026-01-31 08:49:43.645 221324 DEBUG nova.compute.manager [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing instance network info cache due to event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:49:43 np0005603622 nova_compute[221301]: 2026-01-31 08:49:43.645 221324 DEBUG oslo_concurrency.lockutils [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:49:43 np0005603622 nova_compute[221301]: 2026-01-31 08:49:43.645 221324 DEBUG oslo_concurrency.lockutils [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:49:43 np0005603622 nova_compute[221301]: 2026-01-31 08:49:43.646 221324 DEBUG nova.network.neutron [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:49:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:43.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:44 np0005603622 nova_compute[221301]: 2026-01-31 08:49:44.595 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:45.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:45.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:46 np0005603622 nova_compute[221301]: 2026-01-31 08:49:46.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:47.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:47 np0005603622 nova_compute[221301]: 2026-01-31 08:49:47.695 221324 DEBUG nova.network.neutron [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updated VIF entry in instance network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:49:47 np0005603622 nova_compute[221301]: 2026-01-31 08:49:47.696 221324 DEBUG nova.network.neutron [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:49:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:47.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:47 np0005603622 nova_compute[221301]: 2026-01-31 08:49:47.805 221324 DEBUG oslo_concurrency.lockutils [req-e36b90cf-611c-434f-83dc-9353999e61e0 req-1d66cd7f-db1b-4c52-aca3-af6bf72126df fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:49:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:49.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:49 np0005603622 nova_compute[221301]: 2026-01-31 08:49:49.597 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:49 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #53. Immutable memtables: 9.
Jan 31 03:49:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:49.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:51.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:51Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:e0:35 10.100.0.7
Jan 31 03:49:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:49:51Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:e0:35 10.100.0.7
Jan 31 03:49:51 np0005603622 nova_compute[221301]: 2026-01-31 08:49:51.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:51.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:53.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:53.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:54 np0005603622 nova_compute[221301]: 2026-01-31 08:49:54.599 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:49:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:55.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:49:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:55.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:49:56 np0005603622 nova_compute[221301]: 2026-01-31 08:49:56.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:57.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:57.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:49:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:49:59.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:49:59 np0005603622 nova_compute[221301]: 2026-01-31 08:49:59.381 221324 DEBUG oslo_concurrency.lockutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:49:59 np0005603622 nova_compute[221301]: 2026-01-31 08:49:59.382 221324 DEBUG oslo_concurrency.lockutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:49:59 np0005603622 nova_compute[221301]: 2026-01-31 08:49:59.417 221324 DEBUG nova.objects.instance [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'flavor' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:49:59 np0005603622 nova_compute[221301]: 2026-01-31 08:49:59.541 221324 DEBUG oslo_concurrency.lockutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:49:59 np0005603622 nova_compute[221301]: 2026-01-31 08:49:59.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:49:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:49:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:49:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:49:59.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:49:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e360 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.288 221324 DEBUG oslo_concurrency.lockutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.289 221324 DEBUG oslo_concurrency.lockutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.289 221324 INFO nova.compute.manager [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Attaching volume 756cf314-0b1f-46f3-bdfa-a0682d03b570 to /dev/vdb#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.591 221324 DEBUG os_brick.utils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.593 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.601 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.601 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6346ea-569b-4b80-bd7a-b8cc538490c4]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.602 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.607 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.607 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[22d47963-919a-4996-86e8-25470dcec6e6]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.609 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.663 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.663 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[09aa621d-0808-48de-8867-43c471bfb00f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.666 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[fac16241-4dbd-40ef-82d5-321e4d752eaf]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.667 221324 DEBUG oslo_concurrency.processutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.696 221324 DEBUG oslo_concurrency.processutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.699 221324 DEBUG os_brick.initiator.connectors.lightos [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.700 221324 DEBUG os_brick.initiator.connectors.lightos [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.701 221324 DEBUG os_brick.initiator.connectors.lightos [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.701 221324 DEBUG os_brick.utils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] <== get_connector_properties: return (109ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:50:00 np0005603622 nova_compute[221301]: 2026-01-31 08:50:00.702 221324 DEBUG nova.virt.block_device [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating existing volume attachment record: abe8a4d4-2e89-47c0-95db-cc7ace6869f7 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:50:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:01.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:01 np0005603622 nova_compute[221301]: 2026-01-31 08:50:01.445 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:01.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:50:02 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/162422280' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.331 221324 DEBUG nova.objects.instance [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'flavor' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.378 221324 DEBUG nova.virt.libvirt.driver [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Attempting to attach volume 756cf314-0b1f-46f3-bdfa-a0682d03b570 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.381 221324 DEBUG nova.virt.libvirt.guest [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:50:02 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-756cf314-0b1f-46f3-bdfa-a0682d03b570">
Jan 31 03:50:02 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:50:02 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:50:02 np0005603622 nova_compute[221301]:  <serial>756cf314-0b1f-46f3-bdfa-a0682d03b570</serial>
Jan 31 03:50:02 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:50:02 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:50:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e361 e361: 3 total, 3 up, 3 in
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.918 221324 DEBUG nova.virt.libvirt.driver [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.918 221324 DEBUG nova.virt.libvirt.driver [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.918 221324 DEBUG nova.virt.libvirt.driver [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:50:02 np0005603622 nova_compute[221301]: 2026-01-31 08:50:02.919 221324 DEBUG nova.virt.libvirt.driver [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No VIF found with MAC fa:16:3e:35:e0:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:50:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:03.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:03 np0005603622 nova_compute[221301]: 2026-01-31 08:50:03.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:03.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:04 np0005603622 nova_compute[221301]: 2026-01-31 08:50:04.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:04 np0005603622 nova_compute[221301]: 2026-01-31 08:50:04.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:50:04 np0005603622 nova_compute[221301]: 2026-01-31 08:50:04.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:50:04 np0005603622 nova_compute[221301]: 2026-01-31 08:50:04.604 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:04 np0005603622 nova_compute[221301]: 2026-01-31 08:50:04.755 221324 DEBUG oslo_concurrency.lockutils [None req-fa84f796-2c1b-4cf2-99f4-60ba17b01629 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e361 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:05 np0005603622 podman[287821]: 2026-01-31 08:50:05.130065344 +0000 UTC m=+0.050847252 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:50:05 np0005603622 podman[287820]: 2026-01-31 08:50:05.156095371 +0000 UTC m=+0.078172154 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 31 03:50:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:05.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:05.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:05 np0005603622 nova_compute[221301]: 2026-01-31 08:50:05.945 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:05 np0005603622 nova_compute[221301]: 2026-01-31 08:50:05.945 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:05 np0005603622 nova_compute[221301]: 2026-01-31 08:50:05.945 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:50:05 np0005603622 nova_compute[221301]: 2026-01-31 08:50:05.945 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:06 np0005603622 nova_compute[221301]: 2026-01-31 08:50:06.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:06 np0005603622 nova_compute[221301]: 2026-01-31 08:50:06.456 221324 INFO nova.compute.manager [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Rescuing#033[00m
Jan 31 03:50:06 np0005603622 nova_compute[221301]: 2026-01-31 08:50:06.457 221324 DEBUG oslo_concurrency.lockutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:06 np0005603622 nova_compute[221301]: 2026-01-31 08:50:06.457 221324 DEBUG oslo_concurrency.lockutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:06 np0005603622 nova_compute[221301]: 2026-01-31 08:50:06.457 221324 DEBUG nova.network.neutron [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:50:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:07.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:07.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:09 np0005603622 nova_compute[221301]: 2026-01-31 08:50:09.098 221324 DEBUG nova.network.neutron [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:09 np0005603622 nova_compute[221301]: 2026-01-31 08:50:09.153 221324 DEBUG oslo_concurrency.lockutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:09.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:09 np0005603622 nova_compute[221301]: 2026-01-31 08:50:09.607 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:09.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e361 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.059 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating instance_info_cache with network_info: [{"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.223 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-6fd9c6e1-ddff-436d-b240-240ec06b2a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.223 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.225 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.229 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.230 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.231 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.235 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.301 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.302 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.303 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.303 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.304 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:50:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4080329815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.765 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.981 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.981 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.986 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.987 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.987 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.992 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.992 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.998 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.998 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:10 np0005603622 nova_compute[221301]: 2026-01-31 08:50:10.999 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.173 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.174 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3583MB free_disk=20.536231994628906GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.174 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.174 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:11.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.451 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:11.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.863 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.863 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.863 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4af4043c-8199-4d0f-acf9-38d029560167 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.863 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 84307a3a-98ff-4145-a789-fda9cf8b1905 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.864 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:50:11 np0005603622 nova_compute[221301]: 2026-01-31 08:50:11.864 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=20GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:50:12 np0005603622 nova_compute[221301]: 2026-01-31 08:50:12.137 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:50:12 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2400029863' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:50:12 np0005603622 nova_compute[221301]: 2026-01-31 08:50:12.605 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:12 np0005603622 nova_compute[221301]: 2026-01-31 08:50:12.611 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:50:12 np0005603622 nova_compute[221301]: 2026-01-31 08:50:12.667 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:50:12 np0005603622 nova_compute[221301]: 2026-01-31 08:50:12.772 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:50:12 np0005603622 nova_compute[221301]: 2026-01-31 08:50:12.773 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:13.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:13.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:13 np0005603622 kernel: tap26ecf32a-62 (unregistering): left promiscuous mode
Jan 31 03:50:13 np0005603622 NetworkManager[49080]: <info>  [1769849413.7878] device (tap26ecf32a-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:50:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:13Z|00772|binding|INFO|Releasing lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 from this chassis (sb_readonly=0)
Jan 31 03:50:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:13Z|00773|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 down in Southbound
Jan 31 03:50:13 np0005603622 nova_compute[221301]: 2026-01-31 08:50:13.794 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:13 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:13Z|00774|binding|INFO|Removing iface tap26ecf32a-62 ovn-installed in OVS
Jan 31 03:50:13 np0005603622 nova_compute[221301]: 2026-01-31 08:50:13.806 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.839 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e0:35 10.100.0.7'], port_security=['fa:16:3e:35:e0:35 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84307a3a-98ff-4145-a789-fda9cf8b1905', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '4', 'neutron:security_group_ids': '96c36d72-b9f7-4399-af44-5588b758d3ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.179'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=26ecf32a-6227-4ef9-8e92-55f3262e4031) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.842 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 26ecf32a-6227-4ef9-8e92-55f3262e4031 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.846 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:50:13 np0005603622 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Jan 31 03:50:13 np0005603622 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a6.scope: Consumed 16.181s CPU time.
Jan 31 03:50:13 np0005603622 systemd-machined[190406]: Machine qemu-80-instance-000000a6 terminated.
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.858 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3c90e4-0a4c-4ce2-9db3-0ff72dae9df4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.878 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c34efe-f010-436e-81be-8fd781012756]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.881 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0590ef1f-4ad6-45ca-b27c-a972cd684351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.901 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7b5108-91d9-40fd-9f3c-79a55121bc1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.913 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e17d0341-a5f6-45d1-88f8-fe46e4def35b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 33792, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287925, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.922 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8a306781-100b-47d3-a255-e9187edc4827]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858149, 'tstamp': 858149}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287926, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858153, 'tstamp': 858153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287926, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.923 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:13 np0005603622 nova_compute[221301]: 2026-01-31 08:50:13.924 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:13 np0005603622 nova_compute[221301]: 2026-01-31 08:50:13.927 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.927 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.928 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.928 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:13.928 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.261 221324 INFO nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance shutdown successfully after 4 seconds.#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.266 221324 INFO nova.virt.libvirt.driver [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance destroyed successfully.#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.267 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'numa_topology' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.305 221324 INFO nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Attempting rescue#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.306 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.309 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.310 221324 INFO nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Creating image(s)#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.335 221324 DEBUG nova.storage.rbd_utils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.341 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'trusted_certs' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.394 221324 DEBUG nova.storage.rbd_utils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.419 221324 DEBUG nova.storage.rbd_utils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.422 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.492 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.493 221324 DEBUG oslo_concurrency.lockutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.493 221324 DEBUG oslo_concurrency.lockutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.494 221324 DEBUG oslo_concurrency.lockutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.520 221324 DEBUG nova.storage.rbd_utils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.526 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.554 221324 DEBUG nova.compute.manager [req-cefd016d-2bb1-44d4-a21e-c534ad5ba5b2 req-be119470-5e63-48eb-841b-00df87736f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.555 221324 DEBUG oslo_concurrency.lockutils [req-cefd016d-2bb1-44d4-a21e-c534ad5ba5b2 req-be119470-5e63-48eb-841b-00df87736f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.555 221324 DEBUG oslo_concurrency.lockutils [req-cefd016d-2bb1-44d4-a21e-c534ad5ba5b2 req-be119470-5e63-48eb-841b-00df87736f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.556 221324 DEBUG oslo_concurrency.lockutils [req-cefd016d-2bb1-44d4-a21e-c534ad5ba5b2 req-be119470-5e63-48eb-841b-00df87736f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.556 221324 DEBUG nova.compute.manager [req-cefd016d-2bb1-44d4-a21e-c534ad5ba5b2 req-be119470-5e63-48eb-841b-00df87736f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.556 221324 WARNING nova.compute.manager [req-cefd016d-2bb1-44d4-a21e-c534ad5ba5b2 req-be119470-5e63-48eb-841b-00df87736f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:50:14 np0005603622 nova_compute[221301]: 2026-01-31 08:50:14.608 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:50:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/649695807' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:50:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e361 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:50:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/649695807' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:50:15 np0005603622 nova_compute[221301]: 2026-01-31 08:50:15.201 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.675s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:15 np0005603622 nova_compute[221301]: 2026-01-31 08:50:15.202 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'migration_context' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:15.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:15.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.197 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.198 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Start _get_guest_xml network_info=[{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:35:e0:35"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.199 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'resources' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.378 221324 WARNING nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.383 221324 DEBUG nova.virt.libvirt.host [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.384 221324 DEBUG nova.virt.libvirt.host [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.388 221324 DEBUG nova.virt.libvirt.host [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.388 221324 DEBUG nova.virt.libvirt.host [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.389 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.389 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.390 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.390 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.390 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.390 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.391 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.391 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.391 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.391 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.392 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.392 221324 DEBUG nova.virt.hardware [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.392 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'vcpu_model' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.695 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e362 e362: 3 total, 3 up, 3 in
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.908 221324 DEBUG nova.compute.manager [req-1ecd3f9a-d50c-422f-8f5a-2a2c68e7476d req-282d208b-eafc-46d7-908e-47de10c82528 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.909 221324 DEBUG oslo_concurrency.lockutils [req-1ecd3f9a-d50c-422f-8f5a-2a2c68e7476d req-282d208b-eafc-46d7-908e-47de10c82528 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.910 221324 DEBUG oslo_concurrency.lockutils [req-1ecd3f9a-d50c-422f-8f5a-2a2c68e7476d req-282d208b-eafc-46d7-908e-47de10c82528 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.910 221324 DEBUG oslo_concurrency.lockutils [req-1ecd3f9a-d50c-422f-8f5a-2a2c68e7476d req-282d208b-eafc-46d7-908e-47de10c82528 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.910 221324 DEBUG nova.compute.manager [req-1ecd3f9a-d50c-422f-8f5a-2a2c68e7476d req-282d208b-eafc-46d7-908e-47de10c82528 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:16 np0005603622 nova_compute[221301]: 2026-01-31 08:50:16.911 221324 WARNING nova.compute.manager [req-1ecd3f9a-d50c-422f-8f5a-2a2c68e7476d req-282d208b-eafc-46d7-908e-47de10c82528 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state active and task_state rescuing.#033[00m
Jan 31 03:50:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:50:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1885345812' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:50:17 np0005603622 nova_compute[221301]: 2026-01-31 08:50:17.245 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:17 np0005603622 nova_compute[221301]: 2026-01-31 08:50:17.246 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:17.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:50:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3541876897' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:50:17 np0005603622 nova_compute[221301]: 2026-01-31 08:50:17.688 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:17 np0005603622 nova_compute[221301]: 2026-01-31 08:50:17.690 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:17.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:17 np0005603622 nova_compute[221301]: 2026-01-31 08:50:17.939 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:50:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:50:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1329688378' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.296 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.606s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.298 221324 DEBUG nova.virt.libvirt.vif [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:49:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-462635409',display_name='tempest-ServerRescueNegativeTestJSON-server-462635409',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-462635409',id=166,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPEp8NFnjdKYncckoDcEUrRrf7xo9LmTCD6V4O6WjeTKJAyV8su/7oD1+xGUT3vktBEdPWfQ8rSAH7lPCJKfsLVFbyZh4D52OnOOfJY3x1nxzkz8U0mjKMYyORWKFvG9ew==',key_name='tempest-keypair-1284882307',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:49:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-y7tniy9w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:49:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=84307a3a-98ff-4145-a789-fda9cf8b1905,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:35:e0:35"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.299 221324 DEBUG nova.network.os_vif_util [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "vif_mac": "fa:16:3e:35:e0:35"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.300 221324 DEBUG nova.network.os_vif_util [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.301 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'pci_devices' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.501 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <uuid>84307a3a-98ff-4145-a789-fda9cf8b1905</uuid>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <name>instance-000000a6</name>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-462635409</nova:name>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:50:16</nova:creationTime>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:user uuid="aa7f893021af4a84b03d85b476dadfe0">tempest-ServerRescueNegativeTestJSON-81297706-project-member</nova:user>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:project uuid="bf1c3d387dbe4191b4d05bdfca5959da">tempest-ServerRescueNegativeTestJSON-81297706</nova:project>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <nova:port uuid="26ecf32a-6227-4ef9-8e92-55f3262e4031">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <entry name="serial">84307a3a-98ff-4145-a789-fda9cf8b1905</entry>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <entry name="uuid">84307a3a-98ff-4145-a789-fda9cf8b1905</entry>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/84307a3a-98ff-4145-a789-fda9cf8b1905_disk.rescue">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/84307a3a-98ff-4145-a789-fda9cf8b1905_disk">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config.rescue">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:35:e0:35"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <target dev="tap26ecf32a-62"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/console.log" append="off"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:50:18 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:50:18 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:50:18 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:50:18 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.511 221324 INFO nova.virt.libvirt.driver [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance destroyed successfully.#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.731 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.732 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.732 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.733 221324 DEBUG nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] No VIF found with MAC fa:16:3e:35:e0:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.734 221324 INFO nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Using config drive#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.775 221324 DEBUG nova.storage.rbd_utils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.818 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'ec2_ids' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:18 np0005603622 nova_compute[221301]: 2026-01-31 08:50:18.938 221324 DEBUG nova.objects.instance [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'keypairs' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:50:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:19.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:19 np0005603622 nova_compute[221301]: 2026-01-31 08:50:19.611 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:19.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e362 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:20 np0005603622 nova_compute[221301]: 2026-01-31 08:50:20.996 221324 INFO nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Creating config drive at /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config.rescue#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.002 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3yga09b6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.128 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3yga09b6" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.179 221324 DEBUG nova.storage.rbd_utils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] rbd image 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.185 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config.rescue 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:21.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.389 221324 DEBUG oslo_concurrency.processutils [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config.rescue 84307a3a-98ff-4145-a789-fda9cf8b1905_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.391 221324 INFO nova.virt.libvirt.driver [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Deleting local config drive /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905/disk.config.rescue because it was imported into RBD.#033[00m
Jan 31 03:50:21 np0005603622 kernel: tap26ecf32a-62: entered promiscuous mode
Jan 31 03:50:21 np0005603622 NetworkManager[49080]: <info>  [1769849421.4403] manager: (tap26ecf32a-62): new Tun device (/org/freedesktop/NetworkManager/Devices/345)
Jan 31 03:50:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:21Z|00775|binding|INFO|Claiming lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 for this chassis.
Jan 31 03:50:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:21Z|00776|binding|INFO|26ecf32a-6227-4ef9-8e92-55f3262e4031: Claiming fa:16:3e:35:e0:35 10.100.0.7
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:21Z|00777|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 ovn-installed in OVS
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.448 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:21 np0005603622 systemd-machined[190406]: New machine qemu-81-instance-000000a6.
Jan 31 03:50:21 np0005603622 systemd[1]: Started Virtual Machine qemu-81-instance-000000a6.
Jan 31 03:50:21 np0005603622 systemd-udevd[288171]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:50:21 np0005603622 NetworkManager[49080]: <info>  [1769849421.4981] device (tap26ecf32a-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:50:21 np0005603622 NetworkManager[49080]: <info>  [1769849421.4985] device (tap26ecf32a-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:50:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:21Z|00778|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 up in Southbound
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.628 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e0:35 10.100.0.7'], port_security=['fa:16:3e:35:e0:35 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84307a3a-98ff-4145-a789-fda9cf8b1905', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '5', 'neutron:security_group_ids': '96c36d72-b9f7-4399-af44-5588b758d3ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.179'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=26ecf32a-6227-4ef9-8e92-55f3262e4031) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.630 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 26ecf32a-6227-4ef9-8e92-55f3262e4031 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc bound to our chassis#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.632 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.641 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15192db8-6b0d-4e57-b2ae-468c1d19c1d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.675 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb54ffa-68a2-415d-b6c6-9f2ba44eb32d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.680 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e37d80cb-8b61-4fe8-8977-ff2824ad99f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.726 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6b266831-d011-4013-8d13-c37247144d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.750 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6257bce6-359d-4aa6-9144-f1b56f786bd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 33792, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288185, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:21.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.767 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1251d9-9770-492a-ae28-3275699701f0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858149, 'tstamp': 858149}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288186, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858153, 'tstamp': 858153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288186, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.770 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:21 np0005603622 nova_compute[221301]: 2026-01-31 08:50:21.773 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.776 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.776 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.776 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:21.777 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:22 np0005603622 nova_compute[221301]: 2026-01-31 08:50:22.677 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 84307a3a-98ff-4145-a789-fda9cf8b1905 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:50:22 np0005603622 nova_compute[221301]: 2026-01-31 08:50:22.678 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849422.6762562, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:50:22 np0005603622 nova_compute[221301]: 2026-01-31 08:50:22.678 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:50:22 np0005603622 nova_compute[221301]: 2026-01-31 08:50:22.690 221324 DEBUG nova.compute.manager [None req-d6b88263-ca18-49e4-acc2-a913c25d0d8d aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:22 np0005603622 nova_compute[221301]: 2026-01-31 08:50:22.793 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:22 np0005603622 nova_compute[221301]: 2026-01-31 08:50:22.799 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:50:23 np0005603622 nova_compute[221301]: 2026-01-31 08:50:23.057 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 31 03:50:23 np0005603622 nova_compute[221301]: 2026-01-31 08:50:23.057 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849422.6828027, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:50:23 np0005603622 nova_compute[221301]: 2026-01-31 08:50:23.058 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Started (Lifecycle Event)#033[00m
Jan 31 03:50:23 np0005603622 nova_compute[221301]: 2026-01-31 08:50:23.229 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:23 np0005603622 nova_compute[221301]: 2026-01-31 08:50:23.233 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:50:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:23.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:23.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:24 np0005603622 nova_compute[221301]: 2026-01-31 08:50:24.614 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e362 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:25.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:25.392 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:25.394 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:50:25 np0005603622 nova_compute[221301]: 2026-01-31 08:50:25.394 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:25.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:26 np0005603622 nova_compute[221301]: 2026-01-31 08:50:26.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 e363: 3 total, 3 up, 3 in
Jan 31 03:50:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:27.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:50:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:27 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:50:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:27.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.017 221324 DEBUG nova.compute.manager [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.018 221324 DEBUG oslo_concurrency.lockutils [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.019 221324 DEBUG oslo_concurrency.lockutils [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.019 221324 DEBUG oslo_concurrency.lockutils [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.020 221324 DEBUG nova.compute.manager [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.020 221324 WARNING nova.compute.manager [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state rescued and task_state None.
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.021 221324 DEBUG nova.compute.manager [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.021 221324 DEBUG oslo_concurrency.lockutils [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.022 221324 DEBUG oslo_concurrency.lockutils [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.022 221324 DEBUG oslo_concurrency.lockutils [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.023 221324 DEBUG nova.compute.manager [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.023 221324 WARNING nova.compute.manager [req-434c36cf-e287-476d-b071-2c9c30a91e9c req-4cc62862-9e5f-44cb-b370-22ca5b8feede fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state rescued and task_state None.
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.917 221324 INFO nova.compute.manager [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Unrescuing
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.918 221324 DEBUG oslo_concurrency.lockutils [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.918 221324 DEBUG oslo_concurrency.lockutils [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquired lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:50:28 np0005603622 nova_compute[221301]: 2026-01-31 08:50:28.919 221324 DEBUG nova.network.neutron [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 31 03:50:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:29.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:29 np0005603622 nova_compute[221301]: 2026-01-31 08:50:29.617 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:29.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:30 np0005603622 nova_compute[221301]: 2026-01-31 08:50:30.424 221324 DEBUG nova.compute.manager [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:50:30 np0005603622 nova_compute[221301]: 2026-01-31 08:50:30.425 221324 DEBUG nova.compute.manager [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing instance network info cache due to event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:50:30 np0005603622 nova_compute[221301]: 2026-01-31 08:50:30.425 221324 DEBUG oslo_concurrency.lockutils [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:50:30 np0005603622 nova_compute[221301]: 2026-01-31 08:50:30.425 221324 DEBUG oslo_concurrency.lockutils [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:50:30 np0005603622 nova_compute[221301]: 2026-01-31 08:50:30.426 221324 DEBUG nova.network.neutron [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:50:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:30.907 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:50:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:30.908 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:50:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:30.910 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:50:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:31.401 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:50:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:31.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:31 np0005603622 nova_compute[221301]: 2026-01-31 08:50:31.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:31.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:33.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:50:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:33.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.069 221324 DEBUG nova.network.neutron [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.200 221324 DEBUG oslo_concurrency.lockutils [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Releasing lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.201 221324 DEBUG nova.objects.instance [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'flavor' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.618 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:34 np0005603622 kernel: tap26ecf32a-62 (unregistering): left promiscuous mode
Jan 31 03:50:34 np0005603622 NetworkManager[49080]: <info>  [1769849434.7366] device (tap26ecf32a-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:50:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:34Z|00779|binding|INFO|Releasing lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 from this chassis (sb_readonly=0)
Jan 31 03:50:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:34Z|00780|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 down in Southbound
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.747 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:34Z|00781|binding|INFO|Removing iface tap26ecf32a-62 ovn-installed in OVS
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.750 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.759 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:34 np0005603622 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Jan 31 03:50:34 np0005603622 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a6.scope: Consumed 12.125s CPU time.
Jan 31 03:50:34 np0005603622 systemd-machined[190406]: Machine qemu-81-instance-000000a6 terminated.
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.859 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e0:35 10.100.0.7'], port_security=['fa:16:3e:35:e0:35 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84307a3a-98ff-4145-a789-fda9cf8b1905', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '6', 'neutron:security_group_ids': '96c36d72-b9f7-4399-af44-5588b758d3ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.179', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=26ecf32a-6227-4ef9-8e92-55f3262e4031) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.860 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 26ecf32a-6227-4ef9-8e92-55f3262e4031 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.861 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.873 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b5a653-3ab1-43f1-86f2-bbbd6836ab60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.882 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.895 221324 INFO nova.virt.libvirt.driver [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance destroyed successfully.
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.896 221324 DEBUG nova.objects.instance [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'numa_topology' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.899 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3026fe-7eb6-4c84-b05a-eda6c7d6f93d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.902 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c81e191b-0472-499d-8291-8582ccf36cde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.923 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[aebd70f7-ec10-440a-b62f-c97b00c29794]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.936 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0e1288-a4b6-441b-97a5-d276bf81498d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 33792, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288450, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.946 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c21cd9ab-77a4-4718-a5b9-cf1b7972ded6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858149, 'tstamp': 858149}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288451, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858153, 'tstamp': 858153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288451, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.947 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.949 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:50:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:34 np0005603622 nova_compute[221301]: 2026-01-31 08:50:34.952 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.953 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.953 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.954 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:34.954 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:35 np0005603622 systemd-udevd[288430]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:50:35 np0005603622 NetworkManager[49080]: <info>  [1769849435.1207] manager: (tap26ecf32a-62): new Tun device (/org/freedesktop/NetworkManager/Devices/346)
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.122 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:35Z|00782|binding|INFO|Claiming lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 for this chassis.
Jan 31 03:50:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:35Z|00783|binding|INFO|26ecf32a-6227-4ef9-8e92-55f3262e4031: Claiming fa:16:3e:35:e0:35 10.100.0.7
Jan 31 03:50:35 np0005603622 NetworkManager[49080]: <info>  [1769849435.1296] device (tap26ecf32a-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:50:35 np0005603622 NetworkManager[49080]: <info>  [1769849435.1304] device (tap26ecf32a-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:50:35 np0005603622 kernel: tap26ecf32a-62: entered promiscuous mode
Jan 31 03:50:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:35Z|00784|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 ovn-installed in OVS
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.160 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.165 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:35 np0005603622 systemd-machined[190406]: New machine qemu-82-instance-000000a6.
Jan 31 03:50:35 np0005603622 systemd[1]: Started Virtual Machine qemu-82-instance-000000a6.
Jan 31 03:50:35 np0005603622 podman[288465]: 2026-01-31 08:50:35.239522122 +0000 UTC m=+0.053639247 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:50:35 np0005603622 podman[288464]: 2026-01-31 08:50:35.264536431 +0000 UTC m=+0.078878872 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.344 221324 DEBUG nova.network.neutron [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updated VIF entry in instance network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.345 221324 DEBUG nova.network.neutron [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:35Z|00785|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 up in Southbound
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.399 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e0:35 10.100.0.7'], port_security=['fa:16:3e:35:e0:35 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84307a3a-98ff-4145-a789-fda9cf8b1905', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '6', 'neutron:security_group_ids': '96c36d72-b9f7-4399-af44-5588b758d3ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.179', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=26ecf32a-6227-4ef9-8e92-55f3262e4031) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.400 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 26ecf32a-6227-4ef9-8e92-55f3262e4031 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc bound to our chassis#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.401 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:50:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:35.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.415 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aca46604-cc15-47e6-b5d9-e65c77204514]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.443 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a10b5170-2cdd-4623-8cb4-002f1f6d3ab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.446 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[778a1812-78f0-4356-b34c-fbf60c0ec15b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.467 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbb41d2-cfba-41e6-95aa-e08ea5c117ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.482 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe40b82-7bdd-4ebe-a95d-7e5931ecf119]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 33792, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288532, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.494 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d5ecfa-e24e-4671-b028-82f6e760487c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858149, 'tstamp': 858149}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288541, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858153, 'tstamp': 858153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288541, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.495 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.497 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.498 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.499 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:35.499 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.563 221324 DEBUG oslo_concurrency.lockutils [req-12edc33b-f1a4-42dc-89e2-9b809ada4bdd req-29259b86-1ec9-443c-9380-573e9930392b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.665 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 84307a3a-98ff-4145-a789-fda9cf8b1905 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.666 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849435.664912, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.666 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:50:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:35.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.887 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:35 np0005603622 nova_compute[221301]: 2026-01-31 08:50:35.892 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:50:36 np0005603622 nova_compute[221301]: 2026-01-31 08:50:36.099 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:50:36 np0005603622 nova_compute[221301]: 2026-01-31 08:50:36.100 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849435.668303, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:50:36 np0005603622 nova_compute[221301]: 2026-01-31 08:50:36.100 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Started (Lifecycle Event)#033[00m
Jan 31 03:50:36 np0005603622 nova_compute[221301]: 2026-01-31 08:50:36.144 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:36 np0005603622 nova_compute[221301]: 2026-01-31 08:50:36.147 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:50:36 np0005603622 nova_compute[221301]: 2026-01-31 08:50:36.199 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 31 03:50:36 np0005603622 nova_compute[221301]: 2026-01-31 08:50:36.465 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:50:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:37.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:50:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:37.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:37 np0005603622 nova_compute[221301]: 2026-01-31 08:50:37.856 221324 DEBUG nova.compute.manager [req-71dabaf5-9c58-4536-b061-b005f6740f2a req-fd07729f-e908-4e12-85cb-3d2f9aa92b8f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:37 np0005603622 nova_compute[221301]: 2026-01-31 08:50:37.856 221324 DEBUG oslo_concurrency.lockutils [req-71dabaf5-9c58-4536-b061-b005f6740f2a req-fd07729f-e908-4e12-85cb-3d2f9aa92b8f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:37 np0005603622 nova_compute[221301]: 2026-01-31 08:50:37.856 221324 DEBUG oslo_concurrency.lockutils [req-71dabaf5-9c58-4536-b061-b005f6740f2a req-fd07729f-e908-4e12-85cb-3d2f9aa92b8f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:37 np0005603622 nova_compute[221301]: 2026-01-31 08:50:37.856 221324 DEBUG oslo_concurrency.lockutils [req-71dabaf5-9c58-4536-b061-b005f6740f2a req-fd07729f-e908-4e12-85cb-3d2f9aa92b8f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:37 np0005603622 nova_compute[221301]: 2026-01-31 08:50:37.857 221324 DEBUG nova.compute.manager [req-71dabaf5-9c58-4536-b061-b005f6740f2a req-fd07729f-e908-4e12-85cb-3d2f9aa92b8f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:37 np0005603622 nova_compute[221301]: 2026-01-31 08:50:37.857 221324 WARNING nova.compute.manager [req-71dabaf5-9c58-4536-b061-b005f6740f2a req-fd07729f-e908-4e12-85cb-3d2f9aa92b8f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 31 03:50:38 np0005603622 nova_compute[221301]: 2026-01-31 08:50:38.384 221324 DEBUG nova.compute.manager [None req-ee1b9d4d-f00e-47ed-86c5-8863fa89f749 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:50:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:39.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:39 np0005603622 nova_compute[221301]: 2026-01-31 08:50:39.619 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 03:50:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:39.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 03:50:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.057 221324 DEBUG nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.057 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.058 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.058 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.058 221324 DEBUG nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.059 221324 WARNING nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.059 221324 DEBUG nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.059 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.059 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.060 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.060 221324 DEBUG nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.060 221324 WARNING nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.060 221324 DEBUG nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.061 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.061 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.061 221324 DEBUG oslo_concurrency.lockutils [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.061 221324 DEBUG nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:40 np0005603622 nova_compute[221301]: 2026-01-31 08:50:40.062 221324 WARNING nova.compute.manager [req-efb019f3-89b7-4ee8-93fc-5730287bafbc req-4345a76e-9271-488a-b8f2-4be14e0228e2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:50:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:41.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:41 np0005603622 nova_compute[221301]: 2026-01-31 08:50:41.439 221324 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:41 np0005603622 nova_compute[221301]: 2026-01-31 08:50:41.440 221324 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:41 np0005603622 nova_compute[221301]: 2026-01-31 08:50:41.440 221324 DEBUG nova.network.neutron [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:50:41 np0005603622 nova_compute[221301]: 2026-01-31 08:50:41.471 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:41.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:43.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:43.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:44 np0005603622 nova_compute[221301]: 2026-01-31 08:50:44.622 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:44 np0005603622 nova_compute[221301]: 2026-01-31 08:50:44.683 221324 DEBUG nova.compute.manager [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:44 np0005603622 nova_compute[221301]: 2026-01-31 08:50:44.684 221324 DEBUG nova.compute.manager [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing instance network info cache due to event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:50:44 np0005603622 nova_compute[221301]: 2026-01-31 08:50:44.684 221324 DEBUG oslo_concurrency.lockutils [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:44 np0005603622 nova_compute[221301]: 2026-01-31 08:50:44.684 221324 DEBUG oslo_concurrency.lockutils [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:44 np0005603622 nova_compute[221301]: 2026-01-31 08:50:44.684 221324 DEBUG nova.network.neutron [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:50:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:45.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:45.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:45 np0005603622 nova_compute[221301]: 2026-01-31 08:50:45.900 221324 DEBUG nova.network.neutron [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:46 np0005603622 nova_compute[221301]: 2026-01-31 08:50:46.040 221324 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:46 np0005603622 nova_compute[221301]: 2026-01-31 08:50:46.476 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:46 np0005603622 nova_compute[221301]: 2026-01-31 08:50:46.701 221324 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:50:46 np0005603622 nova_compute[221301]: 2026-01-31 08:50:46.701 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Creating file /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/c92e3d4512f44ada8c3c92b40218c304.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:50:46 np0005603622 nova_compute[221301]: 2026-01-31 08:50:46.701 221324 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/c92e3d4512f44ada8c3c92b40218c304.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:47 np0005603622 nova_compute[221301]: 2026-01-31 08:50:47.057 221324 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/c92e3d4512f44ada8c3c92b40218c304.tmp" returned: 1 in 0.356s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:47 np0005603622 nova_compute[221301]: 2026-01-31 08:50:47.058 221324 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167/c92e3d4512f44ada8c3c92b40218c304.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:50:47 np0005603622 nova_compute[221301]: 2026-01-31 08:50:47.058 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Creating directory /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:50:47 np0005603622 nova_compute[221301]: 2026-01-31 08:50:47.058 221324 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:50:47 np0005603622 nova_compute[221301]: 2026-01-31 08:50:47.246 221324 DEBUG oslo_concurrency.processutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/4af4043c-8199-4d0f-acf9-38d029560167" returned: 0 in 0.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:50:47 np0005603622 nova_compute[221301]: 2026-01-31 08:50:47.251 221324 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:50:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:47.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:47.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:49.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:49 np0005603622 nova_compute[221301]: 2026-01-31 08:50:49.625 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:49.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:50Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:e0:35 10.100.0.7
Jan 31 03:50:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.063 221324 DEBUG nova.network.neutron [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updated VIF entry in instance network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.065 221324 DEBUG nova.network.neutron [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.084 221324 DEBUG oslo_concurrency.lockutils [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.084 221324 DEBUG nova.compute.manager [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.085 221324 DEBUG nova.compute.manager [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing instance network info cache due to event network-changed-26ecf32a-6227-4ef9-8e92-55f3262e4031. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.086 221324 DEBUG oslo_concurrency.lockutils [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.086 221324 DEBUG oslo_concurrency.lockutils [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.086 221324 DEBUG nova.network.neutron [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Refreshing network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.278 221324 INFO nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance shutdown successfully after 4 seconds.#033[00m
Jan 31 03:50:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:51.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.480 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:51.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:51 np0005603622 kernel: tap3aae5c0f-f2 (unregistering): left promiscuous mode
Jan 31 03:50:51 np0005603622 NetworkManager[49080]: <info>  [1769849451.9353] device (tap3aae5c0f-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.942 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:51Z|00786|binding|INFO|Releasing lport 3aae5c0f-f2ed-4352-a4e2-017466399641 from this chassis (sb_readonly=0)
Jan 31 03:50:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:51Z|00787|binding|INFO|Setting lport 3aae5c0f-f2ed-4352-a4e2-017466399641 down in Southbound
Jan 31 03:50:51 np0005603622 ovn_controller[130109]: 2026-01-31T08:50:51Z|00788|binding|INFO|Removing iface tap3aae5c0f-f2 ovn-installed in OVS
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.944 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:51 np0005603622 nova_compute[221301]: 2026-01-31 08:50:51.950 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:52 np0005603622 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 31 03:50:52 np0005603622 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a5.scope: Consumed 16.749s CPU time.
Jan 31 03:50:52 np0005603622 systemd-machined[190406]: Machine qemu-79-instance-000000a5 terminated.
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.088 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:73:3d 10.100.0.9'], port_security=['fa:16:3e:69:73:3d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4af4043c-8199-4d0f-acf9-38d029560167', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3aae5c0f-f2ed-4352-a4e2-017466399641) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.090 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3aae5c0f-f2ed-4352-a4e2-017466399641 in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 unbound from our chassis#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.092 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.107 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[692d2600-f3aa-4d8c-bedf-56d41fc029ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.113 221324 INFO nova.virt.libvirt.driver [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Instance destroyed successfully.#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.115 221324 DEBUG nova.virt.libvirt.vif [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=165,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:49:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-bd0dktab',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',i
mage_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:50:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=4af4043c-8199-4d0f-acf9-38d029560167,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:69:73:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.115 221324 DEBUG nova.network.os_vif_util [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "vif_mac": "fa:16:3e:69:73:3d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.116 221324 DEBUG nova.network.os_vif_util [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.117 221324 DEBUG os_vif [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.119 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.119 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aae5c0f-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.121 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.123 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.126 221324 INFO os_vif [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2')#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.134 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a20f34-47f9-4020-9d1b-9cfd43b7ce8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.138 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a66c844e-98d2-4efc-b16f-ce2b5addaae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.168 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[dda4cec8-504c-4393-b242-b67f29080817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.182 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3afa8961-3640-4dee-8792-755a71ae219c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3afaf607-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:84:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849367, 'reachable_time': 24734, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288630, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.195 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e98fcf0e-5a5a-4e68-9dc1-fb613ae1e2e3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 849375, 'tstamp': 849375}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288631, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3afaf607-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 849377, 'tstamp': 849377}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288631, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.197 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.199 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.200 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.201 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3afaf607-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.201 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.202 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3afaf607-40, col_values=(('external_ids', {'iface-id': '0ed76a0a-650c-4ec7-a4d4-0e745236b047'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:50:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:50:52.202 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.411 221324 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.411 221324 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:52 np0005603622 nova_compute[221301]: 2026-01-31 08:50:52.412 221324 DEBUG nova.virt.libvirt.driver [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:50:53 np0005603622 nova_compute[221301]: 2026-01-31 08:50:53.178 221324 DEBUG nova.network.neutron [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updated VIF entry in instance network info cache for port 26ecf32a-6227-4ef9-8e92-55f3262e4031. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:50:53 np0005603622 nova_compute[221301]: 2026-01-31 08:50:53.179 221324 DEBUG nova.network.neutron [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [{"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:50:53 np0005603622 nova_compute[221301]: 2026-01-31 08:50:53.222 221324 DEBUG oslo_concurrency.lockutils [req-5fae76de-a852-40bb-b453-bcce969748b4 req-05ab52a0-39cc-41e0-b8a9-78c6f5b1d21d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-84307a3a-98ff-4145-a789-fda9cf8b1905" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:50:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:53.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:53 np0005603622 nova_compute[221301]: 2026-01-31 08:50:53.696 221324 DEBUG neutronclient.v2_0.client [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3aae5c0f-f2ed-4352-a4e2-017466399641 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:50:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:53.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.039 221324 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.039 221324 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.040 221324 DEBUG oslo_concurrency.lockutils [None req-2d500a13-f418-4928-a6b6-06e9e8212c5a a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.821 221324 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.821 221324 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.822 221324 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.822 221324 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.822 221324 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.823 221324 WARNING nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-unplugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.823 221324 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.823 221324 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.824 221324 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.824 221324 DEBUG oslo_concurrency.lockutils [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.824 221324 DEBUG nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:50:54 np0005603622 nova_compute[221301]: 2026-01-31 08:50:54.824 221324 WARNING nova.compute.manager [req-63278f6e-98c9-4b1e-8482-7586f8915011 req-8bcb3dfa-a432-4bb5-893b-53733a8555a3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:50:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:50:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:55.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:50:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:50:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:55.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:57 np0005603622 nova_compute[221301]: 2026-01-31 08:50:57.124 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:50:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:57.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:50:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:57.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:58 np0005603622 nova_compute[221301]: 2026-01-31 08:50:58.382 221324 DEBUG nova.compute.manager [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:50:58 np0005603622 nova_compute[221301]: 2026-01-31 08:50:58.383 221324 DEBUG nova.compute.manager [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing instance network info cache due to event network-changed-3aae5c0f-f2ed-4352-a4e2-017466399641. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:50:58 np0005603622 nova_compute[221301]: 2026-01-31 08:50:58.383 221324 DEBUG oslo_concurrency.lockutils [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:50:58 np0005603622 nova_compute[221301]: 2026-01-31 08:50:58.383 221324 DEBUG oslo_concurrency.lockutils [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:50:58 np0005603622 nova_compute[221301]: 2026-01-31 08:50:58.383 221324 DEBUG nova.network.neutron [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Refreshing network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:50:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:50:59.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:50:59 np0005603622 nova_compute[221301]: 2026-01-31 08:50:59.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:50:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:50:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:50:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:50:59.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.660 221324 DEBUG oslo_concurrency.lockutils [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.661 221324 DEBUG oslo_concurrency.lockutils [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.680 221324 INFO nova.compute.manager [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Detaching volume 756cf314-0b1f-46f3-bdfa-a0682d03b570#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.875 221324 INFO nova.virt.block_device [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Attempting to driver detach volume 756cf314-0b1f-46f3-bdfa-a0682d03b570 from mountpoint /dev/vdb#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.886 221324 DEBUG nova.virt.libvirt.driver [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Attempting to detach device vdb from instance 84307a3a-98ff-4145-a789-fda9cf8b1905 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.887 221324 DEBUG nova.virt.libvirt.guest [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-756cf314-0b1f-46f3-bdfa-a0682d03b570">
Jan 31 03:51:00 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <serial>756cf314-0b1f-46f3-bdfa-a0682d03b570</serial>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:51:00 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.899 221324 INFO nova.virt.libvirt.driver [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully detached device vdb from instance 84307a3a-98ff-4145-a789-fda9cf8b1905 from the persistent domain config.#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.900 221324 DEBUG nova.virt.libvirt.driver [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 84307a3a-98ff-4145-a789-fda9cf8b1905 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.900 221324 DEBUG nova.virt.libvirt.guest [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-756cf314-0b1f-46f3-bdfa-a0682d03b570">
Jan 31 03:51:00 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <serial>756cf314-0b1f-46f3-bdfa-a0682d03b570</serial>
Jan 31 03:51:00 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:51:00 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:51:00 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.950 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769849460.9503179, 84307a3a-98ff-4145-a789-fda9cf8b1905 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.951 221324 DEBUG nova.virt.libvirt.driver [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 84307a3a-98ff-4145-a789-fda9cf8b1905 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.954 221324 INFO nova.virt.libvirt.driver [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully detached device vdb from instance 84307a3a-98ff-4145-a789-fda9cf8b1905 from the live domain config.#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.985 221324 DEBUG nova.network.neutron [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updated VIF entry in instance network info cache for port 3aae5c0f-f2ed-4352-a4e2-017466399641. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:51:00 np0005603622 nova_compute[221301]: 2026-01-31 08:51:00.986 221324 DEBUG nova.network.neutron [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:01 np0005603622 nova_compute[221301]: 2026-01-31 08:51:01.229 221324 DEBUG oslo_concurrency.lockutils [req-47af7f31-6dc6-492a-9134-85677516bf5c req-a1f34a79-5697-4b7d-86db-6f75a9052217 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:01 np0005603622 nova_compute[221301]: 2026-01-31 08:51:01.265 221324 DEBUG nova.objects.instance [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'flavor' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:01 np0005603622 nova_compute[221301]: 2026-01-31 08:51:01.352 221324 DEBUG oslo_concurrency.lockutils [None req-eb8b9a07-2696-4bf9-9479-877ab754e317 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:01.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:01.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e364 e364: 3 total, 3 up, 3 in
Jan 31 03:51:02 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:02Z|00789|binding|INFO|Releasing lport 5976b74a-78ce-46e1-bd2c-76a2a502c8f5 from this chassis (sb_readonly=0)
Jan 31 03:51:02 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:02Z|00790|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:51:02 np0005603622 nova_compute[221301]: 2026-01-31 08:51:02.075 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:02 np0005603622 nova_compute[221301]: 2026-01-31 08:51:02.126 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:02 np0005603622 nova_compute[221301]: 2026-01-31 08:51:02.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:03.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.788 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.789 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.789 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.789 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.789 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.791 221324 INFO nova.compute.manager [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Terminating instance#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.792 221324 DEBUG nova.compute.manager [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:51:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:03.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:03 np0005603622 kernel: tap26ecf32a-62 (unregistering): left promiscuous mode
Jan 31 03:51:03 np0005603622 NetworkManager[49080]: <info>  [1769849463.8676] device (tap26ecf32a-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.873 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:03Z|00791|binding|INFO|Releasing lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 from this chassis (sb_readonly=0)
Jan 31 03:51:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:03Z|00792|binding|INFO|Setting lport 26ecf32a-6227-4ef9-8e92-55f3262e4031 down in Southbound
Jan 31 03:51:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:03Z|00793|binding|INFO|Removing iface tap26ecf32a-62 ovn-installed in OVS
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.876 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.881 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.896 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e0:35 10.100.0.7'], port_security=['fa:16:3e:35:e0:35 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84307a3a-98ff-4145-a789-fda9cf8b1905', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '8', 'neutron:security_group_ids': '96c36d72-b9f7-4399-af44-5588b758d3ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.179', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=26ecf32a-6227-4ef9-8e92-55f3262e4031) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.897 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 26ecf32a-6227-4ef9-8e92-55f3262e4031 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.899 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2128154c-0218-4f66-9509-e0db66eba3fc#033[00m
Jan 31 03:51:03 np0005603622 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Jan 31 03:51:03 np0005603622 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a6.scope: Consumed 13.594s CPU time.
Jan 31 03:51:03 np0005603622 systemd-machined[190406]: Machine qemu-82-instance-000000a6 terminated.
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.913 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a22d3939-7eef-4ed0-b2ad-a717737cf01d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.936 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[236832d3-3243-4622-b748-0997eec068a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.939 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[8d98addc-5a3b-477b-830b-065e4d31e344]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.955 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[89c45ed0-f526-41eb-8c52-7c947f73e0a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.970 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b52f0bb1-f7e7-4f17-af80-e94650167245]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2128154c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:32:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858136, 'reachable_time': 33792, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288646, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.984 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0133f516-83b8-4d24-adef-3b18cf1cca4e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858149, 'tstamp': 858149}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288647, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2128154c-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858153, 'tstamp': 858153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288647, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.986 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.987 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603622 nova_compute[221301]: 2026-01-31 08:51:03.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.991 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2128154c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.991 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.992 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2128154c-00, col_values=(('external_ids', {'iface-id': '5976b74a-78ce-46e1-bd2c-76a2a502c8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:03.992 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.025 221324 INFO nova.virt.libvirt.driver [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Instance destroyed successfully.#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.025 221324 DEBUG nova.objects.instance [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'resources' on Instance uuid 84307a3a-98ff-4145-a789-fda9cf8b1905 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.183 221324 DEBUG nova.virt.libvirt.vif [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:49:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-462635409',display_name='tempest-ServerRescueNegativeTestJSON-server-462635409',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-462635409',id=166,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPEp8NFnjdKYncckoDcEUrRrf7xo9LmTCD6V4O6WjeTKJAyV8su/7oD1+xGUT3vktBEdPWfQ8rSAH7lPCJKfsLVFbyZh4D52OnOOfJY3x1nxzkz8U0mjKMYyORWKFvG9ew==',key_name='tempest-keypair-1284882307',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:50:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-y7tniy9w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:50:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=84307a3a-98ff-4145-a789-fda9cf8b1905,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.183 221324 DEBUG nova.network.os_vif_util [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "address": "fa:16:3e:35:e0:35", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26ecf32a-62", "ovs_interfaceid": "26ecf32a-6227-4ef9-8e92-55f3262e4031", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.184 221324 DEBUG nova.network.os_vif_util [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.184 221324 DEBUG os_vif [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.186 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26ecf32a-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.187 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.189 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.191 221324 INFO os_vif [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:e0:35,bridge_name='br-int',has_traffic_filtering=True,id=26ecf32a-6227-4ef9-8e92-55f3262e4031,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26ecf32a-62')#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.645 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.763 221324 DEBUG nova.compute.manager [req-b71292e6-2bee-4ec5-a8d4-459ab8b53b76 req-8c4a9a99-9101-4b9e-8b41-2466e2cb42dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.763 221324 DEBUG oslo_concurrency.lockutils [req-b71292e6-2bee-4ec5-a8d4-459ab8b53b76 req-8c4a9a99-9101-4b9e-8b41-2466e2cb42dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.763 221324 DEBUG oslo_concurrency.lockutils [req-b71292e6-2bee-4ec5-a8d4-459ab8b53b76 req-8c4a9a99-9101-4b9e-8b41-2466e2cb42dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.764 221324 DEBUG oslo_concurrency.lockutils [req-b71292e6-2bee-4ec5-a8d4-459ab8b53b76 req-8c4a9a99-9101-4b9e-8b41-2466e2cb42dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.764 221324 DEBUG nova.compute.manager [req-b71292e6-2bee-4ec5-a8d4-459ab8b53b76 req-8c4a9a99-9101-4b9e-8b41-2466e2cb42dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.765 221324 DEBUG nova.compute.manager [req-b71292e6-2bee-4ec5-a8d4-459ab8b53b76 req-8c4a9a99-9101-4b9e-8b41-2466e2cb42dd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-unplugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.896 221324 DEBUG nova.compute.manager [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.897 221324 DEBUG oslo_concurrency.lockutils [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.897 221324 DEBUG oslo_concurrency.lockutils [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.898 221324 DEBUG oslo_concurrency.lockutils [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.898 221324 DEBUG nova.compute.manager [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:04 np0005603622 nova_compute[221301]: 2026-01-31 08:51:04.899 221324 WARNING nova.compute.manager [req-a102e8b4-8922-477b-b1c8-287a75107e6c req-21752168-8051-4065-afd7-3ea42ed47d3f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.323 221324 INFO nova.virt.libvirt.driver [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Deleting instance files /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905_del#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.324 221324 INFO nova.virt.libvirt.driver [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Deletion of /var/lib/nova/instances/84307a3a-98ff-4145-a789-fda9cf8b1905_del complete#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:51:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:05.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.454 221324 INFO nova.compute.manager [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Took 1.66 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.455 221324 DEBUG oslo.service.loopingcall [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.455 221324 DEBUG nova.compute.manager [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.455 221324 DEBUG nova.network.neutron [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:51:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.823 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.823 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:51:05 np0005603622 nova_compute[221301]: 2026-01-31 08:51:05.824 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:51:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:05.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:06 np0005603622 podman[288680]: 2026-01-31 08:51:06.116894108 +0000 UTC m=+0.047225525 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 31 03:51:06 np0005603622 podman[288679]: 2026-01-31 08:51:06.136066431 +0000 UTC m=+0.066633765 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.110 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849452.108826, 4af4043c-8199-4d0f-acf9-38d029560167 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.111 221324 INFO nova.compute.manager [-] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:51:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:07.255 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.256 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:07.257 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.272 221324 DEBUG nova.compute.manager [None req-f7f20f12-4512-4b91-bfa5-1d941da44aac - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.276 221324 DEBUG nova.compute.manager [None req-f7f20f12-4512-4b91-bfa5-1d941da44aac - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.319 221324 DEBUG nova.compute.manager [req-5f3cbfde-d862-4e67-8abc-9c7cff154e1c req-f0120665-49b5-4687-a016-4645bb104ddf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.319 221324 DEBUG oslo_concurrency.lockutils [req-5f3cbfde-d862-4e67-8abc-9c7cff154e1c req-f0120665-49b5-4687-a016-4645bb104ddf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.320 221324 DEBUG oslo_concurrency.lockutils [req-5f3cbfde-d862-4e67-8abc-9c7cff154e1c req-f0120665-49b5-4687-a016-4645bb104ddf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.320 221324 DEBUG oslo_concurrency.lockutils [req-5f3cbfde-d862-4e67-8abc-9c7cff154e1c req-f0120665-49b5-4687-a016-4645bb104ddf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.320 221324 DEBUG nova.compute.manager [req-5f3cbfde-d862-4e67-8abc-9c7cff154e1c req-f0120665-49b5-4687-a016-4645bb104ddf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] No waiting events found dispatching network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.320 221324 WARNING nova.compute.manager [req-5f3cbfde-d862-4e67-8abc-9c7cff154e1c req-f0120665-49b5-4687-a016-4645bb104ddf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received unexpected event network-vif-plugged-26ecf32a-6227-4ef9-8e92-55f3262e4031 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:51:07 np0005603622 nova_compute[221301]: 2026-01-31 08:51:07.355 221324 INFO nova.compute.manager [None req-f7f20f12-4512-4b91-bfa5-1d941da44aac - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 03:51:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:07.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:07.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.052 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.053 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.054 221324 DEBUG nova.compute.manager [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Going to confirm migration 23 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.062 221324 DEBUG nova.compute.manager [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.063 221324 DEBUG oslo_concurrency.lockutils [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4af4043c-8199-4d0f-acf9-38d029560167-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.064 221324 DEBUG oslo_concurrency.lockutils [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.065 221324 DEBUG oslo_concurrency.lockutils [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.066 221324 DEBUG nova.compute.manager [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] No waiting events found dispatching network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.066 221324 WARNING nova.compute.manager [req-32cd674b-a893-4d86-9fb1-16de2cdeedae req-a7e9dd9e-4a59-4a93-8d1f-4b8815424367 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Received unexpected event network-vif-plugged-3aae5c0f-f2ed-4352-a4e2-017466399641 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.096 221324 DEBUG nova.network.neutron [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.484 221324 INFO nova.compute.manager [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Took 3.03 seconds to deallocate network for instance.#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.584 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.584 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:08 np0005603622 nova_compute[221301]: 2026-01-31 08:51:08.767 221324 DEBUG oslo_concurrency.processutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.081 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updating instance_info_cache with network_info: [{"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.121 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.122 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.123 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.123 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.123 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.166 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.367 221324 DEBUG neutronclient.v2_0.client [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3aae5c0f-f2ed-4352-a4e2-017466399641 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.369 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.369 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquired lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.369 221324 DEBUG nova.network.neutron [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.370 221324 DEBUG nova.objects.instance [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'info_cache' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:09.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/147628953' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.480 221324 DEBUG oslo_concurrency.processutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.713s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.487 221324 DEBUG nova.compute.provider_tree [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.543 221324 DEBUG nova.scheduler.client.report [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.647 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.673 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.676 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.676 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.677 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.677 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:09 np0005603622 nova_compute[221301]: 2026-01-31 08:51:09.696 221324 DEBUG nova.compute.manager [req-b01e126e-47fa-4fa8-9f7d-d7859e769c31 req-6004e5e6-b4b9-44b8-a6fd-d46b04a87b45 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Received event network-vif-deleted-26ecf32a-6227-4ef9-8e92-55f3262e4031 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:09.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2052062956' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.112 221324 INFO nova.scheduler.client.report [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Deleted allocations for instance 84307a3a-98ff-4145-a789-fda9cf8b1905#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.115 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.244 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.244 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.248 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.249 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.249 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.252 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.253 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.281 221324 DEBUG oslo_concurrency.lockutils [None req-d1cd434a-4c87-49d2-9282-577af8bef7a2 aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "84307a3a-98ff-4145-a789-fda9cf8b1905" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.418 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.419 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3956MB free_disk=20.672863006591797GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.419 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.419 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.568 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration for instance 4af4043c-8199-4d0f-acf9-38d029560167 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.620 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating resource usage from migration cccb9a8e-31b9-485b-8565-37667620f588#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.620 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Starting to track outgoing migration cccb9a8e-31b9-485b-8565-37667620f588 with flavor a01eb4f0-fd80-416b-a750-75de320394d8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.670 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.670 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.670 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration cccb9a8e-31b9-485b-8565-37667620f588 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.670 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.671 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:51:10 np0005603622 nova_compute[221301]: 2026-01-31 08:51:10.806 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3474112198' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:11 np0005603622 nova_compute[221301]: 2026-01-31 08:51:11.272 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:11 np0005603622 nova_compute[221301]: 2026-01-31 08:51:11.277 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:51:11 np0005603622 nova_compute[221301]: 2026-01-31 08:51:11.301 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:51:11 np0005603622 nova_compute[221301]: 2026-01-31 08:51:11.376 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:51:11 np0005603622 nova_compute[221301]: 2026-01-31 08:51:11.377 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.957s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:11 np0005603622 nova_compute[221301]: 2026-01-31 08:51:11.378 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:11.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:11.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:13 np0005603622 nova_compute[221301]: 2026-01-31 08:51:13.000 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:13.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:13.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.055 221324 DEBUG nova.network.neutron [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Updating instance_info_cache with network_info: [{"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.178 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Releasing lock "refresh_cache-4af4043c-8199-4d0f-acf9-38d029560167" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.178 221324 DEBUG nova.objects.instance [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'migration_context' on Instance uuid 4af4043c-8199-4d0f-acf9-38d029560167 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.234 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.281 221324 DEBUG nova.storage.rbd_utils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] removing snapshot(nova-resize) on rbd image(4af4043c-8199-4d0f-acf9-38d029560167_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.650 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e365 e365: 3 total, 3 up, 3 in
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.736 221324 DEBUG nova.virt.libvirt.vif [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=165,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:51:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-bd0dktab',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:51:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=4af4043c-8199-4d0f-acf9-38d029560167,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.737 221324 DEBUG nova.network.os_vif_util [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "3aae5c0f-f2ed-4352-a4e2-017466399641", "address": "fa:16:3e:69:73:3d", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aae5c0f-f2", "ovs_interfaceid": "3aae5c0f-f2ed-4352-a4e2-017466399641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.738 221324 DEBUG nova.network.os_vif_util [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.738 221324 DEBUG os_vif [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.740 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.740 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aae5c0f-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.741 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.743 221324 INFO os_vif [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:73:3d,bridge_name='br-int',has_traffic_filtering=True,id=3aae5c0f-f2ed-4352-a4e2-017466399641,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aae5c0f-f2')#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.743 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.744 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:14 np0005603622 nova_compute[221301]: 2026-01-31 08:51:14.935 221324 DEBUG oslo_concurrency.processutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:15 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3208715395' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:15 np0005603622 nova_compute[221301]: 2026-01-31 08:51:15.383 221324 DEBUG oslo_concurrency.processutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:15 np0005603622 nova_compute[221301]: 2026-01-31 08:51:15.388 221324 DEBUG nova.compute.provider_tree [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:51:15 np0005603622 nova_compute[221301]: 2026-01-31 08:51:15.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:15.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e365 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:15 np0005603622 nova_compute[221301]: 2026-01-31 08:51:15.662 221324 DEBUG nova.scheduler.client.report [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:51:15 np0005603622 nova_compute[221301]: 2026-01-31 08:51:15.848 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:51:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:15.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:51:16 np0005603622 nova_compute[221301]: 2026-01-31 08:51:16.113 221324 INFO nova.scheduler.client.report [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Deleted allocation for migration cccb9a8e-31b9-485b-8565-37667620f588#033[00m
Jan 31 03:51:16 np0005603622 nova_compute[221301]: 2026-01-31 08:51:16.228 221324 DEBUG oslo_concurrency.lockutils [None req-d3789aec-8556-4101-85c5-b70ec201eaa6 a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "4af4043c-8199-4d0f-acf9-38d029560167" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 8.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:16.258 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:17.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:17 np0005603622 nova_compute[221301]: 2026-01-31 08:51:17.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:17.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:19 np0005603622 nova_compute[221301]: 2026-01-31 08:51:19.022 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849464.0216758, 84307a3a-98ff-4145-a789-fda9cf8b1905 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:19 np0005603622 nova_compute[221301]: 2026-01-31 08:51:19.023 221324 INFO nova.compute.manager [-] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:51:19 np0005603622 nova_compute[221301]: 2026-01-31 08:51:19.062 221324 DEBUG nova.compute.manager [None req-89317239-dafa-4a71-83e6-d5eba92a3424 - - - - - -] [instance: 84307a3a-98ff-4145-a789-fda9cf8b1905] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:19 np0005603622 nova_compute[221301]: 2026-01-31 08:51:19.236 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:19.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:19 np0005603622 nova_compute[221301]: 2026-01-31 08:51:19.694 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:51:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:19.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:51:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:51:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/510012718' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:51:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:51:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/510012718' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:51:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e365 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:21.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:21.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 e366: 3 total, 3 up, 3 in
Jan 31 03:51:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:23.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:24 np0005603622 nova_compute[221301]: 2026-01-31 08:51:24.258 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:24 np0005603622 nova_compute[221301]: 2026-01-31 08:51:24.698 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:25.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:25.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:26 np0005603622 nova_compute[221301]: 2026-01-31 08:51:26.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:27.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:27.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.107 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.107 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.108 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.108 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.108 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.109 221324 INFO nova.compute.manager [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Terminating instance#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.110 221324 DEBUG nova.compute.manager [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:51:28 np0005603622 kernel: tap6e3ffb68-e7 (unregistering): left promiscuous mode
Jan 31 03:51:28 np0005603622 NetworkManager[49080]: <info>  [1769849488.2199] device (tap6e3ffb68-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:51:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:28Z|00794|binding|INFO|Releasing lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 from this chassis (sb_readonly=0)
Jan 31 03:51:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:28Z|00795|binding|INFO|Setting lport 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 down in Southbound
Jan 31 03:51:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:28Z|00796|binding|INFO|Removing iface tap6e3ffb68-e7 ovn-installed in OVS
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.228 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:28.239 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:16:3c 10.100.0.13'], port_security=['fa:16:3e:b0:16:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2128154c-0218-4f66-9509-e0db66eba3fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf1c3d387dbe4191b4d05bdfca5959da', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4937dacf-809a-410a-970f-8b358db49b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc4ff4f3-028a-4adf-9ffc-a84ef2563d05, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:28.241 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 in datapath 2128154c-0218-4f66-9509-e0db66eba3fc unbound from our chassis#033[00m
Jan 31 03:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:28.242 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2128154c-0218-4f66-9509-e0db66eba3fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:28.244 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4228c304-fae8-4b6f-b869-98350f80b0bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:28.244 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc namespace which is not needed anymore#033[00m
Jan 31 03:51:28 np0005603622 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Jan 31 03:51:28 np0005603622 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a1.scope: Consumed 18.833s CPU time.
Jan 31 03:51:28 np0005603622 systemd-machined[190406]: Machine qemu-78-instance-000000a1 terminated.
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.346 221324 INFO nova.virt.libvirt.driver [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Instance destroyed successfully.#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.348 221324 DEBUG nova.objects.instance [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lazy-loading 'resources' on Instance uuid 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.381 221324 DEBUG nova.virt.libvirt.vif [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1544301339',display_name='tempest-ServerRescueNegativeTestJSON-server-1544301339',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1544301339',id=161,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:48:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf1c3d387dbe4191b4d05bdfca5959da',ramdisk_id='',reservation_id='r-h0qz026s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-81297706',owner_user_name='tempest-ServerRescueNegativeTestJSON-81297706-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:48:53Z,user_data=None,user_id='aa7f893021af4a84b03d85b476dadfe0',uuid=4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.382 221324 DEBUG nova.network.os_vif_util [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converting VIF {"id": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "address": "fa:16:3e:b0:16:3c", "network": {"id": "2128154c-0218-4f66-9509-e0db66eba3fc", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1701846829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf1c3d387dbe4191b4d05bdfca5959da", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e3ffb68-e7", "ovs_interfaceid": "6e3ffb68-e766-4edd-8e73-e44bb1e58fd4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.383 221324 DEBUG nova.network.os_vif_util [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.383 221324 DEBUG os_vif [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.385 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e3ffb68-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:51:28 np0005603622 nova_compute[221301]: 2026-01-31 08:51:28.392 221324 INFO os_vif [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:16:3c,bridge_name='br-int',has_traffic_filtering=True,id=6e3ffb68-e766-4edd-8e73-e44bb1e58fd4,network=Network(2128154c-0218-4f66-9509-e0db66eba3fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e3ffb68-e7')#033[00m
Jan 31 03:51:28 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286553]: [NOTICE]   (286557) : haproxy version is 2.8.14-c23fe91
Jan 31 03:51:28 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286553]: [NOTICE]   (286557) : path to executable is /usr/sbin/haproxy
Jan 31 03:51:28 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286553]: [WARNING]  (286557) : Exiting Master process...
Jan 31 03:51:28 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286553]: [ALERT]    (286557) : Current worker (286559) exited with code 143 (Terminated)
Jan 31 03:51:28 np0005603622 neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc[286553]: [WARNING]  (286557) : All workers exited. Exiting... (0)
Jan 31 03:51:28 np0005603622 systemd[1]: libpod-1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf.scope: Deactivated successfully.
Jan 31 03:51:28 np0005603622 podman[288872]: 2026-01-31 08:51:28.436879449 +0000 UTC m=+0.108739402 container died 1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:51:28 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf-userdata-shm.mount: Deactivated successfully.
Jan 31 03:51:28 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0092fe2adfa00cb4a7d2b301e55ed44982192678d5bb66c7b0b0e901fa9b3f20-merged.mount: Deactivated successfully.
Jan 31 03:51:28 np0005603622 podman[288872]: 2026-01-31 08:51:28.773358736 +0000 UTC m=+0.445218689 container cleanup 1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:51:28 np0005603622 systemd[1]: libpod-conmon-1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf.scope: Deactivated successfully.
Jan 31 03:51:29 np0005603622 podman[288931]: 2026-01-31 08:51:29.179185021 +0000 UTC m=+0.386768855 container remove 1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.183 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f3fdad-3b34-4dbf-98a8-1fd4c124d17c]: (4, ('Sat Jan 31 08:51:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf)\n1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf\nSat Jan 31 08:51:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc (1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf)\n1fcaa9edc7f152afd7d3ad1a4311b9e7d89fa9c935de2de435ace882dfe65abf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.185 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f788bf6c-551b-478f-9d03-ff6bc0b3b089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.186 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2128154c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:29 np0005603622 kernel: tap2128154c-00: left promiscuous mode
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.189 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.195 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.198 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a836054a-3b3d-4ddf-b9b0-0f5823b9559d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.217 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c3696657-b67e-4ccd-af7b-cb6b71fad054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.219 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5f99bf73-2d19-4274-b18e-2f88418f79af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.233 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1740d8c7-1198-443d-8115-2386ba69ed4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858130, 'reachable_time': 25585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288945, 'error': None, 'target': 'ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:29 np0005603622 systemd[1]: run-netns-ovnmeta\x2d2128154c\x2d0218\x2d4f66\x2d9509\x2de0db66eba3fc.mount: Deactivated successfully.
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.237 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2128154c-0218-4f66-9509-e0db66eba3fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:51:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:29.238 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[cc13d68b-f6c3-40af-9c2c-a0ac265261bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.327 221324 DEBUG nova.compute.manager [req-665b8823-08bc-42ca-b216-a68a9708d3e9 req-9eab70fc-17ab-412e-97ab-a04c37b4fe9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.327 221324 DEBUG oslo_concurrency.lockutils [req-665b8823-08bc-42ca-b216-a68a9708d3e9 req-9eab70fc-17ab-412e-97ab-a04c37b4fe9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.328 221324 DEBUG oslo_concurrency.lockutils [req-665b8823-08bc-42ca-b216-a68a9708d3e9 req-9eab70fc-17ab-412e-97ab-a04c37b4fe9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.328 221324 DEBUG oslo_concurrency.lockutils [req-665b8823-08bc-42ca-b216-a68a9708d3e9 req-9eab70fc-17ab-412e-97ab-a04c37b4fe9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.328 221324 DEBUG nova.compute.manager [req-665b8823-08bc-42ca-b216-a68a9708d3e9 req-9eab70fc-17ab-412e-97ab-a04c37b4fe9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.328 221324 DEBUG nova.compute.manager [req-665b8823-08bc-42ca-b216-a68a9708d3e9 req-9eab70fc-17ab-412e-97ab-a04c37b4fe9b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-unplugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:51:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:29.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.683 221324 INFO nova.virt.libvirt.driver [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Deleting instance files /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_del#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.684 221324 INFO nova.virt.libvirt.driver [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Deletion of /var/lib/nova/instances/4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e_del complete#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.706 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.732 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.732 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
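The `Acquiring lock … / Lock … acquired :: waited N s / Lock … "released" :: held N s` triplets throughout this log come from `oslo_concurrency.lockutils`, which serializes work per instance UUID (and per named resource like `compute_resources`). A minimal, hypothetical re-creation of that logging pattern, not the oslo implementation itself:

```python
import threading
import time
from contextlib import contextmanager

# One process-wide lock per name, like lockutils' internal semaphore table.
_locks: dict = {}

@contextmanager
def synchronized(name: str):
    """Log acquire/hold timing the way the nova lockutils lines above do."""
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    print(f'Lock "{name}" acquired :: waited {time.monotonic() - t0:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        print(f'Lock "{name}" released :: held {time.monotonic() - t1:.3f}s')

with synchronized("compute_resources"):
    pass  # critical section, e.g. a resource-tracker instance claim
```

The "waited" figure is contention time before entry; the "held" figure (0.693s for the `instance_claim` at 08:51:30.603 below) is how long the critical section ran.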
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.776 221324 INFO nova.compute.manager [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Took 1.67 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.777 221324 DEBUG oslo.service.loopingcall [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.777 221324 DEBUG nova.compute.manager [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.777 221324 DEBUG nova.network.neutron [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.786 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:51:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:29.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.909 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.910 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.919 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:51:29 np0005603622 nova_compute[221301]: 2026-01-31 08:51:29.920 221324 INFO nova.compute.claims [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.091 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/993746160' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.543 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
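The `ceph df --format=json --id openstack` subprocess above is how the RBD image backend checks pool capacity. A hedged sketch of issuing that command and pulling one pool's free space out of the result; the `pools`/`stats`/`max_avail` field names match recent Ceph releases, and the sample payload below is illustrative only:

```python
import json
import subprocess

def ceph_df(conf="/etc/ceph/ceph.conf", client="openstack") -> dict:
    """Run `ceph df --format=json`, exactly as the nova log line shows."""
    out = subprocess.check_output(
        ["ceph", "df", "--format=json", "--id", client, "--conf", conf])
    return json.loads(out)

def pool_avail_bytes(df: dict, pool: str) -> int:
    """Free bytes for one pool from the `ceph df` JSON."""
    for p in df["pools"]:
        if p["name"] == pool:
            return p["stats"]["max_avail"]
    raise KeyError(pool)

# Illustrative payload only; real values come from the cluster.
sample = {"pools": [{"name": "vms", "stats": {"max_avail": 10737418240}}]}
print(pool_avail_bytes(sample, "vms"))  # → 10737418240
```

The matching ceph-mon audit lines (`cmd=[{"prefix": "df", "format": "json"}]: dispatch`) are the monitor-side record of this same call.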
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.548 221324 DEBUG nova.compute.provider_tree [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:51:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.568 221324 DEBUG nova.scheduler.client.report [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
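The inventory dict in the line above is what the node reports to placement. Effective schedulable capacity per resource class follows the placement formula `(total - reserved) * allocation_ratio`, which for this node works out as sketched here:

```python
def capacity(inv: dict) -> float:
    """Effective schedulable capacity for one resource class,
    per the placement formula (total - reserved) * allocation_ratio."""
    return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

# Values copied from the provider f7fd90d1 inventory in the log line above
# (min_unit/max_unit/step_size omitted; they constrain request shape,
# not total capacity).
inventory = {
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 20, "reserved": 1, "allocation_ratio": 0.9},
}
for rc, inv in inventory.items():
    print(rc, capacity(inv))
```

So this host schedules up to 32 vCPUs (8 physical at 4:1 overcommit), 7167 MB of RAM, and about 17.1 GB of disk (20 GB less 1 reserved, at a 0.9 under-commit ratio).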
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.603 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.604 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.678 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.679 221324 DEBUG nova.network.neutron [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.708 221324 INFO nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.729 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.875 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.876 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.877 221324 INFO nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Creating image(s)#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.905 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:30.908 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:30.908 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:30.909 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.934 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.964 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:30 np0005603622 nova_compute[221301]: 2026-01-31 08:51:30.968 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.022 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
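The `qemu-img info` call above is deliberately wrapped in `oslo_concurrency.prlimit`, capping address space at 1 GiB (`--as=1073741824`) and CPU time at 30 s (`--cpu=30`) so a malformed or hostile image cannot exhaust the host while being inspected. A sketch assembling the same command line (the image path is a placeholder, not the `_base` path from the log):

```python
def prlimit_cmd(target_cmd, address_space=1 << 30, cpu_seconds=30):
    """Wrap a command with the oslo prlimit helper, as in the log line
    above (--as caps virtual memory in bytes, --cpu caps CPU seconds);
    env LC_ALL=C LANG=C keeps qemu-img output locale-independent."""
    return ["/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
            f"--as={address_space}", f"--cpu={cpu_seconds}", "--",
            "env", "LC_ALL=C", "LANG=C"] + list(target_cmd)

cmd = prlimit_cmd(["qemu-img", "info", "/path/to/base-image",
                   "--force-share", "--output=json"])
print(" ".join(cmd))
```

`--force-share` lets the probe read an image another process may have open, and `--output=json` keeps the result machine-parseable.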
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.023 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.023 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.024 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.055 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.060 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.264 221324 DEBUG nova.policy [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1c6e7eff11b435a81429826a682b32f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0bfe11bd9d694684b527666e2c378eed', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.289 221324 DEBUG nova.network.neutron [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.316 221324 INFO nova.compute.manager [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Took 1.54 seconds to deallocate network for instance.#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.386 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.387 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.436 221324 DEBUG nova.compute.manager [req-f904acda-ab2f-4e7e-963f-069170b77c6f req-256b8bad-99cc-4351-b21d-99b4ac63d1cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.437 221324 DEBUG oslo_concurrency.lockutils [req-f904acda-ab2f-4e7e-963f-069170b77c6f req-256b8bad-99cc-4351-b21d-99b4ac63d1cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.438 221324 DEBUG oslo_concurrency.lockutils [req-f904acda-ab2f-4e7e-963f-069170b77c6f req-256b8bad-99cc-4351-b21d-99b4ac63d1cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.438 221324 DEBUG oslo_concurrency.lockutils [req-f904acda-ab2f-4e7e-963f-069170b77c6f req-256b8bad-99cc-4351-b21d-99b4ac63d1cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.439 221324 DEBUG nova.compute.manager [req-f904acda-ab2f-4e7e-963f-069170b77c6f req-256b8bad-99cc-4351-b21d-99b4ac63d1cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] No waiting events found dispatching network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.439 221324 WARNING nova.compute.manager [req-f904acda-ab2f-4e7e-963f-069170b77c6f req-256b8bad-99cc-4351-b21d-99b4ac63d1cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received unexpected event network-vif-plugged-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.439 221324 DEBUG nova.compute.manager [req-f904acda-ab2f-4e7e-963f-069170b77c6f req-256b8bad-99cc-4351-b21d-99b4ac63d1cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Received event network-vif-deleted-6e3ffb68-e766-4edd-8e73-e44bb1e58fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:31.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.511 221324 DEBUG oslo_concurrency.processutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.591 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.709 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] resizing rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
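The two lines above show the RBD image-backend flow: the cached base image is imported into the `vms` pool as `<uuid>_disk`, then resized up to the flavor's root-disk size (1073741824 bytes, i.e. 1 GiB). Nova drives the resize through librbd in `nova.storage.rbd_utils`, but the equivalent CLI command lines can be sketched like this (paths and image name are placeholders; the resize step assumes the `rbd` CLI's default size unit of megabytes):

```python
def rbd_import_and_resize(base, image, size_bytes, pool="vms",
                          client="openstack", conf="/etc/ceph/ceph.conf"):
    """Command lines mirroring the import + resize sequence in the log
    above; a sketch, not what nova itself executes for the resize."""
    import_cmd = ["rbd", "import", "--pool", pool, base, image,
                  "--image-format=2", "--id", client, "--conf", conf]
    # rbd's --size defaults to megabytes: 1073741824 B -> 1024 MB.
    resize_cmd = ["rbd", "resize", "--pool", pool, image,
                  "--size", str(size_bytes // (1024 * 1024)),
                  "--id", client, "--conf", conf]
    return import_cmd, resize_cmd

imp, rs = rbd_import_and_resize("/tmp/base-image", "demo_disk", 1073741824)
print(imp)
print(rs)
```

`--image-format=2` matters because only format-2 RBD images support the cloning and snapshot features nova relies on.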
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.826 221324 DEBUG nova.objects.instance [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'migration_context' on Instance uuid e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.864 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.865 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Ensure instance console log exists: /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.866 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.866 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.866 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:31.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:51:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1778964318' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.966 221324 DEBUG oslo_concurrency.processutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:31 np0005603622 nova_compute[221301]: 2026-01-31 08:51:31.972 221324 DEBUG nova.compute.provider_tree [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:51:32 np0005603622 nova_compute[221301]: 2026-01-31 08:51:32.025 221324 DEBUG nova.scheduler.client.report [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:51:32 np0005603622 nova_compute[221301]: 2026-01-31 08:51:32.055 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:32 np0005603622 nova_compute[221301]: 2026-01-31 08:51:32.135 221324 INFO nova.scheduler.client.report [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Deleted allocations for instance 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e#033[00m
Jan 31 03:51:32 np0005603622 nova_compute[221301]: 2026-01-31 08:51:32.293 221324 DEBUG oslo_concurrency.lockutils [None req-96aaa111-40ef-485e-bae9-4298ca9fb33c aa7f893021af4a84b03d85b476dadfe0 bf1c3d387dbe4191b4d05bdfca5959da - - default default] Lock "4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:32 np0005603622 nova_compute[221301]: 2026-01-31 08:51:32.627 221324 DEBUG nova.network.neutron [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Successfully created port: 5d9caf6f-4602-4589-9c08-43b1a20a9c34 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:51:33 np0005603622 nova_compute[221301]: 2026-01-31 08:51:33.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:33.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:51:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:51:33 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:51:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:33.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.199 221324 DEBUG nova.network.neutron [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Successfully updated port: 5d9caf6f-4602-4589-9c08-43b1a20a9c34 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.224 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.224 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.224 221324 DEBUG nova.network.neutron [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.324 221324 DEBUG nova.compute.manager [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-changed-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.324 221324 DEBUG nova.compute.manager [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Refreshing instance network info cache due to event network-changed-5d9caf6f-4602-4589-9c08-43b1a20a9c34. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.325 221324 DEBUG oslo_concurrency.lockutils [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.419 221324 DEBUG nova.network.neutron [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.714 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.959 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.994 221324 WARNING nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.994 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.994 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.995 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.995 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:34 np0005603622 nova_compute[221301]: 2026-01-31 08:51:34.995 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:35 np0005603622 nova_compute[221301]: 2026-01-31 08:51:35.020 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:35.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:35.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:35 np0005603622 nova_compute[221301]: 2026-01-31 08:51:35.990 221324 DEBUG nova.network.neutron [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updating instance_info_cache with network_info: [{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.035 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.036 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Instance network_info: |[{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.036 221324 DEBUG oslo_concurrency.lockutils [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.037 221324 DEBUG nova.network.neutron [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Refreshing network info cache for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.041 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Start _get_guest_xml network_info=[{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.045 221324 WARNING nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.052 221324 DEBUG nova.virt.libvirt.host [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.053 221324 DEBUG nova.virt.libvirt.host [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.057 221324 DEBUG nova.virt.libvirt.host [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.057 221324 DEBUG nova.virt.libvirt.host [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.059 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.059 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.060 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.060 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.060 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.061 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.061 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.061 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.062 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.062 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.062 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.063 221324 DEBUG nova.virt.hardware [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.066 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 03:51:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:51:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1663491557' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.512 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.551 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.556 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:36 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:36Z|00797|binding|INFO|Releasing lport 0ed76a0a-650c-4ec7-a4d4-0e745236b047 from this chassis (sb_readonly=0)
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:51:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3877555790' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.968 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.970 221324 DEBUG nova.virt.libvirt.vif [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:51:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1107028276',display_name='tempest-TestNetworkAdvancedServerOps-server-1107028276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1107028276',id=170,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMhAjak4EaSbAMsOQHzT/F+YD+mdwqaKmT1b8Pkiv6vPXQSXr3cEJwaMw5cOEGrpti6B+hT6jgk8eQer/fm3Y87ortF0Suf8ZM3a30yTbd7sIeUbybs0ERwtVLRyiyyflg==',key_name='tempest-TestNetworkAdvancedServerOps-1864875451',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-g52p6b46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:51:30Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.970 221324 DEBUG nova.network.os_vif_util [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.971 221324 DEBUG nova.network.os_vif_util [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.972 221324 DEBUG nova.objects.instance [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'pci_devices' on Instance uuid e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.989 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <uuid>e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8</uuid>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <name>instance-000000aa</name>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1107028276</nova:name>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:51:36</nova:creationTime>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:user uuid="f1c6e7eff11b435a81429826a682b32f">tempest-TestNetworkAdvancedServerOps-840410497-project-member</nova:user>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:project uuid="0bfe11bd9d694684b527666e2c378eed">tempest-TestNetworkAdvancedServerOps-840410497</nova:project>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <nova:port uuid="5d9caf6f-4602-4589-9c08-43b1a20a9c34">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <entry name="serial">e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8</entry>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <entry name="uuid">e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8</entry>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk.config">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:ed:8d:1c"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <target dev="tap5d9caf6f-46"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/console.log" append="off"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:51:36 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:51:36 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:51:36 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:51:36 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
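The domain XML dumped by `_get_guest_xml` above can be inspected with the standard library. A sketch using a trimmed copy of the logged XML — note that libvirt's bare `<memory>` element is in KiB, which reconciles with the 128 MiB shown in the `<nova:flavor>` metadata:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the <domain> XML logged by _get_guest_xml above.
domain_xml = """
<domain type="kvm">
  <uuid>e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8</uuid>
  <name>instance-000000aa</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <os>
    <type arch="x86_64" machine="q35">hvm</type>
  </os>
</domain>
"""

root = ET.fromstring(domain_xml)
memory_mib = int(root.findtext("memory")) // 1024  # KiB -> MiB
machine_type = root.find("os/type").get("machine")
instance_name = root.findtext("name")
```

With a live libvirt connection this XML would be handed to `defineXML`/`createXML`; here we only parse it.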
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.991 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Preparing to wait for external event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.992 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.992 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.993 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.993 221324 DEBUG nova.virt.libvirt.vif [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:51:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1107028276',display_name='tempest-TestNetworkAdvancedServerOps-server-1107028276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1107028276',id=170,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMhAjak4EaSbAMsOQHzT/F+YD+mdwqaKmT1b8Pkiv6vPXQSXr3cEJwaMw5cOEGrpti6B+hT6jgk8eQer/fm3Y87ortF0Suf8ZM3a30yTbd7sIeUbybs0ERwtVLRyiyyflg==',key_name='tempest-TestNetworkAdvancedServerOps-1864875451',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-g52p6b46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:51:30Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.994 221324 DEBUG nova.network.os_vif_util [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.995 221324 DEBUG nova.network.os_vif_util [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.995 221324 DEBUG os_vif [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.996 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.996 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.997 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:51:36 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.999 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:36.999 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d9caf6f-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.000 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d9caf6f-46, col_values=(('external_ids', {'iface-id': '5d9caf6f-4602-4589-9c08-43b1a20a9c34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:8d:1c', 'vm-uuid': 'e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.006 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:37 np0005603622 NetworkManager[49080]: <info>  [1769849497.0072] manager: (tap5d9caf6f-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.013 221324 INFO os_vif [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46')#033[00m
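The `AddPortCommand`/`DbSetCommand` transaction that just completed is roughly equivalent to a single `ovs-vsctl` invocation. A sketch that builds (but does not run) that command line from the logged values — the argv shape assumes standard `ovs-vsctl add-port`/`set Interface` usage, not what os-vif itself executes (it talks OVSDB directly via ovsdbapp):

```python
def build_ovs_plug_cmd(bridge, port, iface_id, mac, vm_uuid):
    """Roughly equivalent ovs-vsctl call for the ovsdbapp transaction above."""
    return [
        "ovs-vsctl", "--may-exist", "add-port", bridge, port,
        "--", "set", "Interface", port,
        f"external_ids:iface-id={iface_id}",
        "external_ids:iface-status=active",
        f"external_ids:attached-mac={mac}",
        f"external_ids:vm-uuid={vm_uuid}",
    ]


cmd = build_ovs_plug_cmd(
    "br-int", "tap5d9caf6f-46",
    "5d9caf6f-4602-4589-9c08-43b1a20a9c34",
    "fa:16:3e:ed:8d:1c",
    "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8",
)
# To apply it on a host with OVS: subprocess.run(cmd, check=True)
```

The `iface-id` external_id is what lets ovn-controller match the port to its logical switch port, which is why the "Successfully plugged vif" message is followed by OVN binding activity.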
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.094 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.095 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.095 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No VIF found with MAC fa:16:3e:ed:8d:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.095 221324 INFO nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Using config drive#033[00m
Jan 31 03:51:37 np0005603622 podman[289354]: 2026-01-31 08:51:37.097861729 +0000 UTC m=+0.050419020 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.120 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:37 np0005603622 podman[289353]: 2026-01-31 08:51:37.147757495 +0000 UTC m=+0.101338754 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:51:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:37.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.702 221324 INFO nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Creating config drive at /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/disk.config#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.709 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjvuo3zax execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.835 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjvuo3zax" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
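The config-drive ISO is built with the exact `mkisofs` command logged above (returned 0 in 0.126s). Reconstructing that argv in Python — paths and the publisher string are taken verbatim from the log; note the publisher is a single argv element even though oslo.concurrency prints it unquoted:

```python
def build_configdrive_cmd(iso_path, staging_dir, publisher):
    """mkisofs invocation matching the one logged by oslo.concurrency above."""
    return [
        "/usr/bin/mkisofs", "-o", iso_path,
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-publisher", publisher,
        "-quiet", "-J", "-r",
        "-V", "config-2",  # volume label cloud-init probes for
        staging_dir,
    ]


cmd = build_configdrive_cmd(
    "/var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/disk.config",
    "/tmp/tmpjvuo3zax",
    "OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9",
)
# On a host with mkisofs: subprocess.run(cmd, check=True)
```

The `config-2` volume label is what the guest's cloud-init looks for when locating the config drive.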
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.870 221324 DEBUG nova.storage.rbd_utils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.875 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/disk.config e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:51:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:37.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.918 221324 DEBUG nova.network.neutron [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updated VIF entry in instance network info cache for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.918 221324 DEBUG nova.network.neutron [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updating instance_info_cache with network_info: [{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:37 np0005603622 nova_compute[221301]: 2026-01-31 08:51:37.943 221324 DEBUG oslo_concurrency.lockutils [req-47c4853d-6baf-4bb5-b030-afe2e56afbda req-f38b723d-b97c-4ada-8468-646d0b269d57 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.048 221324 DEBUG oslo_concurrency.processutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/disk.config e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.048 221324 INFO nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Deleting local config drive /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/disk.config because it was imported into RBD.#033[00m
Jan 31 03:51:38 np0005603622 kernel: tap5d9caf6f-46: entered promiscuous mode
Jan 31 03:51:38 np0005603622 NetworkManager[49080]: <info>  [1769849498.0959] manager: (tap5d9caf6f-46): new Tun device (/org/freedesktop/NetworkManager/Devices/348)
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.142 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:38Z|00798|binding|INFO|Claiming lport 5d9caf6f-4602-4589-9c08-43b1a20a9c34 for this chassis.
Jan 31 03:51:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:38Z|00799|binding|INFO|5d9caf6f-4602-4589-9c08-43b1a20a9c34: Claiming fa:16:3e:ed:8d:1c 10.100.0.13
Jan 31 03:51:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:38Z|00800|binding|INFO|Setting lport 5d9caf6f-4602-4589-9c08-43b1a20a9c34 ovn-installed in OVS
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.150 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:38Z|00801|binding|INFO|Setting lport 5d9caf6f-4602-4589-9c08-43b1a20a9c34 up in Southbound
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.156 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:8d:1c 10.100.0.13'], port_security=['fa:16:3e:ed:8d:1c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f94c6ed-71ba-4114-a483-6969c923a169', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '2', 'neutron:security_group_ids': '823dcd1e-c848-4b0b-b209-d3c49f7c199a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2593ca0c-7180-414a-b815-a3d2e8e5bf3e, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=5d9caf6f-4602-4589-9c08-43b1a20a9c34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.158 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 5d9caf6f-4602-4589-9c08-43b1a20a9c34 in datapath 1f94c6ed-71ba-4114-a483-6969c923a169 bound to our chassis#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.159 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f94c6ed-71ba-4114-a483-6969c923a169#033[00m
Jan 31 03:51:38 np0005603622 systemd-udevd[289468]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:51:38 np0005603622 systemd-machined[190406]: New machine qemu-83-instance-000000aa.
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.168 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a76f0a7d-902d-4b18-92ca-b918db8cbf9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.169 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f94c6ed-71 in ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.171 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f94c6ed-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.171 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8de6b0a4-aef4-4034-961a-f76a3beebbe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.171 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d401f53f-fa21-43a0-89a4-7778f06f2f71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 systemd[1]: Started Virtual Machine qemu-83-instance-000000aa.
Jan 31 03:51:38 np0005603622 NetworkManager[49080]: <info>  [1769849498.1753] device (tap5d9caf6f-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:51:38 np0005603622 NetworkManager[49080]: <info>  [1769849498.1760] device (tap5d9caf6f-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.181 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[094f87f1-f319-4c10-bd6b-729cd06d6e9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.200 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[12220d50-1080-451b-bb5a-c7363c3750c7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.222 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9486e26b-cee5-47f3-a204-58774295b529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.226 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a36a0f49-a08e-4fb8-9a49-7bd15b8e86b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 systemd-udevd[289471]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:51:38 np0005603622 NetworkManager[49080]: <info>  [1769849498.2278] manager: (tap1f94c6ed-70): new Veth device (/org/freedesktop/NetworkManager/Devices/349)
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.252 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5bab399c-f217-4e45-8499-ce6e806efd33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.255 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e4eab2b5-823e-4f8a-8f35-46f64ff087a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 NetworkManager[49080]: <info>  [1769849498.2729] device (tap1f94c6ed-70): carrier: link connected
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.278 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[40425d36-f1d3-47ae-8763-2c5c9674f1cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.291 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[48b5613a-8d39-4d8e-92d0-f400b7e83c1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f94c6ed-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 240], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874961, 'reachable_time': 15826, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289501, 'error': None, 'target': 'ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.303 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[64c47fa8-d4d4-4151-bbed-9ca8eb9c2cf0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:2d70'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 874961, 'tstamp': 874961}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289502, 'error': None, 'target': 'ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.316 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf715a5-0b80-413c-a319-5a75b691c36c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f94c6ed-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 240], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874961, 'reachable_time': 15826, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 289503, 'error': None, 'target': 'ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.336 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8382ca48-b97c-45c4-adde-e5bf89e09097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.379 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc28571-aa31-4dca-a6f7-db2eb79089dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.381 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f94c6ed-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.381 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.381 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f94c6ed-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:38 np0005603622 NetworkManager[49080]: <info>  [1769849498.3839] manager: (tap1f94c6ed-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.383 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:38 np0005603622 kernel: tap1f94c6ed-70: entered promiscuous mode
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.386 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f94c6ed-70, col_values=(('external_ids', {'iface-id': 'b0f245d6-f62b-4b38-8c6a-366768516e3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:51:38 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:38Z|00802|binding|INFO|Releasing lport b0f245d6-f62b-4b38-8c6a-366768516e3e from this chassis (sb_readonly=0)
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.387 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.388 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f94c6ed-71ba-4114-a483-6969c923a169.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f94c6ed-71ba-4114-a483-6969c923a169.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.389 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[25300d5b-c447-4898-81db-4b139beca5fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.389 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-1f94c6ed-71ba-4114-a483-6969c923a169
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/1f94c6ed-71ba-4114-a483-6969c923a169.pid.haproxy
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 1f94c6ed-71ba-4114-a483-6969c923a169
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:51:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:51:38.390 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169', 'env', 'PROCESS_TAG=haproxy-1f94c6ed-71ba-4114-a483-6969c923a169', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f94c6ed-71ba-4114-a483-6969c923a169.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:51:38 np0005603622 nova_compute[221301]: 2026-01-31 08:51:38.393 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:38 np0005603622 podman[289534]: 2026-01-31 08:51:38.678813844 +0000 UTC m=+0.018392334 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:51:39 np0005603622 podman[289534]: 2026-01-31 08:51:39.102816683 +0000 UTC m=+0.442395153 container create de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 03:51:39 np0005603622 systemd[1]: Started libpod-conmon-de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47.scope.
Jan 31 03:51:39 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:51:39 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a735526c49a3d4c1632fae08892c7809d02325f28d9f28768fe0e0594740e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.200 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849499.200044, e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.202 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] VM Started (Lifecycle Event)#033[00m
Jan 31 03:51:39 np0005603622 podman[289534]: 2026-01-31 08:51:39.215662264 +0000 UTC m=+0.555240824 container init de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:51:39 np0005603622 podman[289534]: 2026-01-31 08:51:39.221219303 +0000 UTC m=+0.560797803 container start de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:51:39 np0005603622 neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169[289641]: [NOTICE]   (289645) : New worker (289647) forked
Jan 31 03:51:39 np0005603622 neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169[289641]: [NOTICE]   (289645) : Loading success.
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.283 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.288 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849499.2002969, e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.289 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.318 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.323 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.376 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:51:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:39.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.717 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:51:39 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:51:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:39.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.981 221324 DEBUG nova.compute.manager [req-33a521ca-b0b3-4703-8717-bbe0873d484b req-8ba82926-2ed5-45b4-bfd6-10e88dec2058 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.982 221324 DEBUG oslo_concurrency.lockutils [req-33a521ca-b0b3-4703-8717-bbe0873d484b req-8ba82926-2ed5-45b4-bfd6-10e88dec2058 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.982 221324 DEBUG oslo_concurrency.lockutils [req-33a521ca-b0b3-4703-8717-bbe0873d484b req-8ba82926-2ed5-45b4-bfd6-10e88dec2058 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.983 221324 DEBUG oslo_concurrency.lockutils [req-33a521ca-b0b3-4703-8717-bbe0873d484b req-8ba82926-2ed5-45b4-bfd6-10e88dec2058 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.983 221324 DEBUG nova.compute.manager [req-33a521ca-b0b3-4703-8717-bbe0873d484b req-8ba82926-2ed5-45b4-bfd6-10e88dec2058 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Processing event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.984 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.988 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849499.9881525, e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.988 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.991 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.994 221324 INFO nova.virt.libvirt.driver [-] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Instance spawned successfully.#033[00m
Jan 31 03:51:39 np0005603622 nova_compute[221301]: 2026-01-31 08:51:39.995 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.041 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.049 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.049 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.050 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.050 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.051 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.051 221324 DEBUG nova.virt.libvirt.driver [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.056 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.107 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.184 221324 INFO nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Took 9.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.184 221324 DEBUG nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.294 221324 INFO nova.compute.manager [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Took 10.43 seconds to build instance.#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.328 221324 DEBUG oslo_concurrency.lockutils [None req-790070c5-b3fb-4d5e-bcfa-5fbe90c501ad f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.329 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 5.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.329 221324 INFO nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:51:40 np0005603622 nova_compute[221301]: 2026-01-31 08:51:40.329 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:41.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:41.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:42 np0005603622 nova_compute[221301]: 2026-01-31 08:51:42.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:42 np0005603622 nova_compute[221301]: 2026-01-31 08:51:42.122 221324 DEBUG nova.compute.manager [req-db625457-c30c-45ca-ae2d-c14ec5ba2ff2 req-8ee8ba36-e206-4c0e-8195-d87811b56479 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:42 np0005603622 nova_compute[221301]: 2026-01-31 08:51:42.122 221324 DEBUG oslo_concurrency.lockutils [req-db625457-c30c-45ca-ae2d-c14ec5ba2ff2 req-8ee8ba36-e206-4c0e-8195-d87811b56479 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:42 np0005603622 nova_compute[221301]: 2026-01-31 08:51:42.123 221324 DEBUG oslo_concurrency.lockutils [req-db625457-c30c-45ca-ae2d-c14ec5ba2ff2 req-8ee8ba36-e206-4c0e-8195-d87811b56479 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:42 np0005603622 nova_compute[221301]: 2026-01-31 08:51:42.123 221324 DEBUG oslo_concurrency.lockutils [req-db625457-c30c-45ca-ae2d-c14ec5ba2ff2 req-8ee8ba36-e206-4c0e-8195-d87811b56479 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:51:42 np0005603622 nova_compute[221301]: 2026-01-31 08:51:42.123 221324 DEBUG nova.compute.manager [req-db625457-c30c-45ca-ae2d-c14ec5ba2ff2 req-8ee8ba36-e206-4c0e-8195-d87811b56479 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] No waiting events found dispatching network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:51:42 np0005603622 nova_compute[221301]: 2026-01-31 08:51:42.123 221324 WARNING nova.compute.manager [req-db625457-c30c-45ca-ae2d-c14ec5ba2ff2 req-8ee8ba36-e206-4c0e-8195-d87811b56479 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received unexpected event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:51:43 np0005603622 nova_compute[221301]: 2026-01-31 08:51:43.343 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849488.3418355, 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:51:43 np0005603622 nova_compute[221301]: 2026-01-31 08:51:43.343 221324 INFO nova.compute.manager [-] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:51:43 np0005603622 nova_compute[221301]: 2026-01-31 08:51:43.401 221324 DEBUG nova.compute.manager [None req-e2c8f97c-f7ed-4774-a399-a9be5e039bd0 - - - - - -] [instance: 4c9bd4c4-6240-4d7d-88e7-6af4e69fe06e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:51:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:43.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:43.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:44 np0005603622 nova_compute[221301]: 2026-01-31 08:51:44.718 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:45.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:45.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:46 np0005603622 nova_compute[221301]: 2026-01-31 08:51:46.908 221324 DEBUG nova.compute.manager [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-changed-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:51:46 np0005603622 nova_compute[221301]: 2026-01-31 08:51:46.909 221324 DEBUG nova.compute.manager [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Refreshing instance network info cache due to event network-changed-5d9caf6f-4602-4589-9c08-43b1a20a9c34. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:51:46 np0005603622 nova_compute[221301]: 2026-01-31 08:51:46.910 221324 DEBUG oslo_concurrency.lockutils [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:51:46 np0005603622 nova_compute[221301]: 2026-01-31 08:51:46.910 221324 DEBUG oslo_concurrency.lockutils [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:51:46 np0005603622 nova_compute[221301]: 2026-01-31 08:51:46.911 221324 DEBUG nova.network.neutron [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Refreshing network info cache for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:51:47 np0005603622 nova_compute[221301]: 2026-01-31 08:51:47.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:47 np0005603622 nova_compute[221301]: 2026-01-31 08:51:47.448 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:51:47 np0005603622 nova_compute[221301]: 2026-01-31 08:51:47.449 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 03:51:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:47.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:47 np0005603622 nova_compute[221301]: 2026-01-31 08:51:47.538 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 03:51:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:47.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:48 np0005603622 nova_compute[221301]: 2026-01-31 08:51:48.422 221324 DEBUG nova.network.neutron [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updated VIF entry in instance network info cache for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:51:48 np0005603622 nova_compute[221301]: 2026-01-31 08:51:48.423 221324 DEBUG nova.network.neutron [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updating instance_info_cache with network_info: [{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:51:48 np0005603622 nova_compute[221301]: 2026-01-31 08:51:48.525 221324 DEBUG oslo_concurrency.lockutils [req-68d10e6a-3f2b-42d2-9609-e03183a0e145 req-5346f026-d98d-4dc8-bac8-2a3e2d72d4bd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:51:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:49.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:49 np0005603622 nova_compute[221301]: 2026-01-31 08:51:49.720 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:49.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:51.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:51.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:52 np0005603622 nova_compute[221301]: 2026-01-31 08:51:52.016 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:53Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ed:8d:1c 10.100.0.13
Jan 31 03:51:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:51:53Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ed:8d:1c 10.100.0.13
Jan 31 03:51:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:53.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:54 np0005603622 nova_compute[221301]: 2026-01-31 08:51:54.751 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:55.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:51:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:51:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:55.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:51:57 np0005603622 nova_compute[221301]: 2026-01-31 08:51:57.019 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:57.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:57.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:51:59.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:59 np0005603622 nova_compute[221301]: 2026-01-31 08:51:59.687 221324 INFO nova.compute.manager [None req-72138275-97e8-4bf5-8270-658f3f06dd41 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Get console output#033[00m
Jan 31 03:51:59 np0005603622 nova_compute[221301]: 2026-01-31 08:51:59.692 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:51:59 np0005603622 nova_compute[221301]: 2026-01-31 08:51:59.754 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:51:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:51:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:51:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:51:59.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:51:59 np0005603622 nova_compute[221301]: 2026-01-31 08:51:59.935 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:51:59 np0005603622 nova_compute[221301]: 2026-01-31 08:51:59.936 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:51:59 np0005603622 nova_compute[221301]: 2026-01-31 08:51:59.960 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.056 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.057 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.067 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.068 221324 INFO nova.compute.claims [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.314 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3914426281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.759 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.765 221324 DEBUG nova.compute.provider_tree [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.791 221324 DEBUG nova.scheduler.client.report [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.844 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.845 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.916 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.916 221324 DEBUG nova.network.neutron [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.942 221324 INFO nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:52:00 np0005603622 nova_compute[221301]: 2026-01-31 08:52:00.974 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.153 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.154 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.154 221324 INFO nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Creating image(s)#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.178 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.207 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.233 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.237 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.261 221324 DEBUG nova.policy [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3bd4ce8a916a4bdbbc988eb4fe32991e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.296 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.297 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.298 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.298 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.324 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.330 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 1e30fc0d-e774-4cad-b497-496833551f8b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:01.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.851 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 1e30fc0d-e774-4cad-b497-496833551f8b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:01 np0005603622 nova_compute[221301]: 2026-01-31 08:52:01.926 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] resizing rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:52:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:01.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.023 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:52:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 53K writes, 213K keys, 53K commit groups, 1.0 writes per commit group, ingest: 0.20 GB, 0.04 MB/s#012Cumulative WAL: 53K writes, 19K syncs, 2.77 writes per sync, written: 0.20 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5595 writes, 23K keys, 5595 commit groups, 1.0 writes per commit group, ingest: 26.78 MB, 0.04 MB/s#012Interval WAL: 5595 writes, 1946 syncs, 2.88 writes per sync, written: 0.03 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.185 221324 DEBUG nova.objects.instance [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'migration_context' on Instance uuid 1e30fc0d-e774-4cad-b497-496833551f8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.236 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.237 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Ensure instance console log exists: /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.237 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.238 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.238 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.486 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:02 np0005603622 nova_compute[221301]: 2026-01-31 08:52:02.597 221324 DEBUG nova.network.neutron [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Successfully created port: 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:52:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:52:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:03.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:52:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:03.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:03 np0005603622 nova_compute[221301]: 2026-01-31 08:52:03.996 221324 DEBUG nova.network.neutron [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Successfully updated port: 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.017 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.017 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquired lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.018 221324 DEBUG nova.network.neutron [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.224 221324 DEBUG nova.network.neutron [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e367 e367: 3 total, 3 up, 3 in
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.954 221324 DEBUG oslo_concurrency.lockutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.955 221324 DEBUG oslo_concurrency.lockutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:52:04 np0005603622 nova_compute[221301]: 2026-01-31 08:52:04.955 221324 DEBUG nova.network.neutron [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.169 221324 DEBUG nova.network.neutron [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updating instance_info_cache with network_info: [{"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.286 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Releasing lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.286 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Instance network_info: |[{"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.289 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Start _get_guest_xml network_info=[{"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.295 221324 WARNING nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.301 221324 DEBUG nova.virt.libvirt.host [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.302 221324 DEBUG nova.virt.libvirt.host [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.307 221324 DEBUG nova.virt.libvirt.host [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.308 221324 DEBUG nova.virt.libvirt.host [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.309 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.309 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.310 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.310 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.311 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.311 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.311 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.311 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.312 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.312 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.312 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.312 221324 DEBUG nova.virt.hardware [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.315 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:52:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:05.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:52:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:52:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3885856315' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.768 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.828 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:05 np0005603622 nova_compute[221301]: 2026-01-31 08:52:05.833 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:05.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.013 221324 DEBUG nova.compute.manager [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-changed-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.014 221324 DEBUG nova.compute.manager [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Refreshing instance network info cache due to event network-changed-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.014 221324 DEBUG oslo_concurrency.lockutils [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.015 221324 DEBUG oslo_concurrency.lockutils [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.015 221324 DEBUG nova.network.neutron [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Refreshing network info cache for port 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:52:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:52:06 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2733327734' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.258 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.260 221324 DEBUG nova.virt.libvirt.vif [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:51:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1474271208',display_name='tempest-AttachVolumeNegativeTest-server-1474271208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1474271208',id=173,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH8uyo0SI9g4bhyXaUu/mgiRdZ9p5P7fj/0AXQr6KiCUFLPfccpQyldXU/JL23UYZE5aZv7vIKBfjnNipswNMAX6ybSgDqSjq/nAj/WykPc1IMgcwquHyEnNFgVLXNp54g==',key_name='tempest-keypair-1515101677',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76fb5cb7abcd4d74abfc471a96bbd12c',ramdisk_id='',reservation_id='r-sqz0x7qz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-457307401',owner_user_name='tempest-AttachVolumeNegativeTest-457307401-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:52:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3bd4ce8a916a4bdbbc988eb4fe32991e',uuid=1e30fc0d-e774-4cad-b497-496833551f8b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.261 221324 DEBUG nova.network.os_vif_util [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converting VIF {"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.262 221324 DEBUG nova.network.os_vif_util [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b5:e7,bridge_name='br-int',has_traffic_filtering=True,id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93b367b9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.264 221324 DEBUG nova.objects.instance [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e30fc0d-e774-4cad-b497-496833551f8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.318 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <uuid>1e30fc0d-e774-4cad-b497-496833551f8b</uuid>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <name>instance-000000ad</name>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachVolumeNegativeTest-server-1474271208</nova:name>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:52:05</nova:creationTime>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:user uuid="3bd4ce8a916a4bdbbc988eb4fe32991e">tempest-AttachVolumeNegativeTest-457307401-project-member</nova:user>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:project uuid="76fb5cb7abcd4d74abfc471a96bbd12c">tempest-AttachVolumeNegativeTest-457307401</nova:project>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <nova:port uuid="93b367b9-c0d7-4f7e-98bf-a33da41aa3d2">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <entry name="serial">1e30fc0d-e774-4cad-b497-496833551f8b</entry>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <entry name="uuid">1e30fc0d-e774-4cad-b497-496833551f8b</entry>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/1e30fc0d-e774-4cad-b497-496833551f8b_disk">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/1e30fc0d-e774-4cad-b497-496833551f8b_disk.config">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:26:b5:e7"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <target dev="tap93b367b9-c0"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/console.log" append="off"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:52:06 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:52:06 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:52:06 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:52:06 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.320 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Preparing to wait for external event network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.321 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.321 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.321 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.322 221324 DEBUG nova.virt.libvirt.vif [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:51:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1474271208',display_name='tempest-AttachVolumeNegativeTest-server-1474271208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1474271208',id=173,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH8uyo0SI9g4bhyXaUu/mgiRdZ9p5P7fj/0AXQr6KiCUFLPfccpQyldXU/JL23UYZE5aZv7vIKBfjnNipswNMAX6ybSgDqSjq/nAj/WykPc1IMgcwquHyEnNFgVLXNp54g==',key_name='tempest-keypair-1515101677',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76fb5cb7abcd4d74abfc471a96bbd12c',ramdisk_id='',reservation_id='r-sqz0x7qz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-457307401',owner_user_name='tempest-AttachVolumeNegativeTest-457307401-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:52:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3bd4ce8a916a4bdbbc988eb4fe32991e',uuid=1e30fc0d-e774-4cad-b497-496833551f8b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.322 221324 DEBUG nova.network.os_vif_util [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converting VIF {"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.323 221324 DEBUG nova.network.os_vif_util [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b5:e7,bridge_name='br-int',has_traffic_filtering=True,id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93b367b9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.324 221324 DEBUG os_vif [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b5:e7,bridge_name='br-int',has_traffic_filtering=True,id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93b367b9-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.325 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.326 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.330 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.331 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93b367b9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.332 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap93b367b9-c0, col_values=(('external_ids', {'iface-id': '93b367b9-c0d7-4f7e-98bf-a33da41aa3d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:b5:e7', 'vm-uuid': '1e30fc0d-e774-4cad-b497-496833551f8b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.335 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:06 np0005603622 NetworkManager[49080]: <info>  [1769849526.3371] manager: (tap93b367b9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.340 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.342 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.344 221324 INFO os_vif [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b5:e7,bridge_name='br-int',has_traffic_filtering=True,id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93b367b9-c0')#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.379 221324 DEBUG nova.network.neutron [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updating instance_info_cache with network_info: [{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.450 221324 DEBUG oslo_concurrency.lockutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.464 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.464 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.465 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No VIF found with MAC fa:16:3e:26:b5:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.465 221324 INFO nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Using config drive#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.496 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.762 221324 DEBUG nova.virt.libvirt.driver [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.763 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Creating file /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/257fdf7cee9e47d597a3ea884ba26897.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 31 03:52:06 np0005603622 nova_compute[221301]: 2026-01-31 08:52:06.764 221324 DEBUG oslo_concurrency.processutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/257fdf7cee9e47d597a3ea884ba26897.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e368 e368: 3 total, 3 up, 3 in
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.156 221324 DEBUG oslo_concurrency.processutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/257fdf7cee9e47d597a3ea884ba26897.tmp" returned: 1 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.157 221324 DEBUG oslo_concurrency.processutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8/257fdf7cee9e47d597a3ea884ba26897.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.158 221324 DEBUG nova.virt.libvirt.volume.remotefs [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Creating directory /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.158 221324 DEBUG oslo_concurrency.processutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.321 221324 INFO nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Creating config drive at /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/disk.config#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.327 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpo95bqwsx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.357 221324 DEBUG oslo_concurrency.processutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.362 221324 DEBUG nova.virt.libvirt.driver [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.428 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 4af4043c-8199-4d0f-acf9-38d029560167] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.453 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpo95bqwsx" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.482 221324 DEBUG nova.storage.rbd_utils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] rbd image 1e30fc0d-e774-4cad-b497-496833551f8b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.487 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/disk.config 1e30fc0d-e774-4cad-b497-496833551f8b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.515 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:52:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:07.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.862 221324 DEBUG oslo_concurrency.processutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/disk.config 1e30fc0d-e774-4cad-b497-496833551f8b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.375s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.862 221324 INFO nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Deleting local config drive /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b/disk.config because it was imported into RBD.#033[00m
Jan 31 03:52:07 np0005603622 kernel: tap93b367b9-c0: entered promiscuous mode
Jan 31 03:52:07 np0005603622 NetworkManager[49080]: <info>  [1769849527.9032] manager: (tap93b367b9-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/352)
Jan 31 03:52:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:07Z|00803|binding|INFO|Claiming lport 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 for this chassis.
Jan 31 03:52:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:07Z|00804|binding|INFO|93b367b9-c0d7-4f7e-98bf-a33da41aa3d2: Claiming fa:16:3e:26:b5:e7 10.100.0.8
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.905 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:07Z|00805|binding|INFO|Setting lport 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 ovn-installed in OVS
Jan 31 03:52:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:07Z|00806|binding|INFO|Setting lport 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 up in Southbound
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.913 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b5:e7 10.100.0.8'], port_security=['fa:16:3e:26:b5:e7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e30fc0d-e774-4cad-b497-496833551f8b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a02f269a-650e-4227-8352-05abf2566c17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4b0a4a7-6b42-4161-a5a6-7829375769dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97876448-21a4-4b64-9452-bd401dfcc8ac, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.914 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.916 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 in datapath a02f269a-650e-4227-8352-05abf2566c17 bound to our chassis#033[00m
Jan 31 03:52:07 np0005603622 nova_compute[221301]: 2026-01-31 08:52:07.917 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.918 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a02f269a-650e-4227-8352-05abf2566c17#033[00m
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.927 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0f054eb9-6262-44d6-a420-1b36c0b86372]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.928 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa02f269a-61 in ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.931 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa02f269a-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.931 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[22029291-cd14-4d47-84da-73e0d48deec2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.932 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bb669266-a3b4-450b-bf69-45db4bc2c054]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:07.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.945 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[5abc5709-4059-40af-bdee-90249f79fda2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:07 np0005603622 systemd-udevd[289999]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:52:07 np0005603622 systemd-machined[190406]: New machine qemu-84-instance-000000ad.
Jan 31 03:52:07 np0005603622 NetworkManager[49080]: <info>  [1769849527.9577] device (tap93b367b9-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:52:07 np0005603622 NetworkManager[49080]: <info>  [1769849527.9583] device (tap93b367b9-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:52:07 np0005603622 systemd[1]: Started Virtual Machine qemu-84-instance-000000ad.
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.970 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[67b6b232-38bf-4fe2-8a9a-6b575b436c41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:07 np0005603622 podman[289982]: 2026-01-31 08:52:07.991851513 +0000 UTC m=+0.057850810 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 31 03:52:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:07.996 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[dcbb91b9-28c1-40b5-bbf7-d933c3cd48b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 NetworkManager[49080]: <info>  [1769849528.0025] manager: (tapa02f269a-60): new Veth device (/org/freedesktop/NetworkManager/Devices/353)
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.001 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[09951449-bd76-4fa2-98e5-39794b91fc17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.023 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbd147f-bef3-46b3-8326-3be4f5a42d97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.027 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b8590649-634d-45d8-b9be-52595f3db4db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 NetworkManager[49080]: <info>  [1769849528.0415] device (tapa02f269a-60): carrier: link connected
Jan 31 03:52:08 np0005603622 podman[289981]: 2026-01-31 08:52:08.046310171 +0000 UTC m=+0.114719682 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.045 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b061d90a-d895-4bd4-8665-d41bb861dc5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.059 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b02ff301-516d-4b96-ba23-6eaea5478de0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa02f269a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:e9:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 877938, 'reachable_time': 42570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290058, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.070 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[198c1b9c-5b46-400f-b74d-778528a57d1c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:e973'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 877938, 'tstamp': 877938}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290059, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.084 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[25ee7bc4-3adf-486e-a490-4caa396f31b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa02f269a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:e9:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 877938, 'reachable_time': 42570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 290060, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.105 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e53aa3-1657-488a-be74-992850b61112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.148 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2734e797-abc2-40e9-a8cd-6bea6af5562e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.149 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa02f269a-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.150 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.150 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa02f269a-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:08 np0005603622 NetworkManager[49080]: <info>  [1769849528.1527] manager: (tapa02f269a-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 31 03:52:08 np0005603622 kernel: tapa02f269a-60: entered promiscuous mode
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.152 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.157 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa02f269a-60, col_values=(('external_ids', {'iface-id': '2c775482-0f82-4695-be62-4a95328fbf79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.158 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:08 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:08Z|00807|binding|INFO|Releasing lport 2c775482-0f82-4695-be62-4a95328fbf79 from this chassis (sb_readonly=0)
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.158 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.159 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.163 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.162 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[974691f9-0803-40a4-83ad-51faf7042615]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.164 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-a02f269a-650e-4227-8352-05abf2566c17
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/a02f269a-650e-4227-8352-05abf2566c17.pid.haproxy
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID a02f269a-650e-4227-8352-05abf2566c17
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:52:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:08.165 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'env', 'PROCESS_TAG=haproxy-a02f269a-650e-4227-8352-05abf2566c17', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a02f269a-650e-4227-8352-05abf2566c17.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.438 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849528.437605, 1e30fc0d-e774-4cad-b497-496833551f8b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.438 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.498 221324 DEBUG nova.compute.manager [req-7264c5cd-3f69-470f-ace6-38b39d12f158 req-b1ca05b0-81a9-452b-a439-b0bc8793749f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.499 221324 DEBUG oslo_concurrency.lockutils [req-7264c5cd-3f69-470f-ace6-38b39d12f158 req-b1ca05b0-81a9-452b-a439-b0bc8793749f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.499 221324 DEBUG oslo_concurrency.lockutils [req-7264c5cd-3f69-470f-ace6-38b39d12f158 req-b1ca05b0-81a9-452b-a439-b0bc8793749f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.499 221324 DEBUG oslo_concurrency.lockutils [req-7264c5cd-3f69-470f-ace6-38b39d12f158 req-b1ca05b0-81a9-452b-a439-b0bc8793749f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.500 221324 DEBUG nova.compute.manager [req-7264c5cd-3f69-470f-ace6-38b39d12f158 req-b1ca05b0-81a9-452b-a439-b0bc8793749f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Processing event network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.500 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:52:08 np0005603622 podman[290134]: 2026-01-31 08:52:08.503938602 +0000 UTC m=+0.057936262 container create 81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.504 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.508 221324 INFO nova.virt.libvirt.driver [-] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Instance spawned successfully.#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.508 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:52:08 np0005603622 systemd[1]: Started libpod-conmon-81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e.scope.
Jan 31 03:52:08 np0005603622 podman[290134]: 2026-01-31 08:52:08.467883547 +0000 UTC m=+0.021881207 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:52:08 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:52:08 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d988f0fccfd9548b4a3b421fc9f47c7b43ea487f0452476855c4f0d9f94bb571/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:52:08 np0005603622 podman[290134]: 2026-01-31 08:52:08.587583411 +0000 UTC m=+0.141581101 container init 81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:52:08 np0005603622 podman[290134]: 2026-01-31 08:52:08.592099892 +0000 UTC m=+0.146097552 container start 81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:52:08 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [NOTICE]   (290154) : New worker (290156) forked
Jan 31 03:52:08 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [NOTICE]   (290154) : Loading success.
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.733 221324 DEBUG nova.network.neutron [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updated VIF entry in instance network info cache for port 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.733 221324 DEBUG nova.network.neutron [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updating instance_info_cache with network_info: [{"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.738 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.738 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.738 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.738 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.739 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.763 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.769 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.774 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.774 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.775 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.775 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.776 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.776 221324 DEBUG nova.virt.libvirt.driver [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.869 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.870 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849528.4377854, 1e30fc0d-e774-4cad-b497-496833551f8b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.870 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.872 221324 DEBUG oslo_concurrency.lockutils [req-3f31f630-d5a1-475f-8d40-0663f9a2e4c7 req-5ff340f6-1383-4555-8085-6a1c153fa7a9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.912 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.915 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849528.5039485, 1e30fc0d-e774-4cad-b497-496833551f8b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.915 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.942 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.945 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.982 221324 INFO nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Took 7.83 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.983 221324 DEBUG nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:08 np0005603622 nova_compute[221301]: 2026-01-31 08:52:08.990 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.157 221324 INFO nova.compute.manager [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Took 9.13 seconds to build instance.#033[00m
Jan 31 03:52:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/746702742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.195 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.208 221324 DEBUG oslo_concurrency.lockutils [None req-b66e039a-0d98-447e-890e-932b1c19a199 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.275 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.275 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.278 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.278 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.281 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.282 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000ad as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.443 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.444 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3760MB free_disk=20.69357681274414GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.529 221324 INFO nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updating resource usage from migration dfacf1ec-776d-4ad2-a40b-20488a596d16#033[00m
Jan 31 03:52:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:09.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.568 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.568 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 1e30fc0d-e774-4cad-b497-496833551f8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.568 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Migration dfacf1ec-776d-4ad2-a40b-20488a596d16 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.569 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.569 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.597 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.638 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.638 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.657 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.688 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.763 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:09 np0005603622 nova_compute[221301]: 2026-01-31 08:52:09.788 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:09.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1365022218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.236 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.242 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.274 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.334 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.334 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.379 221324 INFO nova.virt.libvirt.driver [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:52:10 np0005603622 kernel: tap5d9caf6f-46 (unregistering): left promiscuous mode
Jan 31 03:52:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:10 np0005603622 NetworkManager[49080]: <info>  [1769849530.5700] device (tap5d9caf6f-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.582 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:10Z|00808|binding|INFO|Releasing lport 5d9caf6f-4602-4589-9c08-43b1a20a9c34 from this chassis (sb_readonly=0)
Jan 31 03:52:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:10Z|00809|binding|INFO|Setting lport 5d9caf6f-4602-4589-9c08-43b1a20a9c34 down in Southbound
Jan 31 03:52:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:10Z|00810|binding|INFO|Removing iface tap5d9caf6f-46 ovn-installed in OVS
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.585 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:10.593 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:8d:1c 10.100.0.13'], port_security=['fa:16:3e:ed:8d:1c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f94c6ed-71ba-4114-a483-6969c923a169', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '4', 'neutron:security_group_ids': '823dcd1e-c848-4b0b-b209-d3c49f7c199a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2593ca0c-7180-414a-b815-a3d2e8e5bf3e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=5d9caf6f-4602-4589-9c08-43b1a20a9c34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:52:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:10.594 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 5d9caf6f-4602-4589-9c08-43b1a20a9c34 in datapath 1f94c6ed-71ba-4114-a483-6969c923a169 unbound from our chassis#033[00m
Jan 31 03:52:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:10.595 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f94c6ed-71ba-4114-a483-6969c923a169, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.595 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:10.596 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88b296e6-2154-46ce-9a3e-54e63bebd7cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:10.597 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169 namespace which is not needed anymore#033[00m
Jan 31 03:52:10 np0005603622 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 31 03:52:10 np0005603622 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000aa.scope: Consumed 14.197s CPU time.
Jan 31 03:52:10 np0005603622 systemd-machined[190406]: Machine qemu-83-instance-000000aa terminated.
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.675 221324 DEBUG nova.compute.manager [req-b931e11d-ed8e-4cbd-ad24-1aa457c28c8e req-3cecf8da-a2e4-41c8-8f0e-ab30f8bbe758 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.677 221324 DEBUG oslo_concurrency.lockutils [req-b931e11d-ed8e-4cbd-ad24-1aa457c28c8e req-3cecf8da-a2e4-41c8-8f0e-ab30f8bbe758 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.678 221324 DEBUG oslo_concurrency.lockutils [req-b931e11d-ed8e-4cbd-ad24-1aa457c28c8e req-3cecf8da-a2e4-41c8-8f0e-ab30f8bbe758 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.678 221324 DEBUG oslo_concurrency.lockutils [req-b931e11d-ed8e-4cbd-ad24-1aa457c28c8e req-3cecf8da-a2e4-41c8-8f0e-ab30f8bbe758 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.679 221324 DEBUG nova.compute.manager [req-b931e11d-ed8e-4cbd-ad24-1aa457c28c8e req-3cecf8da-a2e4-41c8-8f0e-ab30f8bbe758 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] No waiting events found dispatching network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.680 221324 WARNING nova.compute.manager [req-b931e11d-ed8e-4cbd-ad24-1aa457c28c8e req-3cecf8da-a2e4-41c8-8f0e-ab30f8bbe758 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received unexpected event network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:52:10 np0005603622 neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169[289641]: [NOTICE]   (289645) : haproxy version is 2.8.14-c23fe91
Jan 31 03:52:10 np0005603622 neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169[289641]: [NOTICE]   (289645) : path to executable is /usr/sbin/haproxy
Jan 31 03:52:10 np0005603622 neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169[289641]: [WARNING]  (289645) : Exiting Master process...
Jan 31 03:52:10 np0005603622 neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169[289641]: [ALERT]    (289645) : Current worker (289647) exited with code 143 (Terminated)
Jan 31 03:52:10 np0005603622 neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169[289641]: [WARNING]  (289645) : All workers exited. Exiting... (0)
Jan 31 03:52:10 np0005603622 systemd[1]: libpod-de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47.scope: Deactivated successfully.
Jan 31 03:52:10 np0005603622 podman[290232]: 2026-01-31 08:52:10.7211753 +0000 UTC m=+0.046316291 container died de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:52:10 np0005603622 NetworkManager[49080]: <info>  [1769849530.8007] manager: (tap5d9caf6f-46): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.820 221324 INFO nova.virt.libvirt.driver [-] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Instance destroyed successfully.#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.822 221324 DEBUG nova.virt.libvirt.vif [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:51:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1107028276',display_name='tempest-TestNetworkAdvancedServerOps-server-1107028276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1107028276',id=170,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMhAjak4EaSbAMsOQHzT/F+YD+mdwqaKmT1b8Pkiv6vPXQSXr3cEJwaMw5cOEGrpti6B+hT6jgk8eQer/fm3Y87ortF0Suf8ZM3a30yTbd7sIeUbybs0ERwtVLRyiyyflg==',key_name='tempest-TestNetworkAdvancedServerOps-1864875451',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:51:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-g52p6b46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:52:03Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2122556826", "vif_mac": "fa:16:3e:ed:8d:1c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.822 221324 DEBUG nova.network.os_vif_util [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2122556826", "vif_mac": "fa:16:3e:ed:8d:1c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.823 221324 DEBUG nova.network.os_vif_util [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.824 221324 DEBUG os_vif [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.826 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.826 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d9caf6f-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.827 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.829 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:52:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47-userdata-shm.mount: Deactivated successfully.
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.832 221324 INFO os_vif [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46')#033[00m
Jan 31 03:52:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay-45a735526c49a3d4c1632fae08892c7809d02325f28d9f28768fe0e0594740e6-merged.mount: Deactivated successfully.
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.838 221324 DEBUG nova.virt.libvirt.driver [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:10 np0005603622 nova_compute[221301]: 2026-01-31 08:52:10.838 221324 DEBUG nova.virt.libvirt.driver [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:52:10 np0005603622 podman[290232]: 2026-01-31 08:52:10.864957289 +0000 UTC m=+0.190098270 container cleanup de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:52:10 np0005603622 systemd[1]: libpod-conmon-de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47.scope: Deactivated successfully.
Jan 31 03:52:11 np0005603622 podman[290265]: 2026-01-31 08:52:11.017369588 +0000 UTC m=+0.129014514 container remove de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.022 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cef6e547-ee8e-4ab5-8357-30242fa70865]: (4, ('Sat Jan 31 08:52:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169 (de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47)\nde1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47\nSat Jan 31 08:52:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169 (de1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47)\nde1959b6691b432582d3004676943c61e47e0ceaac8fb749856294d822ef7f47\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.025 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e7d87f-3e7a-4193-982e-3a60d6047cb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.026 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f94c6ed-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.028 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:11 np0005603622 kernel: tap1f94c6ed-70: left promiscuous mode
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.034 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.038 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d58a971c-ba09-40f6-86ac-06a7c7c1185f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.061 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5e91c30a-b31a-4f4b-9491-c7923f6a20c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.063 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[47815631-3f4a-48cb-ad2b-a00b01014c87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.074 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1753b737-44f0-4165-a2bf-1680f1845041]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874955, 'reachable_time': 24532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290282, 'error': None, 'target': 'ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:11 np0005603622 systemd[1]: run-netns-ovnmeta\x2d1f94c6ed\x2d71ba\x2d4114\x2da483\x2d6969c923a169.mount: Deactivated successfully.
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.078 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f94c6ed-71ba-4114-a483-6969c923a169 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:52:11 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:11.078 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[e2409cdd-5934-4d2b-b3fb-e606092f9beb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.220 221324 DEBUG neutronclient.v2_0.client [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.384 221324 DEBUG oslo_concurrency.lockutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.385 221324 DEBUG oslo_concurrency.lockutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.385 221324 DEBUG oslo_concurrency.lockutils [None req-2ad6268d-fc06-4bb2-bbeb-fb778b5feddf f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.520 221324 DEBUG nova.compute.manager [req-c8b27aae-22e4-4ac5-95d9-2b862100fb5b req-1fea4f62-53f1-4f16-a906-5eae62515a0d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-vif-unplugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.521 221324 DEBUG oslo_concurrency.lockutils [req-c8b27aae-22e4-4ac5-95d9-2b862100fb5b req-1fea4f62-53f1-4f16-a906-5eae62515a0d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.521 221324 DEBUG oslo_concurrency.lockutils [req-c8b27aae-22e4-4ac5-95d9-2b862100fb5b req-1fea4f62-53f1-4f16-a906-5eae62515a0d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.521 221324 DEBUG oslo_concurrency.lockutils [req-c8b27aae-22e4-4ac5-95d9-2b862100fb5b req-1fea4f62-53f1-4f16-a906-5eae62515a0d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.522 221324 DEBUG nova.compute.manager [req-c8b27aae-22e4-4ac5-95d9-2b862100fb5b req-1fea4f62-53f1-4f16-a906-5eae62515a0d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] No waiting events found dispatching network-vif-unplugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:11 np0005603622 nova_compute[221301]: 2026-01-31 08:52:11.522 221324 WARNING nova.compute.manager [req-c8b27aae-22e4-4ac5-95d9-2b862100fb5b req-1fea4f62-53f1-4f16-a906-5eae62515a0d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received unexpected event network-vif-unplugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:52:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:11.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:11.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:12 np0005603622 nova_compute[221301]: 2026-01-31 08:52:12.335 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:12 np0005603622 nova_compute[221301]: 2026-01-31 08:52:12.336 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:13.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:13 np0005603622 nova_compute[221301]: 2026-01-31 08:52:13.772 221324 DEBUG nova.compute.manager [req-c54f7d54-5e45-4f30-aa9b-91f100fd76b4 req-c69215c8-ba74-4e1e-9950-d777a2290afb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:13 np0005603622 nova_compute[221301]: 2026-01-31 08:52:13.772 221324 DEBUG oslo_concurrency.lockutils [req-c54f7d54-5e45-4f30-aa9b-91f100fd76b4 req-c69215c8-ba74-4e1e-9950-d777a2290afb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:13 np0005603622 nova_compute[221301]: 2026-01-31 08:52:13.773 221324 DEBUG oslo_concurrency.lockutils [req-c54f7d54-5e45-4f30-aa9b-91f100fd76b4 req-c69215c8-ba74-4e1e-9950-d777a2290afb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:13 np0005603622 nova_compute[221301]: 2026-01-31 08:52:13.773 221324 DEBUG oslo_concurrency.lockutils [req-c54f7d54-5e45-4f30-aa9b-91f100fd76b4 req-c69215c8-ba74-4e1e-9950-d777a2290afb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:13 np0005603622 nova_compute[221301]: 2026-01-31 08:52:13.773 221324 DEBUG nova.compute.manager [req-c54f7d54-5e45-4f30-aa9b-91f100fd76b4 req-c69215c8-ba74-4e1e-9950-d777a2290afb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] No waiting events found dispatching network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:13 np0005603622 nova_compute[221301]: 2026-01-31 08:52:13.774 221324 WARNING nova.compute.manager [req-c54f7d54-5e45-4f30-aa9b-91f100fd76b4 req-c69215c8-ba74-4e1e-9950-d777a2290afb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received unexpected event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 31 03:52:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:13.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:14.292 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:52:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:14.294 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:52:14 np0005603622 nova_compute[221301]: 2026-01-31 08:52:14.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:14 np0005603622 nova_compute[221301]: 2026-01-31 08:52:14.765 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:14 np0005603622 nova_compute[221301]: 2026-01-31 08:52:14.998 221324 DEBUG nova.compute.manager [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-changed-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:14 np0005603622 nova_compute[221301]: 2026-01-31 08:52:14.998 221324 DEBUG nova.compute.manager [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Refreshing instance network info cache due to event network-changed-5d9caf6f-4602-4589-9c08-43b1a20a9c34. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:52:14 np0005603622 nova_compute[221301]: 2026-01-31 08:52:14.999 221324 DEBUG oslo_concurrency.lockutils [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:52:14 np0005603622 nova_compute[221301]: 2026-01-31 08:52:14.999 221324 DEBUG oslo_concurrency.lockutils [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:52:14 np0005603622 nova_compute[221301]: 2026-01-31 08:52:14.999 221324 DEBUG nova.network.neutron [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Refreshing network info cache for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:52:15 np0005603622 nova_compute[221301]: 2026-01-31 08:52:15.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:52:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:15.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:15 np0005603622 nova_compute[221301]: 2026-01-31 08:52:15.830 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:15 np0005603622 nova_compute[221301]: 2026-01-31 08:52:15.923 221324 DEBUG nova.compute.manager [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-changed-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:15 np0005603622 nova_compute[221301]: 2026-01-31 08:52:15.923 221324 DEBUG nova.compute.manager [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Refreshing instance network info cache due to event network-changed-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:52:15 np0005603622 nova_compute[221301]: 2026-01-31 08:52:15.924 221324 DEBUG oslo_concurrency.lockutils [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:52:15 np0005603622 nova_compute[221301]: 2026-01-31 08:52:15.924 221324 DEBUG oslo_concurrency.lockutils [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:52:15 np0005603622 nova_compute[221301]: 2026-01-31 08:52:15.924 221324 DEBUG nova.network.neutron [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Refreshing network info cache for port 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:52:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:15.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e369 e369: 3 total, 3 up, 3 in
Jan 31 03:52:16 np0005603622 nova_compute[221301]: 2026-01-31 08:52:16.865 221324 DEBUG nova.network.neutron [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updated VIF entry in instance network info cache for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:52:16 np0005603622 nova_compute[221301]: 2026-01-31 08:52:16.865 221324 DEBUG nova.network.neutron [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updating instance_info_cache with network_info: [{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:16 np0005603622 nova_compute[221301]: 2026-01-31 08:52:16.893 221324 DEBUG oslo_concurrency.lockutils [req-f32b06b3-0fcb-41b5-9bfd-6323310c51b5 req-32cc343d-1115-4313-aa94-e7ab2f649ca9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:52:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:17.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:17 np0005603622 nova_compute[221301]: 2026-01-31 08:52:17.616 221324 DEBUG nova.network.neutron [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updated VIF entry in instance network info cache for port 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:52:17 np0005603622 nova_compute[221301]: 2026-01-31 08:52:17.616 221324 DEBUG nova.network.neutron [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updating instance_info_cache with network_info: [{"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:17 np0005603622 nova_compute[221301]: 2026-01-31 08:52:17.650 221324 DEBUG oslo_concurrency.lockutils [req-c347486a-29ba-4218-a89f-bc99220a2124 req-dceb0b16-5b5a-48d1-999d-b48b6855cf3e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-1e30fc0d-e774-4cad-b497-496833551f8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:52:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:52:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:17.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:52:18 np0005603622 nova_compute[221301]: 2026-01-31 08:52:18.618 221324 DEBUG nova.compute.manager [req-82c057ee-b492-4fbb-b35e-01d1eee60c2e req-b762cb2a-a76c-463c-81f5-439bbccd6ea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:18 np0005603622 nova_compute[221301]: 2026-01-31 08:52:18.619 221324 DEBUG oslo_concurrency.lockutils [req-82c057ee-b492-4fbb-b35e-01d1eee60c2e req-b762cb2a-a76c-463c-81f5-439bbccd6ea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:18 np0005603622 nova_compute[221301]: 2026-01-31 08:52:18.619 221324 DEBUG oslo_concurrency.lockutils [req-82c057ee-b492-4fbb-b35e-01d1eee60c2e req-b762cb2a-a76c-463c-81f5-439bbccd6ea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:18 np0005603622 nova_compute[221301]: 2026-01-31 08:52:18.620 221324 DEBUG oslo_concurrency.lockutils [req-82c057ee-b492-4fbb-b35e-01d1eee60c2e req-b762cb2a-a76c-463c-81f5-439bbccd6ea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:18 np0005603622 nova_compute[221301]: 2026-01-31 08:52:18.620 221324 DEBUG nova.compute.manager [req-82c057ee-b492-4fbb-b35e-01d1eee60c2e req-b762cb2a-a76c-463c-81f5-439bbccd6ea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] No waiting events found dispatching network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:18 np0005603622 nova_compute[221301]: 2026-01-31 08:52:18.620 221324 WARNING nova.compute.manager [req-82c057ee-b492-4fbb-b35e-01d1eee60c2e req-b762cb2a-a76c-463c-81f5-439bbccd6ea3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received unexpected event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 31 03:52:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:19.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:19 np0005603622 nova_compute[221301]: 2026-01-31 08:52:19.767 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:19.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e370 e370: 3 total, 3 up, 3 in
Jan 31 03:52:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:20 np0005603622 nova_compute[221301]: 2026-01-31 08:52:20.832 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:20 np0005603622 nova_compute[221301]: 2026-01-31 08:52:20.922 221324 DEBUG nova.compute.manager [req-52a4bc47-63ce-4889-886f-d41b6381d910 req-6f54317f-70d4-4055-ba33-ab4f1b319961 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:20 np0005603622 nova_compute[221301]: 2026-01-31 08:52:20.923 221324 DEBUG oslo_concurrency.lockutils [req-52a4bc47-63ce-4889-886f-d41b6381d910 req-6f54317f-70d4-4055-ba33-ab4f1b319961 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:20 np0005603622 nova_compute[221301]: 2026-01-31 08:52:20.923 221324 DEBUG oslo_concurrency.lockutils [req-52a4bc47-63ce-4889-886f-d41b6381d910 req-6f54317f-70d4-4055-ba33-ab4f1b319961 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:20 np0005603622 nova_compute[221301]: 2026-01-31 08:52:20.924 221324 DEBUG oslo_concurrency.lockutils [req-52a4bc47-63ce-4889-886f-d41b6381d910 req-6f54317f-70d4-4055-ba33-ab4f1b319961 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:20 np0005603622 nova_compute[221301]: 2026-01-31 08:52:20.924 221324 DEBUG nova.compute.manager [req-52a4bc47-63ce-4889-886f-d41b6381d910 req-6f54317f-70d4-4055-ba33-ab4f1b319961 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] No waiting events found dispatching network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:20 np0005603622 nova_compute[221301]: 2026-01-31 08:52:20.924 221324 WARNING nova.compute.manager [req-52a4bc47-63ce-4889-886f-d41b6381d910 req-6f54317f-70d4-4055-ba33-ab4f1b319961 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Received unexpected event network-vif-plugged-5d9caf6f-4602-4589-9c08-43b1a20a9c34 for instance with vm_state resized and task_state None.#033[00m
Jan 31 03:52:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:21.299 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:21.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:21 np0005603622 nova_compute[221301]: 2026-01-31 08:52:21.870 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:21 np0005603622 nova_compute[221301]: 2026-01-31 08:52:21.871 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:21 np0005603622 nova_compute[221301]: 2026-01-31 08:52:21.872 221324 DEBUG nova.compute.manager [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Going to confirm migration 24 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 31 03:52:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:21.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:22 np0005603622 nova_compute[221301]: 2026-01-31 08:52:22.791 221324 DEBUG neutronclient.v2_0.client [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 5d9caf6f-4602-4589-9c08-43b1a20a9c34 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 31 03:52:22 np0005603622 nova_compute[221301]: 2026-01-31 08:52:22.793 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:52:22 np0005603622 nova_compute[221301]: 2026-01-31 08:52:22.794 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:52:22 np0005603622 nova_compute[221301]: 2026-01-31 08:52:22.794 221324 DEBUG nova.network.neutron [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:52:22 np0005603622 nova_compute[221301]: 2026-01-31 08:52:22.795 221324 DEBUG nova.objects.instance [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'info_cache' on Instance uuid e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:23Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:26:b5:e7 10.100.0.8
Jan 31 03:52:23 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:23Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:b5:e7 10.100.0.8
Jan 31 03:52:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:23.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:52:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:23.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:52:24 np0005603622 nova_compute[221301]: 2026-01-31 08:52:24.769 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:25.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:25 np0005603622 nova_compute[221301]: 2026-01-31 08:52:25.819 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849530.817046, e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:25 np0005603622 nova_compute[221301]: 2026-01-31 08:52:25.819 221324 INFO nova.compute.manager [-] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:52:25 np0005603622 nova_compute[221301]: 2026-01-31 08:52:25.864 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:25 np0005603622 nova_compute[221301]: 2026-01-31 08:52:25.884 221324 DEBUG nova.compute.manager [None req-cba6e5d8-28e6-4d84-9e69-a483ab646931 - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:25 np0005603622 nova_compute[221301]: 2026-01-31 08:52:25.888 221324 DEBUG nova.compute.manager [None req-cba6e5d8-28e6-4d84-9e69-a483ab646931 - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:52:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:25.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:25 np0005603622 nova_compute[221301]: 2026-01-31 08:52:25.988 221324 INFO nova.compute.manager [None req-cba6e5d8-28e6-4d84-9e69-a483ab646931 - - - - - -] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 31 03:52:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 03:52:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.0 total, 600.0 interval#012Cumulative writes: 14K writes, 73K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.03 MB/s#012Cumulative WAL: 14K writes, 14K syncs, 1.00 writes per sync, written: 0.15 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1586 writes, 7487 keys, 1586 commit groups, 1.0 writes per commit group, ingest: 15.45 MB, 0.03 MB/s#012Interval WAL: 1586 writes, 1586 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     39.2      2.29              0.29        46    0.050       0      0       0.0       0.0#012  L6      1/0   10.53 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   5.0     86.4     73.5      6.15              1.48        45    0.137    318K    24K       0.0       0.0#012 Sum      1/0   10.53 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.0     63.0     64.2      8.45              1.78        91    0.093    318K    24K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.4     70.5     69.6      0.90              0.21        10    0.090     47K   2604       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0     86.4     73.5      6.15              1.48        45    0.137    318K    24K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     39.3      2.29              0.29        45    0.051       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 5400.0 total, 600.0 interval#012Flush(GB): cumulative 0.088, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.53 GB write, 0.10 MB/s write, 0.52 GB read, 0.10 MB/s read, 8.4 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 58.42 MB table_size: 0 occupancy: 18446744073709551615 collections: 10 last_copies: 0 last_secs: 0.001157 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3330,56.08 MB,18.4476%) FilterBlock(91,890.42 KB,0.286037%) IndexBlock(91,1.47 MB,0.483899%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 03:52:27 np0005603622 nova_compute[221301]: 2026-01-31 08:52:27.222 221324 DEBUG nova.network.neutron [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8] Updating instance_info_cache with network_info: [{"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:27 np0005603622 nova_compute[221301]: 2026-01-31 08:52:27.406 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:52:27 np0005603622 nova_compute[221301]: 2026-01-31 08:52:27.407 221324 DEBUG nova.objects.instance [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'migration_context' on Instance uuid e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:27.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e371 e371: 3 total, 3 up, 3 in
Jan 31 03:52:27 np0005603622 nova_compute[221301]: 2026-01-31 08:52:27.859 221324 DEBUG nova.storage.rbd_utils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] removing snapshot(nova-resize) on rbd image(e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 03:52:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:27.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e372 e372: 3 total, 3 up, 3 in
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.902 221324 DEBUG nova.virt.libvirt.vif [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:51:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1107028276',display_name='tempest-TestNetworkAdvancedServerOps-server-1107028276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1107028276',id=170,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMhAjak4EaSbAMsOQHzT/F+YD+mdwqaKmT1b8Pkiv6vPXQSXr3cEJwaMw5cOEGrpti6B+hT6jgk8eQer/fm3Y87ortF0Suf8ZM3a30yTbd7sIeUbybs0ERwtVLRyiyyflg==',key_name='tempest-TestNetworkAdvancedServerOps-1864875451',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:52:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-g52p6b46',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:52:18Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.903 221324 DEBUG nova.network.os_vif_util [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "address": "fa:16:3e:ed:8d:1c", "network": {"id": "1f94c6ed-71ba-4114-a483-6969c923a169", "bridge": "br-int", "label": "tempest-network-smoke--2122556826", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d9caf6f-46", "ovs_interfaceid": "5d9caf6f-4602-4589-9c08-43b1a20a9c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.904 221324 DEBUG nova.network.os_vif_util [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.905 221324 DEBUG os_vif [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.908 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.908 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d9caf6f-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.909 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.912 221324 INFO os_vif [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8d:1c,bridge_name='br-int',has_traffic_filtering=True,id=5d9caf6f-4602-4589-9c08-43b1a20a9c34,network=Network(1f94c6ed-71ba-4114-a483-6969c923a169),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d9caf6f-46')#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.912 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:28 np0005603622 nova_compute[221301]: 2026-01-31 08:52:28.913 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:29 np0005603622 nova_compute[221301]: 2026-01-31 08:52:29.132 221324 DEBUG oslo_concurrency.processutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1908463231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:29 np0005603622 nova_compute[221301]: 2026-01-31 08:52:29.560 221324 DEBUG oslo_concurrency.processutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:29.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:29 np0005603622 nova_compute[221301]: 2026-01-31 08:52:29.567 221324 DEBUG nova.compute.provider_tree [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:52:29 np0005603622 nova_compute[221301]: 2026-01-31 08:52:29.592 221324 DEBUG nova.scheduler.client.report [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:52:29 np0005603622 nova_compute[221301]: 2026-01-31 08:52:29.737 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:29 np0005603622 nova_compute[221301]: 2026-01-31 08:52:29.773 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:29.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:30 np0005603622 nova_compute[221301]: 2026-01-31 08:52:30.127 221324 INFO nova.scheduler.client.report [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Deleted allocation for migration dfacf1ec-776d-4ad2-a40b-20488a596d16#033[00m
Jan 31 03:52:30 np0005603622 nova_compute[221301]: 2026-01-31 08:52:30.236 221324 DEBUG oslo_concurrency.lockutils [None req-915621c0-fa87-4e3b-8486-203f72a79da3 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "e8aff7bc-7d3c-4761-b7a3-849c89e7f1c8" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 8.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:30 np0005603622 nova_compute[221301]: 2026-01-31 08:52:30.866 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:30.910 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:30.911 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:30.912 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:31.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:31.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:32 np0005603622 nova_compute[221301]: 2026-01-31 08:52:32.235 221324 DEBUG oslo_concurrency.lockutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:32 np0005603622 nova_compute[221301]: 2026-01-31 08:52:32.236 221324 DEBUG oslo_concurrency.lockutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:32 np0005603622 nova_compute[221301]: 2026-01-31 08:52:32.325 221324 DEBUG nova.objects.instance [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'flavor' on Instance uuid 1e30fc0d-e774-4cad-b497-496833551f8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:32 np0005603622 nova_compute[221301]: 2026-01-31 08:52:32.424 221324 DEBUG oslo_concurrency.lockutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.252 221324 DEBUG oslo_concurrency.lockutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.252 221324 DEBUG oslo_concurrency.lockutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.252 221324 INFO nova.compute.manager [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Attaching volume 9af5cb67-e1aa-49f9-909d-b9b2d8642543 to /dev/vdb#033[00m
Jan 31 03:52:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:33.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.697 221324 DEBUG os_brick.utils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.699 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.709 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.709 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab0f0f9-0812-4cf3-9336-9ce897ac9bd0]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.711 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.717 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.718 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[0939d7a4-5e83-4374-a22f-0dc1af2b658f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.719 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.727 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.727 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d0a34f-6cbe-42b8-aa1d-f94991acdfc5]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.729 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[7ebdc0d9-2a68-48fb-b7c4-dfc54aa5da21]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.730 221324 DEBUG oslo_concurrency.processutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.755 221324 DEBUG oslo_concurrency.processutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "nvme version" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.758 221324 DEBUG os_brick.initiator.connectors.lightos [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.759 221324 DEBUG os_brick.initiator.connectors.lightos [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.759 221324 DEBUG os_brick.initiator.connectors.lightos [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.760 221324 DEBUG os_brick.utils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] <== get_connector_properties: return (62ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:52:33 np0005603622 nova_compute[221301]: 2026-01-31 08:52:33.760 221324 DEBUG nova.virt.block_device [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updating existing volume attachment record: 87a06692-0b3d-4740-a9ca-ef3072df60ea _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:52:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:33.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:52:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/347984763' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:52:34 np0005603622 nova_compute[221301]: 2026-01-31 08:52:34.809 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:34 np0005603622 nova_compute[221301]: 2026-01-31 08:52:34.821 221324 DEBUG nova.objects.instance [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'flavor' on Instance uuid 1e30fc0d-e774-4cad-b497-496833551f8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:34 np0005603622 nova_compute[221301]: 2026-01-31 08:52:34.850 221324 DEBUG nova.virt.libvirt.driver [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Attempting to attach volume 9af5cb67-e1aa-49f9-909d-b9b2d8642543 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:52:34 np0005603622 nova_compute[221301]: 2026-01-31 08:52:34.853 221324 DEBUG nova.virt.libvirt.guest [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:52:34 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-9af5cb67-e1aa-49f9-909d-b9b2d8642543">
Jan 31 03:52:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:52:34 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:52:34 np0005603622 nova_compute[221301]:  <serial>9af5cb67-e1aa-49f9-909d-b9b2d8642543</serial>
Jan 31 03:52:34 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:52:34 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:52:35 np0005603622 nova_compute[221301]: 2026-01-31 08:52:35.054 221324 DEBUG nova.virt.libvirt.driver [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:52:35 np0005603622 nova_compute[221301]: 2026-01-31 08:52:35.054 221324 DEBUG nova.virt.libvirt.driver [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:52:35 np0005603622 nova_compute[221301]: 2026-01-31 08:52:35.055 221324 DEBUG nova.virt.libvirt.driver [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:52:35 np0005603622 nova_compute[221301]: 2026-01-31 08:52:35.055 221324 DEBUG nova.virt.libvirt.driver [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] No VIF found with MAC fa:16:3e:26:b5:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:52:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:35.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:35 np0005603622 nova_compute[221301]: 2026-01-31 08:52:35.727 221324 DEBUG oslo_concurrency.lockutils [None req-c3aff7c7-b493-47dc-831d-b6e84bb243fa 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:35 np0005603622 nova_compute[221301]: 2026-01-31 08:52:35.912 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:35.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.183 221324 DEBUG oslo_concurrency.lockutils [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.183 221324 DEBUG oslo_concurrency.lockutils [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 e373: 3 total, 3 up, 3 in
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.300 221324 INFO nova.compute.manager [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Detaching volume 9af5cb67-e1aa-49f9-909d-b9b2d8642543#033[00m
Jan 31 03:52:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:37.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.638 221324 INFO nova.virt.block_device [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Attempting to driver detach volume 9af5cb67-e1aa-49f9-909d-b9b2d8642543 from mountpoint /dev/vdb#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.650 221324 DEBUG nova.virt.libvirt.driver [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Attempting to detach device vdb from instance 1e30fc0d-e774-4cad-b497-496833551f8b from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.651 221324 DEBUG nova.virt.libvirt.guest [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-9af5cb67-e1aa-49f9-909d-b9b2d8642543">
Jan 31 03:52:37 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <serial>9af5cb67-e1aa-49f9-909d-b9b2d8642543</serial>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:52:37 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.661 221324 INFO nova.virt.libvirt.driver [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Successfully detached device vdb from instance 1e30fc0d-e774-4cad-b497-496833551f8b from the persistent domain config.#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.661 221324 DEBUG nova.virt.libvirt.driver [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 1e30fc0d-e774-4cad-b497-496833551f8b from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.662 221324 DEBUG nova.virt.libvirt.guest [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-9af5cb67-e1aa-49f9-909d-b9b2d8642543">
Jan 31 03:52:37 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <serial>9af5cb67-e1aa-49f9-909d-b9b2d8642543</serial>
Jan 31 03:52:37 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:52:37 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:52:37 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.772 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769849557.771873, 1e30fc0d-e774-4cad-b497-496833551f8b => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.773 221324 DEBUG nova.virt.libvirt.driver [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 1e30fc0d-e774-4cad-b497-496833551f8b _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:52:37 np0005603622 nova_compute[221301]: 2026-01-31 08:52:37.775 221324 INFO nova.virt.libvirt.driver [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Successfully detached device vdb from instance 1e30fc0d-e774-4cad-b497-496833551f8b from the live domain config.#033[00m
Jan 31 03:52:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:37.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:38 np0005603622 podman[290372]: 2026-01-31 08:52:38.148406613 +0000 UTC m=+0.063704707 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:52:38 np0005603622 podman[290373]: 2026-01-31 08:52:38.232208176 +0000 UTC m=+0.145700851 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 03:52:38 np0005603622 nova_compute[221301]: 2026-01-31 08:52:38.373 221324 DEBUG nova.objects.instance [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'flavor' on Instance uuid 1e30fc0d-e774-4cad-b497-496833551f8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:38 np0005603622 nova_compute[221301]: 2026-01-31 08:52:38.481 221324 DEBUG oslo_concurrency.lockutils [None req-00285e7d-7432-483c-a35b-de9e9d19e223 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:39.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:39 np0005603622 nova_compute[221301]: 2026-01-31 08:52:39.812 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:39.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:40 np0005603622 nova_compute[221301]: 2026-01-31 08:52:40.916 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.010 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.011 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.011 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.012 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.012 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.014 221324 INFO nova.compute.manager [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Terminating instance#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.015 221324 DEBUG nova.compute.manager [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:52:41 np0005603622 kernel: tap93b367b9-c0 (unregistering): left promiscuous mode
Jan 31 03:52:41 np0005603622 NetworkManager[49080]: <info>  [1769849561.2349] device (tap93b367b9-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:41Z|00811|binding|INFO|Releasing lport 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 from this chassis (sb_readonly=0)
Jan 31 03:52:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:41Z|00812|binding|INFO|Setting lport 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 down in Southbound
Jan 31 03:52:41 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:41Z|00813|binding|INFO|Removing iface tap93b367b9-c0 ovn-installed in OVS
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.241 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.245 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:41 np0005603622 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Jan 31 03:52:41 np0005603622 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ad.scope: Consumed 14.479s CPU time.
Jan 31 03:52:41 np0005603622 systemd-machined[190406]: Machine qemu-84-instance-000000ad terminated.
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.314 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b5:e7 10.100.0.8'], port_security=['fa:16:3e:26:b5:e7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1e30fc0d-e774-4cad-b497-496833551f8b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a02f269a-650e-4227-8352-05abf2566c17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76fb5cb7abcd4d74abfc471a96bbd12c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4b0a4a7-6b42-4161-a5a6-7829375769dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97876448-21a4-4b64-9452-bd401dfcc8ac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.315 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 in datapath a02f269a-650e-4227-8352-05abf2566c17 unbound from our chassis#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.316 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a02f269a-650e-4227-8352-05abf2566c17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.317 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b46b20e7-c093-4732-bb09-b8e835f101f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.318 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 namespace which is not needed anymore#033[00m
Jan 31 03:52:41 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [NOTICE]   (290154) : haproxy version is 2.8.14-c23fe91
Jan 31 03:52:41 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [NOTICE]   (290154) : path to executable is /usr/sbin/haproxy
Jan 31 03:52:41 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [WARNING]  (290154) : Exiting Master process...
Jan 31 03:52:41 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [WARNING]  (290154) : Exiting Master process...
Jan 31 03:52:41 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [ALERT]    (290154) : Current worker (290156) exited with code 143 (Terminated)
Jan 31 03:52:41 np0005603622 neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17[290150]: [WARNING]  (290154) : All workers exited. Exiting... (0)
Jan 31 03:52:41 np0005603622 systemd[1]: libpod-81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e.scope: Deactivated successfully.
Jan 31 03:52:41 np0005603622 podman[290692]: 2026-01-31 08:52:41.424385075 +0000 UTC m=+0.042937191 container died 81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.444 221324 INFO nova.virt.libvirt.driver [-] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Instance destroyed successfully.#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.445 221324 DEBUG nova.objects.instance [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lazy-loading 'resources' on Instance uuid 1e30fc0d-e774-4cad-b497-496833551f8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e-userdata-shm.mount: Deactivated successfully.
Jan 31 03:52:41 np0005603622 systemd[1]: var-lib-containers-storage-overlay-d988f0fccfd9548b4a3b421fc9f47c7b43ea487f0452476855c4f0d9f94bb571-merged.mount: Deactivated successfully.
Jan 31 03:52:41 np0005603622 podman[290692]: 2026-01-31 08:52:41.459298469 +0000 UTC m=+0.077850585 container cleanup 81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:52:41 np0005603622 systemd[1]: libpod-conmon-81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e.scope: Deactivated successfully.
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.514 221324 DEBUG nova.virt.libvirt.vif [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:51:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1474271208',display_name='tempest-AttachVolumeNegativeTest-server-1474271208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1474271208',id=173,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH8uyo0SI9g4bhyXaUu/mgiRdZ9p5P7fj/0AXQr6KiCUFLPfccpQyldXU/JL23UYZE5aZv7vIKBfjnNipswNMAX6ybSgDqSjq/nAj/WykPc1IMgcwquHyEnNFgVLXNp54g==',key_name='tempest-keypair-1515101677',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:52:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76fb5cb7abcd4d74abfc471a96bbd12c',ramdisk_id='',reservation_id='r-sqz0x7qz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-457307401',owner_user_name='tempest-AttachVolumeNegativeTest-457307401-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:52:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3bd4ce8a916a4bdbbc988eb4fe32991e',uuid=1e30fc0d-e774-4cad-b497-496833551f8b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.515 221324 DEBUG nova.network.os_vif_util [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converting VIF {"id": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "address": "fa:16:3e:26:b5:e7", "network": {"id": "a02f269a-650e-4227-8352-05abf2566c17", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-245078866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76fb5cb7abcd4d74abfc471a96bbd12c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93b367b9-c0", "ovs_interfaceid": "93b367b9-c0d7-4f7e-98bf-a33da41aa3d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.517 221324 DEBUG nova.network.os_vif_util [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:b5:e7,bridge_name='br-int',has_traffic_filtering=True,id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93b367b9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.518 221324 DEBUG os_vif [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b5:e7,bridge_name='br-int',has_traffic_filtering=True,id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93b367b9-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.521 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:41 np0005603622 podman[290735]: 2026-01-31 08:52:41.521492304 +0000 UTC m=+0.043223568 container remove 81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.522 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93b367b9-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.524 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c853696f-c088-46f2-95ae-33753de5475f]: (4, ('Sat Jan 31 08:52:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 (81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e)\n81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e\nSat Jan 31 08:52:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 (81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e)\n81aee326b3ba6a1a92cfe6336b2cef9b1c5370ad58442d930a96dfb41d43c41e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.528 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[856886e9-19ab-4a59-a0c3-723050229fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.529 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa02f269a-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.529 221324 INFO os_vif [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:b5:e7,bridge_name='br-int',has_traffic_filtering=True,id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2,network=Network(a02f269a-650e-4227-8352-05abf2566c17),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93b367b9-c0')#033[00m
Jan 31 03:52:41 np0005603622 kernel: tapa02f269a-60: left promiscuous mode
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.542 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fd74530f-9f6d-4a70-b2e9-3925ac98b367]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 nova_compute[221301]: 2026-01-31 08:52:41.551 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.558 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[530b6767-ff42-467c-b951-08f903f02e4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.559 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0982c8-283b-453b-87fa-2b8a5a5bd74b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.571 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a50465f9-e048-476d-bf4f-2ec379ecd4c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 877933, 'reachable_time': 38273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290765, 'error': None, 'target': 'ovnmeta-a02f269a-650e-4227-8352-05abf2566c17', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.574 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a02f269a-650e-4227-8352-05abf2566c17 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:52:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:41.574 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[13fa216d-3e33-479f-a65f-facfd68c5258]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:41 np0005603622 systemd[1]: run-netns-ovnmeta\x2da02f269a\x2d650e\x2d4227\x2d8352\x2d05abf2566c17.mount: Deactivated successfully.
Jan 31 03:52:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:52:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:41.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:52:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:41.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.158 221324 DEBUG nova.compute.manager [req-e1cb2e85-9a6b-4d59-b4db-774b151165bc req-200efad2-8154-4d6a-aede-31df98280afe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-vif-unplugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.159 221324 DEBUG oslo_concurrency.lockutils [req-e1cb2e85-9a6b-4d59-b4db-774b151165bc req-200efad2-8154-4d6a-aede-31df98280afe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.160 221324 DEBUG oslo_concurrency.lockutils [req-e1cb2e85-9a6b-4d59-b4db-774b151165bc req-200efad2-8154-4d6a-aede-31df98280afe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.160 221324 DEBUG oslo_concurrency.lockutils [req-e1cb2e85-9a6b-4d59-b4db-774b151165bc req-200efad2-8154-4d6a-aede-31df98280afe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.161 221324 DEBUG nova.compute.manager [req-e1cb2e85-9a6b-4d59-b4db-774b151165bc req-200efad2-8154-4d6a-aede-31df98280afe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] No waiting events found dispatching network-vif-unplugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.161 221324 DEBUG nova.compute.manager [req-e1cb2e85-9a6b-4d59-b4db-774b151165bc req-200efad2-8154-4d6a-aede-31df98280afe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-vif-unplugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.454 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:52:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.957 221324 INFO nova.virt.libvirt.driver [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Deleting instance files /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b_del#033[00m
Jan 31 03:52:42 np0005603622 nova_compute[221301]: 2026-01-31 08:52:42.958 221324 INFO nova.virt.libvirt.driver [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Deletion of /var/lib/nova/instances/1e30fc0d-e774-4cad-b497-496833551f8b_del complete#033[00m
Jan 31 03:52:43 np0005603622 nova_compute[221301]: 2026-01-31 08:52:43.435 221324 INFO nova.compute.manager [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Took 2.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:52:43 np0005603622 nova_compute[221301]: 2026-01-31 08:52:43.436 221324 DEBUG oslo.service.loopingcall [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:52:43 np0005603622 nova_compute[221301]: 2026-01-31 08:52:43.436 221324 DEBUG nova.compute.manager [-] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:52:43 np0005603622 nova_compute[221301]: 2026-01-31 08:52:43.436 221324 DEBUG nova.network.neutron [-] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:52:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:52:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1184651225' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:52:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:52:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1184651225' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:52:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:43.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:44.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.288 221324 DEBUG nova.compute.manager [req-bf424068-032b-44ec-9dc4-fc6a48516029 req-439cdf9f-ab11-4d91-9772-891b0e602878 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.289 221324 DEBUG oslo_concurrency.lockutils [req-bf424068-032b-44ec-9dc4-fc6a48516029 req-439cdf9f-ab11-4d91-9772-891b0e602878 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.290 221324 DEBUG oslo_concurrency.lockutils [req-bf424068-032b-44ec-9dc4-fc6a48516029 req-439cdf9f-ab11-4d91-9772-891b0e602878 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.290 221324 DEBUG oslo_concurrency.lockutils [req-bf424068-032b-44ec-9dc4-fc6a48516029 req-439cdf9f-ab11-4d91-9772-891b0e602878 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.291 221324 DEBUG nova.compute.manager [req-bf424068-032b-44ec-9dc4-fc6a48516029 req-439cdf9f-ab11-4d91-9772-891b0e602878 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] No waiting events found dispatching network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.292 221324 WARNING nova.compute.manager [req-bf424068-032b-44ec-9dc4-fc6a48516029 req-439cdf9f-ab11-4d91-9772-891b0e602878 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received unexpected event network-vif-plugged-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.813 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.945 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.945 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.945 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.945 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.946 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.946 221324 INFO nova.compute.manager [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Terminating instance#033[00m
Jan 31 03:52:44 np0005603622 nova_compute[221301]: 2026-01-31 08:52:44.947 221324 DEBUG nova.compute.manager [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:52:45 np0005603622 kernel: tap8caf8cff-a5 (unregistering): left promiscuous mode
Jan 31 03:52:45 np0005603622 NetworkManager[49080]: <info>  [1769849565.0127] device (tap8caf8cff-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:52:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:45Z|00814|binding|INFO|Releasing lport 8caf8cff-a513-4677-929d-ade94fee3082 from this chassis (sb_readonly=0)
Jan 31 03:52:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:45Z|00815|binding|INFO|Setting lport 8caf8cff-a513-4677-929d-ade94fee3082 down in Southbound
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:52:45Z|00816|binding|INFO|Removing iface tap8caf8cff-a5 ovn-installed in OVS
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.022 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.028 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Jan 31 03:52:45 np0005603622 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a0.scope: Consumed 25.698s CPU time.
Jan 31 03:52:45 np0005603622 systemd-machined[190406]: Machine qemu-75-instance-000000a0 terminated.
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.086 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:d6:1f 10.100.0.10'], port_security=['fa:16:3e:64:d6:1f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6fd9c6e1-ddff-436d-b240-240ec06b2a15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8397e0fed04b4dabb57148d0924de2dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd636f3a4-efef-465a-ac59-8182d61336f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd2578f-ff6e-4dc3-bc49-93cbf023edc5, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=8caf8cff-a513-4677-929d-ade94fee3082) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.087 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 8caf8cff-a513-4677-929d-ade94fee3082 in datapath 3afaf607-43a1-4d65-95fc-0a22b5c901d0 unbound from our chassis#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.088 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3afaf607-43a1-4d65-95fc-0a22b5c901d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.089 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[87aebad1-177a-4657-8be1-cef07c20e6ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.090 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 namespace which is not needed anymore#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.133 221324 DEBUG nova.network.neutron [-] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.169 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.181 221324 INFO nova.virt.libvirt.driver [-] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Instance destroyed successfully.#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.182 221324 DEBUG nova.objects.instance [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lazy-loading 'resources' on Instance uuid 6fd9c6e1-ddff-436d-b240-240ec06b2a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.203 221324 DEBUG nova.compute.manager [req-17695846-6ae4-4c37-b80f-6223ff0b8850 req-eae8fd56-c969-44fa-87b4-d42447688aef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Received event network-vif-deleted-93b367b9-c0d7-4f7e-98bf-a33da41aa3d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.204 221324 INFO nova.compute.manager [req-17695846-6ae4-4c37-b80f-6223ff0b8850 req-eae8fd56-c969-44fa-87b4-d42447688aef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Neutron deleted interface 93b367b9-c0d7-4f7e-98bf-a33da41aa3d2; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.204 221324 DEBUG nova.network.neutron [req-17695846-6ae4-4c37-b80f-6223ff0b8850 req-eae8fd56-c969-44fa-87b4-d42447688aef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.237 221324 DEBUG nova.virt.libvirt.vif [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:47:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=160,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMGvV4tGHwFrQ7+1WPmMS3fGcrpcMKpLQBFiD2ZG0NedKq4jaCN6oHf8RWlX+X72Ff/PSGJSQ5nqRPZm+CDMr01vn3vAMra9m4dZ/R1d2vwh+NDFwu298PivPHJQkyuCpg==',key_name='tempest-keypair-600650673',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:47:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8397e0fed04b4dabb57148d0924de2dc',ramdisk_id='',reservation_id='r-fxo6kllb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-1931311941',owner_user_name='tempest-AttachVolumeMultiAttachTest-1931311941-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:47:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a498364761ef428b99cac3f92e603385',uuid=6fd9c6e1-ddff-436d-b240-240ec06b2a15,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.238 221324 DEBUG nova.network.os_vif_util [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converting VIF {"id": "8caf8cff-a513-4677-929d-ade94fee3082", "address": "fa:16:3e:64:d6:1f", "network": {"id": "3afaf607-43a1-4d65-95fc-0a22b5c901d0", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1596089959-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8397e0fed04b4dabb57148d0924de2dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8caf8cff-a5", "ovs_interfaceid": "8caf8cff-a513-4677-929d-ade94fee3082", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.239 221324 DEBUG nova.network.os_vif_util [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=8caf8cff-a513-4677-929d-ade94fee3082,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8caf8cff-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.239 221324 DEBUG os_vif [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=8caf8cff-a513-4677-929d-ade94fee3082,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8caf8cff-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.241 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.243 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8caf8cff-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.245 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.246 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.249 221324 INFO os_vif [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=8caf8cff-a513-4677-929d-ade94fee3082,network=Network(3afaf607-43a1-4d65-95fc-0a22b5c901d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8caf8cff-a5')#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.272 221324 INFO nova.compute.manager [-] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Took 1.84 seconds to deallocate network for instance.#033[00m
Jan 31 03:52:45 np0005603622 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[285135]: [NOTICE]   (285139) : haproxy version is 2.8.14-c23fe91
Jan 31 03:52:45 np0005603622 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[285135]: [NOTICE]   (285139) : path to executable is /usr/sbin/haproxy
Jan 31 03:52:45 np0005603622 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[285135]: [WARNING]  (285139) : Exiting Master process...
Jan 31 03:52:45 np0005603622 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[285135]: [ALERT]    (285139) : Current worker (285141) exited with code 143 (Terminated)
Jan 31 03:52:45 np0005603622 neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0[285135]: [WARNING]  (285139) : All workers exited. Exiting... (0)
Jan 31 03:52:45 np0005603622 systemd[1]: libpod-d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91.scope: Deactivated successfully.
Jan 31 03:52:45 np0005603622 podman[290797]: 2026-01-31 08:52:45.292723864 +0000 UTC m=+0.122784348 container died d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.326 221324 DEBUG nova.compute.manager [req-17695846-6ae4-4c37-b80f-6223ff0b8850 req-eae8fd56-c969-44fa-87b4-d42447688aef fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Detach interface failed, port_id=93b367b9-c0d7-4f7e-98bf-a33da41aa3d2, reason: Instance 1e30fc0d-e774-4cad-b497-496833551f8b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:52:45 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91-userdata-shm.mount: Deactivated successfully.
Jan 31 03:52:45 np0005603622 systemd[1]: var-lib-containers-storage-overlay-d523dc433e6532bd0fba67fdc4d0394dc72f04077c07ea639f25ab44b30e17a4-merged.mount: Deactivated successfully.
Jan 31 03:52:45 np0005603622 podman[290797]: 2026-01-31 08:52:45.416899128 +0000 UTC m=+0.246959612 container cleanup d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:52:45 np0005603622 systemd[1]: libpod-conmon-d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91.scope: Deactivated successfully.
Jan 31 03:52:45 np0005603622 podman[290853]: 2026-01-31 08:52:45.537801105 +0000 UTC m=+0.098773305 container remove d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.543 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[06b160bc-0737-4295-863e-8c77b31a2288]: (4, ('Sat Jan 31 08:52:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 (d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91)\nd092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91\nSat Jan 31 08:52:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 (d092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91)\nd092743e9204357e1ca2b7d6249488e357f49800c0d94f5703a4724a787b7c91\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.546 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[308e2939-0fc4-44fd-a557-81af2519c1c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.548 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3afaf607-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.551 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 kernel: tap3afaf607-40: left promiscuous mode
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.556 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.559 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0919dcd6-2c6c-4c2c-b264-5ba8506439c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.571 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1be3f778-3736-4c70-90bf-73c5e2425e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.573 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d4cc7e4f-c81d-44d7-be34-252e2a6180c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.587 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a94a578a-e57a-4c33-a200-a0c609236bec]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 849362, 'reachable_time': 17619, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290866, 'error': None, 'target': 'ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:45.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.590 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3afaf607-43a1-4d65-95fc-0a22b5c901d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:52:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:52:45.590 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6d66e3-dcc4-4658-af94-81f077977088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:52:45 np0005603622 systemd[1]: run-netns-ovnmeta\x2d3afaf607\x2d43a1\x2d4d65\x2d95fc\x2d0a22b5c901d0.mount: Deactivated successfully.
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.669 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.671 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:45 np0005603622 nova_compute[221301]: 2026-01-31 08:52:45.792 221324 DEBUG oslo_concurrency.processutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:46.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.097 221324 INFO nova.virt.libvirt.driver [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Deleting instance files /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15_del#033[00m
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.098 221324 INFO nova.virt.libvirt.driver [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Deletion of /var/lib/nova/instances/6fd9c6e1-ddff-436d-b240-240ec06b2a15_del complete#033[00m
Jan 31 03:52:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1835849491' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.260 221324 DEBUG oslo_concurrency.processutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.265 221324 DEBUG nova.compute.provider_tree [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.337 221324 DEBUG nova.scheduler.client.report [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.529 221324 INFO nova.compute.manager [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Took 1.58 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.529 221324 DEBUG oslo.service.loopingcall [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.530 221324 DEBUG nova.compute.manager [-] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:52:46 np0005603622 nova_compute[221301]: 2026-01-31 08:52:46.530 221324 DEBUG nova.network.neutron [-] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.008 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:52:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:47.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.747 221324 INFO nova.scheduler.client.report [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Deleted allocations for instance 1e30fc0d-e774-4cad-b497-496833551f8b#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.862 221324 DEBUG nova.compute.manager [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-vif-unplugged-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.863 221324 DEBUG oslo_concurrency.lockutils [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.863 221324 DEBUG oslo_concurrency.lockutils [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.863 221324 DEBUG oslo_concurrency.lockutils [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.864 221324 DEBUG nova.compute.manager [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] No waiting events found dispatching network-vif-unplugged-8caf8cff-a513-4677-929d-ade94fee3082 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.864 221324 DEBUG nova.compute.manager [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-vif-unplugged-8caf8cff-a513-4677-929d-ade94fee3082 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.864 221324 DEBUG nova.compute.manager [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.864 221324 DEBUG oslo_concurrency.lockutils [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.864 221324 DEBUG oslo_concurrency.lockutils [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.865 221324 DEBUG oslo_concurrency.lockutils [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.865 221324 DEBUG nova.compute.manager [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] No waiting events found dispatching network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:52:47 np0005603622 nova_compute[221301]: 2026-01-31 08:52:47.865 221324 WARNING nova.compute.manager [req-5b9b68ed-6022-42c8-b853-680c591b41ef req-50191de8-97a9-4574-991c-05aeaafe0fea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received unexpected event network-vif-plugged-8caf8cff-a513-4677-929d-ade94fee3082 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:52:48 np0005603622 nova_compute[221301]: 2026-01-31 08:52:48.003 221324 DEBUG oslo_concurrency.lockutils [None req-9519a8a6-b798-49aa-af31-29857585f81a 3bd4ce8a916a4bdbbc988eb4fe32991e 76fb5cb7abcd4d74abfc471a96bbd12c - - default default] Lock "1e30fc0d-e774-4cad-b497-496833551f8b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:48.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #148. Immutable memtables: 0.
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.629191) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 93] Flushing memtable with next log file: 148
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568629232, "job": 93, "event": "flush_started", "num_memtables": 1, "num_entries": 2475, "num_deletes": 256, "total_data_size": 5630659, "memory_usage": 5710600, "flush_reason": "Manual Compaction"}
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 93] Level-0 flush table #149: started
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568886231, "cf_name": "default", "job": 93, "event": "table_file_creation", "file_number": 149, "file_size": 3666358, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 71786, "largest_seqno": 74255, "table_properties": {"data_size": 3656276, "index_size": 6383, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21711, "raw_average_key_size": 20, "raw_value_size": 3635875, "raw_average_value_size": 3506, "num_data_blocks": 276, "num_entries": 1037, "num_filter_entries": 1037, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849379, "oldest_key_time": 1769849379, "file_creation_time": 1769849568, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 149, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 93] Flush lasted 257190 microseconds, and 7692 cpu microseconds.
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:52:48 np0005603622 nova_compute[221301]: 2026-01-31 08:52:48.926 221324 DEBUG nova.network.neutron [-] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:52:48 np0005603622 nova_compute[221301]: 2026-01-31 08:52:48.959 221324 INFO nova.compute.manager [-] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Took 2.43 seconds to deallocate network for instance.#033[00m
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.886354) [db/flush_job.cc:967] [default] [JOB 93] Level-0 flush table #149: 3666358 bytes OK
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.886399) [db/memtable_list.cc:519] [default] Level-0 commit table #149 started
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.963429) [db/memtable_list.cc:722] [default] Level-0 commit table #149: memtable #1 done
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.963472) EVENT_LOG_v1 {"time_micros": 1769849568963462, "job": 93, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.963496) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 93] Try to delete WAL files size 5619714, prev total WAL file size 5619714, number of live WAL files 2.
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000145.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.964706) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036323735' seq:72057594037927935, type:22 .. '7061786F730036353237' seq:0, type:0; will stop at (end)
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 94] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 93 Base level 0, inputs: [149(3580KB)], [147(10MB)]
Jan 31 03:52:48 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849568964793, "job": 94, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [149], "files_L6": [147], "score": -1, "input_data_size": 14707412, "oldest_snapshot_seqno": -1}
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.027 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.027 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.050 221324 DEBUG nova.compute.manager [req-4fa09fbc-fcc3-4f5b-a929-b637e0dcd1c3 req-e8220337-ce17-46a6-92d3-548414b542a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Received event network-vif-deleted-8caf8cff-a513-4677-929d-ade94fee3082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.092 221324 DEBUG oslo_concurrency.processutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 94] Generated table #150: 9571 keys, 12770059 bytes, temperature: kUnknown
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849569145851, "cf_name": "default", "job": 94, "event": "table_file_creation", "file_number": 150, "file_size": 12770059, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12707725, "index_size": 37325, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 251457, "raw_average_key_size": 26, "raw_value_size": 12539683, "raw_average_value_size": 1310, "num_data_blocks": 1427, "num_entries": 9571, "num_filter_entries": 9571, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849568, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 150, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.146326) [db/compaction/compaction_job.cc:1663] [default] [JOB 94] Compacted 1@0 + 1@6 files to L6 => 12770059 bytes
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.151492) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 81.1 rd, 70.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.5 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(7.5) write-amplify(3.5) OK, records in: 10102, records dropped: 531 output_compression: NoCompression
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.151522) EVENT_LOG_v1 {"time_micros": 1769849569151508, "job": 94, "event": "compaction_finished", "compaction_time_micros": 181261, "compaction_time_cpu_micros": 47572, "output_level": 6, "num_output_files": 1, "total_output_size": 12770059, "num_input_records": 10102, "num_output_records": 9571, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000149.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849569152128, "job": 94, "event": "table_file_deletion", "file_number": 149}
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000147.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849569153220, "job": 94, "event": "table_file_deletion", "file_number": 147}
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:48.964587) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.153270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.153276) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.153305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.153308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:52:49.153311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:52:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2896584589' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:52:49 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:52:49 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.515 221324 DEBUG oslo_concurrency.processutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.520 221324 DEBUG nova.compute.provider_tree [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.558 221324 DEBUG nova.scheduler.client.report [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:52:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:49.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.614 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.648 221324 INFO nova.scheduler.client.report [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Deleted allocations for instance 6fd9c6e1-ddff-436d-b240-240ec06b2a15#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.757 221324 DEBUG oslo_concurrency.lockutils [None req-b8d183e7-0fdd-48c1-a7b7-27ff4bf6300b a498364761ef428b99cac3f92e603385 8397e0fed04b4dabb57148d0924de2dc - - default default] Lock "6fd9c6e1-ddff-436d-b240-240ec06b2a15" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:52:49 np0005603622 nova_compute[221301]: 2026-01-31 08:52:49.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:50.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:50 np0005603622 nova_compute[221301]: 2026-01-31 08:52:50.245 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:51.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:52.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:52 np0005603622 nova_compute[221301]: 2026-01-31 08:52:52.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:53 np0005603622 nova_compute[221301]: 2026-01-31 08:52:53.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:53 np0005603622 nova_compute[221301]: 2026-01-31 08:52:53.260 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:53.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:54.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:54 np0005603622 nova_compute[221301]: 2026-01-31 08:52:54.865 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:55 np0005603622 nova_compute[221301]: 2026-01-31 08:52:55.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:52:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:52:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:55.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:52:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/5806762' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:52:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:52:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/5806762' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:52:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:52:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:56.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:52:56 np0005603622 nova_compute[221301]: 2026-01-31 08:52:56.440 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849561.4397037, 1e30fc0d-e774-4cad-b497-496833551f8b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:52:56 np0005603622 nova_compute[221301]: 2026-01-31 08:52:56.441 221324 INFO nova.compute.manager [-] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:52:56 np0005603622 nova_compute[221301]: 2026-01-31 08:52:56.493 221324 DEBUG nova.compute.manager [None req-731b2946-9dc5-4bf5-b8f0-789a9f758513 - - - - - -] [instance: 1e30fc0d-e774-4cad-b497-496833551f8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:52:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:57.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:52:58.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:52:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:52:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:52:59.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:52:59 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 03:52:59 np0005603622 nova_compute[221301]: 2026-01-31 08:52:59.902 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:00.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:00 np0005603622 nova_compute[221301]: 2026-01-31 08:53:00.180 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849565.1793592, 6fd9c6e1-ddff-436d-b240-240ec06b2a15 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:53:00 np0005603622 nova_compute[221301]: 2026-01-31 08:53:00.181 221324 INFO nova.compute.manager [-] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:53:00 np0005603622 nova_compute[221301]: 2026-01-31 08:53:00.238 221324 DEBUG nova.compute.manager [None req-3b5acd8e-0d38-44d8-a63e-235550bcf4b4 - - - - - -] [instance: 6fd9c6e1-ddff-436d-b240-240ec06b2a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:53:00 np0005603622 nova_compute[221301]: 2026-01-31 08:53:00.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:01.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:02.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:02 np0005603622 nova_compute[221301]: 2026-01-31 08:53:02.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:53:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:03.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:53:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:04.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:04 np0005603622 nova_compute[221301]: 2026-01-31 08:53:04.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:04 np0005603622 nova_compute[221301]: 2026-01-31 08:53:04.905 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:05 np0005603622 nova_compute[221301]: 2026-01-31 08:53:05.252 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:05.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:06.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:06 np0005603622 nova_compute[221301]: 2026-01-31 08:53:06.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:07 np0005603622 nova_compute[221301]: 2026-01-31 08:53:07.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:07 np0005603622 nova_compute[221301]: 2026-01-31 08:53:07.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:53:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:07.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:08.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:08 np0005603622 nova_compute[221301]: 2026-01-31 08:53:08.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:08 np0005603622 nova_compute[221301]: 2026-01-31 08:53:08.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:53:08 np0005603622 nova_compute[221301]: 2026-01-31 08:53:08.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:53:08 np0005603622 nova_compute[221301]: 2026-01-31 08:53:08.536 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:53:09 np0005603622 podman[290966]: 2026-01-31 08:53:09.131951954 +0000 UTC m=+0.054327006 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:53:09 np0005603622 podman[290965]: 2026-01-31 08:53:09.166966461 +0000 UTC m=+0.086612389 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:53:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:09.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:09 np0005603622 nova_compute[221301]: 2026-01-31 08:53:09.907 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:10.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.254 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.446 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.446 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.446 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.447 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.447 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:53:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1017929805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:53:10 np0005603622 nova_compute[221301]: 2026-01-31 08:53:10.910 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.083 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.084 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4331MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.084 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.084 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.348 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.349 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.377 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:53:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:11.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:53:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3644475585' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.802 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.807 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.854 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.902 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:53:11 np0005603622 nova_compute[221301]: 2026-01-31 08:53:11.903 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:12.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:12 np0005603622 nova_compute[221301]: 2026-01-31 08:53:12.904 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:13.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:14.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:14 np0005603622 nova_compute[221301]: 2026-01-31 08:53:14.908 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:15 np0005603622 nova_compute[221301]: 2026-01-31 08:53:15.257 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:15 np0005603622 nova_compute[221301]: 2026-01-31 08:53:15.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:15.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:16.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:16 np0005603622 nova_compute[221301]: 2026-01-31 08:53:16.977 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:53:16.978 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:53:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:53:16.980 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:53:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:17.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:18.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:19.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:19 np0005603622 nova_compute[221301]: 2026-01-31 08:53:19.917 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:20.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:20 np0005603622 nova_compute[221301]: 2026-01-31 08:53:20.260 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:21.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:53:21.982 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:53:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:22.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:23.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:53:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:24.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:53:24 np0005603622 nova_compute[221301]: 2026-01-31 08:53:24.918 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603622 nova_compute[221301]: 2026-01-31 08:53:25.262 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:25.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:26.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:27.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:28.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:29 np0005603622 nova_compute[221301]: 2026-01-31 08:53:29.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:53:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:29.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:29 np0005603622 nova_compute[221301]: 2026-01-31 08:53:29.919 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:30.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:30 np0005603622 nova_compute[221301]: 2026-01-31 08:53:30.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:53:30.911 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:53:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:53:30.911 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:53:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:53:30.912 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:53:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:53:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:31.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:53:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:32.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:33.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:34.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:34 np0005603622 nova_compute[221301]: 2026-01-31 08:53:34.922 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:35 np0005603622 nova_compute[221301]: 2026-01-31 08:53:35.300 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 03:53:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:35.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 03:53:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:36.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:37.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:38.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:53:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3542381080' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:53:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:53:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3542381080' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:53:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:39.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:39 np0005603622 nova_compute[221301]: 2026-01-31 08:53:39.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:40.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:40 np0005603622 podman[291055]: 2026-01-31 08:53:40.126453173 +0000 UTC m=+0.052677892 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 31 03:53:40 np0005603622 podman[291054]: 2026-01-31 08:53:40.156659651 +0000 UTC m=+0.086053965 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:53:40 np0005603622 nova_compute[221301]: 2026-01-31 08:53:40.301 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:41.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:42.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:43.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:44.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:44 np0005603622 nova_compute[221301]: 2026-01-31 08:53:44.926 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:45 np0005603622 nova_compute[221301]: 2026-01-31 08:53:45.303 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:45.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:46.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:47.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:48.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:49.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:49 np0005603622 nova_compute[221301]: 2026-01-31 08:53:49.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:50 np0005603622 podman[291371]: 2026-01-31 08:53:50.021643864 +0000 UTC m=+0.045079188 container create e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 03:53:50 np0005603622 systemd[1]: Started libpod-conmon-e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527.scope.
Jan 31 03:53:50 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:53:50 np0005603622 podman[291371]: 2026-01-31 08:53:50.004639939 +0000 UTC m=+0.028075283 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:53:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:50.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:50 np0005603622 podman[291371]: 2026-01-31 08:53:50.121671792 +0000 UTC m=+0.145107156 container init e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_goodall, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True)
Jan 31 03:53:50 np0005603622 podman[291371]: 2026-01-31 08:53:50.12982384 +0000 UTC m=+0.153259164 container start e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_goodall, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 03:53:50 np0005603622 podman[291371]: 2026-01-31 08:53:50.134828304 +0000 UTC m=+0.158263648 container attach e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_goodall, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507)
Jan 31 03:53:50 np0005603622 goofy_goodall[291387]: 167 167
Jan 31 03:53:50 np0005603622 systemd[1]: libpod-e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527.scope: Deactivated successfully.
Jan 31 03:53:50 np0005603622 conmon[291387]: conmon e469e85464a486fd4bf9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527.scope/container/memory.events
Jan 31 03:53:50 np0005603622 podman[291371]: 2026-01-31 08:53:50.141249796 +0000 UTC m=+0.164685140 container died e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_goodall, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True)
Jan 31 03:53:50 np0005603622 systemd[1]: var-lib-containers-storage-overlay-30fcf88288dadeace24f51943a2b9f8387046d7f2a50b77b5cc71ab7d09737c0-merged.mount: Deactivated successfully.
Jan 31 03:53:50 np0005603622 podman[291371]: 2026-01-31 08:53:50.18474976 +0000 UTC m=+0.208185104 container remove e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Jan 31 03:53:50 np0005603622 systemd[1]: libpod-conmon-e469e85464a486fd4bf9b6ba4890065924dfedcd0471ee82ae76f1c27d862527.scope: Deactivated successfully.
Jan 31 03:53:50 np0005603622 podman[291411]: 2026-01-31 08:53:50.31100495 +0000 UTC m=+0.035580953 container create 6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_chaplygin, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 31 03:53:50 np0005603622 nova_compute[221301]: 2026-01-31 08:53:50.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:50 np0005603622 systemd[1]: Started libpod-conmon-6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845.scope.
Jan 31 03:53:50 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:53:50 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64209995fd8ffbb5365603f2121cda49c67738428e0880050fc8b555d2232420/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64209995fd8ffbb5365603f2121cda49c67738428e0880050fc8b555d2232420/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64209995fd8ffbb5365603f2121cda49c67738428e0880050fc8b555d2232420/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603622 podman[291411]: 2026-01-31 08:53:50.293751528 +0000 UTC m=+0.018327551 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 03:53:50 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64209995fd8ffbb5365603f2121cda49c67738428e0880050fc8b555d2232420/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 03:53:50 np0005603622 podman[291411]: 2026-01-31 08:53:50.408912011 +0000 UTC m=+0.133488054 container init 6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_chaplygin, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:53:50 np0005603622 podman[291411]: 2026-01-31 08:53:50.418734694 +0000 UTC m=+0.143310707 container start 6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_chaplygin, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 03:53:50 np0005603622 podman[291411]: 2026-01-31 08:53:50.422453044 +0000 UTC m=+0.147029237 container attach 6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_chaplygin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 03:53:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]: [
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:    {
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "available": false,
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "ceph_device": false,
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "lsm_data": {},
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "lvs": [],
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "path": "/dev/sr0",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "rejected_reasons": [
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "Has a FileSystem",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "Insufficient space (<5GB)"
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        ],
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        "sys_api": {
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "actuators": null,
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "device_nodes": "sr0",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "devname": "sr0",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "human_readable_size": "482.00 KB",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "id_bus": "ata",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "model": "QEMU DVD-ROM",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "nr_requests": "2",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "parent": "/dev/sr0",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "partitions": {},
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "path": "/dev/sr0",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "removable": "1",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "rev": "2.5+",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "ro": "0",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "rotational": "1",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "sas_address": "",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "sas_device_handle": "",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "scheduler_mode": "mq-deadline",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "sectors": 0,
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "sectorsize": "2048",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "size": 493568.0,
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "support_discard": "2048",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "type": "disk",
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:            "vendor": "QEMU"
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:        }
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]:    }
Jan 31 03:53:51 np0005603622 kind_chaplygin[291427]: ]
Jan 31 03:53:51 np0005603622 systemd[1]: libpod-6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845.scope: Deactivated successfully.
Jan 31 03:53:51 np0005603622 podman[291411]: 2026-01-31 08:53:51.57138715 +0000 UTC m=+1.295963193 container died 6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_chaplygin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 03:53:51 np0005603622 systemd[1]: libpod-6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845.scope: Consumed 1.138s CPU time.
Jan 31 03:53:51 np0005603622 systemd[1]: var-lib-containers-storage-overlay-64209995fd8ffbb5365603f2121cda49c67738428e0880050fc8b555d2232420-merged.mount: Deactivated successfully.
Jan 31 03:53:51 np0005603622 podman[291411]: 2026-01-31 08:53:51.620457594 +0000 UTC m=+1.345033597 container remove 6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_chaplygin, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 31 03:53:51 np0005603622 systemd[1]: libpod-conmon-6f3b1249aa3ed7a090fdb6ea52d62b8b69744341e0027cf43c529a8022e01845.scope: Deactivated successfully.
Jan 31 03:53:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:51.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:52.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:53:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:53.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:54.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:54 np0005603622 nova_compute[221301]: 2026-01-31 08:53:54.930 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:55 np0005603622 nova_compute[221301]: 2026-01-31 08:53:55.329 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:53:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:53:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:55.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:56.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:57.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:53:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:53:58.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:53:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:53:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:53:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:53:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:53:59.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:53:59 np0005603622 nova_compute[221301]: 2026-01-31 08:53:59.931 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:00.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:00 np0005603622 nova_compute[221301]: 2026-01-31 08:54:00.332 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:01.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:02.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:03.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:04.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:04 np0005603622 nova_compute[221301]: 2026-01-31 08:54:04.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:04 np0005603622 nova_compute[221301]: 2026-01-31 08:54:04.934 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:05 np0005603622 nova_compute[221301]: 2026-01-31 08:54:05.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:05 np0005603622 nova_compute[221301]: 2026-01-31 08:54:05.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:54:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:05.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:54:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:05Z|00817|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 31 03:54:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:06.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:06 np0005603622 nova_compute[221301]: 2026-01-31 08:54:06.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:07 np0005603622 nova_compute[221301]: 2026-01-31 08:54:07.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:07 np0005603622 nova_compute[221301]: 2026-01-31 08:54:07.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:54:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:54:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:07.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:54:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:54:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:08.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:54:08 np0005603622 nova_compute[221301]: 2026-01-31 08:54:08.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:08 np0005603622 nova_compute[221301]: 2026-01-31 08:54:08.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:54:08 np0005603622 nova_compute[221301]: 2026-01-31 08:54:08.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:54:08 np0005603622 nova_compute[221301]: 2026-01-31 08:54:08.772 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:54:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:09.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:09 np0005603622 nova_compute[221301]: 2026-01-31 08:54:09.936 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.335 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:54:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:10.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:54:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.604 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.605 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.605 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.606 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:54:10 np0005603622 nova_compute[221301]: 2026-01-31 08:54:10.606 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:54:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3937212631' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:54:11 np0005603622 nova_compute[221301]: 2026-01-31 08:54:11.075 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:11 np0005603622 podman[292688]: 2026-01-31 08:54:11.163119701 +0000 UTC m=+0.081145514 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 03:54:11 np0005603622 podman[292685]: 2026-01-31 08:54:11.167834857 +0000 UTC m=+0.088239013 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, 
org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:54:11 np0005603622 nova_compute[221301]: 2026-01-31 08:54:11.241 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:54:11 np0005603622 nova_compute[221301]: 2026-01-31 08:54:11.243 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4283MB free_disk=20.897842407226562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:54:11 np0005603622 nova_compute[221301]: 2026-01-31 08:54:11.243 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:11 np0005603622 nova_compute[221301]: 2026-01-31 08:54:11.244 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:11.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #151. Immutable memtables: 0.
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.322321) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 95] Flushing memtable with next log file: 151
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652322387, "job": 95, "event": "flush_started", "num_memtables": 1, "num_entries": 1204, "num_deletes": 256, "total_data_size": 2522293, "memory_usage": 2566880, "flush_reason": "Manual Compaction"}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 95] Level-0 flush table #152: started
Jan 31 03:54:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:12.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652345390, "cf_name": "default", "job": 95, "event": "table_file_creation", "file_number": 152, "file_size": 1641451, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 74260, "largest_seqno": 75459, "table_properties": {"data_size": 1636203, "index_size": 2643, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11631, "raw_average_key_size": 19, "raw_value_size": 1625574, "raw_average_value_size": 2755, "num_data_blocks": 114, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849569, "oldest_key_time": 1769849569, "file_creation_time": 1769849652, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 152, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 95] Flush lasted 23136 microseconds, and 6260 cpu microseconds.
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.345450) [db/flush_job.cc:967] [default] [JOB 95] Level-0 flush table #152: 1641451 bytes OK
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.345476) [db/memtable_list.cc:519] [default] Level-0 commit table #152 started
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.348799) [db/memtable_list.cc:722] [default] Level-0 commit table #152: memtable #1 done
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.348819) EVENT_LOG_v1 {"time_micros": 1769849652348811, "job": 95, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.348840) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 95] Try to delete WAL files size 2516501, prev total WAL file size 2516501, number of live WAL files 2.
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000148.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.349432) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032373633' seq:72057594037927935, type:22 .. '6C6F676D0033303135' seq:0, type:0; will stop at (end)
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 96] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 95 Base level 0, inputs: [152(1602KB)], [150(12MB)]
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652349500, "job": 96, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [152], "files_L6": [150], "score": -1, "input_data_size": 14411510, "oldest_snapshot_seqno": -1}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 96] Generated table #153: 9632 keys, 14255721 bytes, temperature: kUnknown
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652467530, "cf_name": "default", "job": 96, "event": "table_file_creation", "file_number": 153, "file_size": 14255721, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14191178, "index_size": 39342, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24133, "raw_key_size": 253693, "raw_average_key_size": 26, "raw_value_size": 14020396, "raw_average_value_size": 1455, "num_data_blocks": 1510, "num_entries": 9632, "num_filter_entries": 9632, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849652, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 153, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.467802) [db/compaction/compaction_job.cc:1663] [default] [JOB 96] Compacted 1@0 + 1@6 files to L6 => 14255721 bytes
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.468906) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 122.0 rd, 120.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.2 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(17.5) write-amplify(8.7) OK, records in: 10161, records dropped: 529 output_compression: NoCompression
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.468921) EVENT_LOG_v1 {"time_micros": 1769849652468914, "job": 96, "event": "compaction_finished", "compaction_time_micros": 118117, "compaction_time_cpu_micros": 26846, "output_level": 6, "num_output_files": 1, "total_output_size": 14255721, "num_input_records": 10161, "num_output_records": 9632, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000152.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652469240, "job": 96, "event": "table_file_deletion", "file_number": 152}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000150.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849652470648, "job": 96, "event": "table_file_deletion", "file_number": 150}
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.349313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.470739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.470746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.470749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.470752) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:12.470755) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:12 np0005603622 nova_compute[221301]: 2026-01-31 08:54:12.938 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance c4e77744-5254-4a68-8cdf-49fce00d7b32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 31 03:54:12 np0005603622 nova_compute[221301]: 2026-01-31 08:54:12.938 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 03:54:12 np0005603622 nova_compute[221301]: 2026-01-31 08:54:12.939 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.134 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.135 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.253 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.344 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 31 03:54:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:54:13 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3864276524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.691 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.697 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:54:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:13.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.750 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.753 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.753 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.861 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.862 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.869 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 31 03:54:13 np0005603622 nova_compute[221301]: 2026-01-31 08:54:13.869 221324 INFO nova.compute.claims [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Claim successful on node compute-1.ctlplane.example.com
Jan 31 03:54:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:14.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:14 np0005603622 nova_compute[221301]: 2026-01-31 08:54:14.938 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:54:15 np0005603622 nova_compute[221301]: 2026-01-31 08:54:15.337 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:54:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:15.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:54:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:16.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:54:16 np0005603622 nova_compute[221301]: 2026-01-31 08:54:16.753 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:54:16 np0005603622 nova_compute[221301]: 2026-01-31 08:54:16.754 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:54:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:17.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:18 np0005603622 nova_compute[221301]: 2026-01-31 08:54:18.149 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:54:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:54:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:18.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:54:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:54:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2799367344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:54:18 np0005603622 nova_compute[221301]: 2026-01-31 08:54:18.569 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:54:18 np0005603622 nova_compute[221301]: 2026-01-31 08:54:18.575 221324 DEBUG nova.compute.provider_tree [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 03:54:18 np0005603622 nova_compute[221301]: 2026-01-31 08:54:18.860 221324 DEBUG nova.scheduler.client.report [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:54:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:54:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:19.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:54:19 np0005603622 nova_compute[221301]: 2026-01-31 08:54:19.838 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 5.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:54:19 np0005603622 nova_compute[221301]: 2026-01-31 08:54:19.839 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 31 03:54:19 np0005603622 nova_compute[221301]: 2026-01-31 08:54:19.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:54:20 np0005603622 nova_compute[221301]: 2026-01-31 08:54:20.339 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:54:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:20.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:20 np0005603622 nova_compute[221301]: 2026-01-31 08:54:20.382 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 31 03:54:20 np0005603622 nova_compute[221301]: 2026-01-31 08:54:20.382 221324 DEBUG nova.network.neutron [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 31 03:54:20 np0005603622 nova_compute[221301]: 2026-01-31 08:54:20.485 221324 INFO nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 31 03:54:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:20 np0005603622 nova_compute[221301]: 2026-01-31 08:54:20.997 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.178 221324 DEBUG nova.policy [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53804fd0f3a14f95a4955e3bc6dcc8cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 31 03:54:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:21.365 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:54:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:21.366 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 03:54:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:21.366 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:54:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:21.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.871 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.872 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.873 221324 INFO nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Creating image(s)
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.903 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.930 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.963 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:54:21 np0005603622 nova_compute[221301]: 2026-01-31 08:54:21.967 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:54:22 np0005603622 nova_compute[221301]: 2026-01-31 08:54:22.020 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:54:22 np0005603622 nova_compute[221301]: 2026-01-31 08:54:22.021 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:54:22 np0005603622 nova_compute[221301]: 2026-01-31 08:54:22.022 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:54:22 np0005603622 nova_compute[221301]: 2026-01-31 08:54:22.022 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:54:22 np0005603622 nova_compute[221301]: 2026-01-31 08:54:22.049 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:54:22 np0005603622 nova_compute[221301]: 2026-01-31 08:54:22.054 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 c4e77744-5254-4a68-8cdf-49fce00d7b32_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:54:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:22.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.083 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 c4e77744-5254-4a68-8cdf-49fce00d7b32_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.182 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] resizing rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.698 221324 DEBUG nova.objects.instance [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'migration_context' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:54:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:23.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.854 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.854 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Ensure instance console log exists: /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.855 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.855 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:54:23 np0005603622 nova_compute[221301]: 2026-01-31 08:54:23.856 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:54:24 np0005603622 nova_compute[221301]: 2026-01-31 08:54:24.291 221324 DEBUG nova.network.neutron [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Successfully created port: 004785c1-d0dc-4842-b644-9171e9d5b931 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 31 03:54:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:24.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:24 np0005603622 nova_compute[221301]: 2026-01-31 08:54:24.941 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:54:25 np0005603622 nova_compute[221301]: 2026-01-31 08:54:25.382 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:54:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:25.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:26.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:27.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:28 np0005603622 nova_compute[221301]: 2026-01-31 08:54:28.072 221324 DEBUG nova.network.neutron [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Successfully updated port: 004785c1-d0dc-4842-b644-9171e9d5b931 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 31 03:54:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:28.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:28 np0005603622 nova_compute[221301]: 2026-01-31 08:54:28.578 221324 DEBUG nova.compute.manager [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-changed-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:54:28 np0005603622 nova_compute[221301]: 2026-01-31 08:54:28.578 221324 DEBUG nova.compute.manager [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Refreshing instance network info cache due to event network-changed-004785c1-d0dc-4842-b644-9171e9d5b931. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:54:28 np0005603622 nova_compute[221301]: 2026-01-31 08:54:28.579 221324 DEBUG oslo_concurrency.lockutils [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:54:28 np0005603622 nova_compute[221301]: 2026-01-31 08:54:28.579 221324 DEBUG oslo_concurrency.lockutils [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:54:28 np0005603622 nova_compute[221301]: 2026-01-31 08:54:28.579 221324 DEBUG nova.network.neutron [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Refreshing network info cache for port 004785c1-d0dc-4842-b644-9171e9d5b931 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:54:28 np0005603622 nova_compute[221301]: 2026-01-31 08:54:28.876 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:54:29 np0005603622 nova_compute[221301]: 2026-01-31 08:54:29.196 221324 DEBUG nova.network.neutron [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:54:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:29.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:29 np0005603622 nova_compute[221301]: 2026-01-31 08:54:29.943 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:30 np0005603622 nova_compute[221301]: 2026-01-31 08:54:30.004 221324 DEBUG nova.network.neutron [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:54:30 np0005603622 nova_compute[221301]: 2026-01-31 08:54:30.049 221324 DEBUG oslo_concurrency.lockutils [req-06a89ac4-7d9f-4a42-8ed6-88873823778c req-cc264d92-194b-43f5-9284-cf05e1a78f52 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:54:30 np0005603622 nova_compute[221301]: 2026-01-31 08:54:30.050 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquired lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:54:30 np0005603622 nova_compute[221301]: 2026-01-31 08:54:30.050 221324 DEBUG nova.network.neutron [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:54:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:30.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:30 np0005603622 nova_compute[221301]: 2026-01-31 08:54:30.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:30.912 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:30.913 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:30.913 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:31 np0005603622 nova_compute[221301]: 2026-01-31 08:54:31.452 221324 DEBUG nova.network.neutron [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:54:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:31.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:32.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:33.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:54:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:34.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:54:34 np0005603622 nova_compute[221301]: 2026-01-31 08:54:34.945 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:35 np0005603622 nova_compute[221301]: 2026-01-31 08:54:35.386 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:35.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.267 221324 DEBUG nova.network.neutron [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.369 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Releasing lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.370 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance network_info: |[{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.372 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Start _get_guest_xml network_info=[{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:54:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:36.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.377 221324 WARNING nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.470 221324 DEBUG nova.virt.libvirt.host [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.472 221324 DEBUG nova.virt.libvirt.host [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.491 221324 DEBUG nova.virt.libvirt.host [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.492 221324 DEBUG nova.virt.libvirt.host [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.493 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.494 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.494 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.495 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.495 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.495 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.495 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.496 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.496 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.496 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.497 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.497 221324 DEBUG nova.virt.hardware [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.501 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:54:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/224646209' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.970 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:36 np0005603622 nova_compute[221301]: 2026-01-31 08:54:36.997 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.001 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:54:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2868829078' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.417 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.419 221324 DEBUG nova.virt.libvirt.vif [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:54:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.419 221324 DEBUG nova.network.os_vif_util [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.420 221324 DEBUG nova.network.os_vif_util [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.421 221324 DEBUG nova.objects.instance [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'pci_devices' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.460 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <uuid>c4e77744-5254-4a68-8cdf-49fce00d7b32</uuid>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <name>instance-000000b0</name>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachVolumeTestJSON-server-1705915819</nova:name>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:54:36</nova:creationTime>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:user uuid="53804fd0f3a14f95a4955e3bc6dcc8cb">tempest-AttachVolumeTestJSON-1437067745-project-member</nova:user>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:project uuid="ab2d642eb03c4bda84a9a23e86f1fa4d">tempest-AttachVolumeTestJSON-1437067745</nova:project>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <nova:port uuid="004785c1-d0dc-4842-b644-9171e9d5b931">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <entry name="serial">c4e77744-5254-4a68-8cdf-49fce00d7b32</entry>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <entry name="uuid">c4e77744-5254-4a68-8cdf-49fce00d7b32</entry>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c4e77744-5254-4a68-8cdf-49fce00d7b32_disk">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:ba:41:c4"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <target dev="tap004785c1-d0"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/console.log" append="off"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:54:37 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:54:37 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:54:37 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:54:37 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.462 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Preparing to wait for external event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.462 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.463 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.463 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.464 221324 DEBUG nova.virt.libvirt.vif [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:54:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.465 221324 DEBUG nova.network.os_vif_util [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.465 221324 DEBUG nova.network.os_vif_util [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.466 221324 DEBUG os_vif [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.467 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.467 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.471 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.471 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap004785c1-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.472 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap004785c1-d0, col_values=(('external_ids', {'iface-id': '004785c1-d0dc-4842-b644-9171e9d5b931', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:41:c4', 'vm-uuid': 'c4e77744-5254-4a68-8cdf-49fce00d7b32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:37 np0005603622 NetworkManager[49080]: <info>  [1769849677.4751] manager: (tap004785c1-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.475 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.480 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.481 221324 INFO os_vif [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0')#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.548 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.548 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.548 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No VIF found with MAC fa:16:3e:ba:41:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.549 221324 INFO nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Using config drive#033[00m
Jan 31 03:54:37 np0005603622 nova_compute[221301]: 2026-01-31 08:54:37.574 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:37.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:38.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:38 np0005603622 nova_compute[221301]: 2026-01-31 08:54:38.761 221324 INFO nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Creating config drive at /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/disk.config#033[00m
Jan 31 03:54:38 np0005603622 nova_compute[221301]: 2026-01-31 08:54:38.765 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpus4xtiv7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:38 np0005603622 nova_compute[221301]: 2026-01-31 08:54:38.888 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpus4xtiv7" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:38 np0005603622 nova_compute[221301]: 2026-01-31 08:54:38.917 221324 DEBUG nova.storage.rbd_utils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] rbd image c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:54:38 np0005603622 nova_compute[221301]: 2026-01-31 08:54:38.921 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/disk.config c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.086 221324 DEBUG oslo_concurrency.processutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/disk.config c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.087 221324 INFO nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Deleting local config drive /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/disk.config because it was imported into RBD.#033[00m
Jan 31 03:54:39 np0005603622 kernel: tap004785c1-d0: entered promiscuous mode
Jan 31 03:54:39 np0005603622 NetworkManager[49080]: <info>  [1769849679.1310] manager: (tap004785c1-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/357)
Jan 31 03:54:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:39Z|00818|binding|INFO|Claiming lport 004785c1-d0dc-4842-b644-9171e9d5b931 for this chassis.
Jan 31 03:54:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:39Z|00819|binding|INFO|004785c1-d0dc-4842-b644-9171e9d5b931: Claiming fa:16:3e:ba:41:c4 10.100.0.13
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.132 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.137 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 systemd-machined[190406]: New machine qemu-85-instance-000000b0.
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.154 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:41:c4 10.100.0.13'], port_security=['fa:16:3e:ba:41:c4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c4e77744-5254-4a68-8cdf-49fce00d7b32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f1d32906-efe8-4331-baa1-63435daca045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=004785c1-d0dc-4842-b644-9171e9d5b931) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.155 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 004785c1-d0dc-4842-b644-9171e9d5b931 in datapath a1b5345e-f6dc-4309-b059-80678428d42d bound to our chassis#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.157 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1b5345e-f6dc-4309-b059-80678428d42d#033[00m
Jan 31 03:54:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:39Z|00820|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 ovn-installed in OVS
Jan 31 03:54:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:39Z|00821|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 up in Southbound
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.158 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 systemd[1]: Started Virtual Machine qemu-85-instance-000000b0.
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.167 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[41a07135-7dc7-4694-a851-1f9f4e12dbe4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.168 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1b5345e-f1 in ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.170 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1b5345e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.171 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b3da9184-7bca-482a-a5ba-869b3ad9036b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.171 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3b319a74-17c7-489f-91e0-6da424e93c90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 systemd-udevd[293080]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.181 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[34c67098-32b3-4bb2-9f5b-a8221ac1bfa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 NetworkManager[49080]: <info>  [1769849679.1838] device (tap004785c1-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:54:39 np0005603622 NetworkManager[49080]: <info>  [1769849679.1844] device (tap004785c1-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.193 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[59417f15-8199-4eb5-b9d0-a7706e2e6d56]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.214 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a351ef71-a3a5-469d-8bdf-1fda5ff8a002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.220 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd7c8fd-c9d2-4689-b317-04fe81e7bc2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 NetworkManager[49080]: <info>  [1769849679.2214] manager: (tapa1b5345e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/358)
Jan 31 03:54:39 np0005603622 systemd-udevd[293084]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.248 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[5acc09b8-6514-404d-816c-19a9d0dbfc50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.252 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[de2a87d6-780a-4715-a7d0-7888a0d69b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 NetworkManager[49080]: <info>  [1769849679.2734] device (tapa1b5345e-f0): carrier: link connected
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.278 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[96b390b4-c0c7-4bb0-8594-dc5f80204a55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.291 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb8b87c2-81d2-44ff-9312-5680fd40d95c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 893061, 'reachable_time': 37893, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293112, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.301 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae6b642-968c-40f8-9534-33ea8989a8ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:43a3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 893061, 'tstamp': 893061}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 293113, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.317 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cce6cdf8-3360-4dad-a8ca-d09b3350eb01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 893061, 'reachable_time': 37893, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 293114, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.338 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[427ec314-886d-4615-9f61-90ae26f7f6a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.376 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[06203cb5-5467-4baa-b234-678e51e7de0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.377 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.378 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.378 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1b5345e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.379 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 kernel: tapa1b5345e-f0: entered promiscuous mode
Jan 31 03:54:39 np0005603622 NetworkManager[49080]: <info>  [1769849679.3804] manager: (tapa1b5345e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.383 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1b5345e-f0, col_values=(('external_ids', {'iface-id': '7a59c286-57bd-4dc4-87e7-a6bfcee69c68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:39Z|00822|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.386 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.386 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a3052236-a1df-42de-8cec-55868d9855a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.387 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:54:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:54:39.388 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'env', 'PROCESS_TAG=haproxy-a1b5345e-f6dc-4309-b059-80678428d42d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1b5345e-f6dc-4309-b059-80678428d42d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 podman[293183]: 2026-01-31 08:54:39.735103677 +0000 UTC m=+0.059207686 container create 26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 03:54:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:39.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:39 np0005603622 systemd[1]: Started libpod-conmon-26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620.scope.
Jan 31 03:54:39 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.771 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849679.7708755, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.772 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Started (Lifecycle Event)#033[00m
Jan 31 03:54:39 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae23282f617be58dde87454f23a8b3289be0a6d86688491afed6bb05729b2908/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:54:39 np0005603622 podman[293183]: 2026-01-31 08:54:39.789632667 +0000 UTC m=+0.113736686 container init 26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 03:54:39 np0005603622 podman[293183]: 2026-01-31 08:54:39.794327193 +0000 UTC m=+0.118431202 container start 26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:54:39 np0005603622 podman[293183]: 2026-01-31 08:54:39.701424055 +0000 UTC m=+0.025528094 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:54:39 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293203]: [NOTICE]   (293207) : New worker (293209) forked
Jan 31 03:54:39 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293203]: [NOTICE]   (293207) : Loading success.
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.858 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.864 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849679.771098, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.864 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.897 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.904 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.947 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.948 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.963 221324 DEBUG nova.compute.manager [req-8d827cab-46d4-425b-b875-5d6f89b85ac5 req-103f3d61-aa78-4a14-824f-8ae9201b0b4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.963 221324 DEBUG oslo_concurrency.lockutils [req-8d827cab-46d4-425b-b875-5d6f89b85ac5 req-103f3d61-aa78-4a14-824f-8ae9201b0b4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.963 221324 DEBUG oslo_concurrency.lockutils [req-8d827cab-46d4-425b-b875-5d6f89b85ac5 req-103f3d61-aa78-4a14-824f-8ae9201b0b4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.964 221324 DEBUG oslo_concurrency.lockutils [req-8d827cab-46d4-425b-b875-5d6f89b85ac5 req-103f3d61-aa78-4a14-824f-8ae9201b0b4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.964 221324 DEBUG nova.compute.manager [req-8d827cab-46d4-425b-b875-5d6f89b85ac5 req-103f3d61-aa78-4a14-824f-8ae9201b0b4b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Processing event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.964 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.969 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849679.9687943, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.969 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.970 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.974 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance spawned successfully.#033[00m
Jan 31 03:54:39 np0005603622 nova_compute[221301]: 2026-01-31 08:54:39.975 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.005 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.012 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.014 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.015 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.015 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.016 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.016 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.017 221324 DEBUG nova.virt.libvirt.driver [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.058 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.130 221324 INFO nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Took 18.26 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.131 221324 DEBUG nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.257 221324 INFO nova.compute.manager [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Took 26.78 seconds to build instance.#033[00m
Jan 31 03:54:40 np0005603622 nova_compute[221301]: 2026-01-31 08:54:40.317 221324 DEBUG oslo_concurrency.lockutils [None req-85f09039-b148-491a-a09b-29e2f238018a 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:40.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:41.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:42 np0005603622 nova_compute[221301]: 2026-01-31 08:54:42.121 221324 DEBUG nova.compute.manager [req-88276082-39ab-4fa1-98d1-a102ce414b1d req-8ddc2fca-378a-4df4-92da-c567e1dfffc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:54:42 np0005603622 nova_compute[221301]: 2026-01-31 08:54:42.121 221324 DEBUG oslo_concurrency.lockutils [req-88276082-39ab-4fa1-98d1-a102ce414b1d req-8ddc2fca-378a-4df4-92da-c567e1dfffc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:54:42 np0005603622 nova_compute[221301]: 2026-01-31 08:54:42.122 221324 DEBUG oslo_concurrency.lockutils [req-88276082-39ab-4fa1-98d1-a102ce414b1d req-8ddc2fca-378a-4df4-92da-c567e1dfffc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:54:42 np0005603622 nova_compute[221301]: 2026-01-31 08:54:42.122 221324 DEBUG oslo_concurrency.lockutils [req-88276082-39ab-4fa1-98d1-a102ce414b1d req-8ddc2fca-378a-4df4-92da-c567e1dfffc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:54:42 np0005603622 nova_compute[221301]: 2026-01-31 08:54:42.122 221324 DEBUG nova.compute.manager [req-88276082-39ab-4fa1-98d1-a102ce414b1d req-8ddc2fca-378a-4df4-92da-c567e1dfffc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:54:42 np0005603622 nova_compute[221301]: 2026-01-31 08:54:42.122 221324 WARNING nova.compute.manager [req-88276082-39ab-4fa1-98d1-a102ce414b1d req-8ddc2fca-378a-4df4-92da-c567e1dfffc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:54:42 np0005603622 podman[293219]: 2026-01-31 08:54:42.151120345 +0000 UTC m=+0.067252291 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 03:54:42 np0005603622 podman[293218]: 2026-01-31 08:54:42.196542231 +0000 UTC m=+0.113576512 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:54:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:54:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:42.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:54:42 np0005603622 nova_compute[221301]: 2026-01-31 08:54:42.474 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:43.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:44.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:44 np0005603622 nova_compute[221301]: 2026-01-31 08:54:44.948 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:44 np0005603622 NetworkManager[49080]: <info>  [1769849684.9737] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Jan 31 03:54:44 np0005603622 NetworkManager[49080]: <info>  [1769849684.9754] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 31 03:54:44 np0005603622 nova_compute[221301]: 2026-01-31 08:54:44.983 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603622 nova_compute[221301]: 2026-01-31 08:54:45.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:45Z|00823|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:54:45 np0005603622 nova_compute[221301]: 2026-01-31 08:54:45.037 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:45 np0005603622 nova_compute[221301]: 2026-01-31 08:54:45.638 221324 DEBUG nova.compute.manager [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-changed-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:54:45 np0005603622 nova_compute[221301]: 2026-01-31 08:54:45.639 221324 DEBUG nova.compute.manager [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Refreshing instance network info cache due to event network-changed-004785c1-d0dc-4842-b644-9171e9d5b931. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:54:45 np0005603622 nova_compute[221301]: 2026-01-31 08:54:45.641 221324 DEBUG oslo_concurrency.lockutils [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:54:45 np0005603622 nova_compute[221301]: 2026-01-31 08:54:45.641 221324 DEBUG oslo_concurrency.lockutils [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:54:45 np0005603622 nova_compute[221301]: 2026-01-31 08:54:45.641 221324 DEBUG nova.network.neutron [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Refreshing network info cache for port 004785c1-d0dc-4842-b644-9171e9d5b931 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:54:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:54:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:45.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:54:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:46.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #154. Immutable memtables: 0.
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.451150) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 97] Flushing memtable with next log file: 154
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687451182, "job": 97, "event": "flush_started", "num_memtables": 1, "num_entries": 625, "num_deletes": 252, "total_data_size": 931362, "memory_usage": 942784, "flush_reason": "Manual Compaction"}
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 97] Level-0 flush table #155: started
Jan 31 03:54:47 np0005603622 nova_compute[221301]: 2026-01-31 08:54:47.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687660598, "cf_name": "default", "job": 97, "event": "table_file_creation", "file_number": 155, "file_size": 427060, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75464, "largest_seqno": 76084, "table_properties": {"data_size": 424340, "index_size": 691, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7578, "raw_average_key_size": 20, "raw_value_size": 418653, "raw_average_value_size": 1128, "num_data_blocks": 31, "num_entries": 371, "num_filter_entries": 371, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849652, "oldest_key_time": 1769849652, "file_creation_time": 1769849687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 155, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 97] Flush lasted 209501 microseconds, and 1776 cpu microseconds.
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:47.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.660646) [db/flush_job.cc:967] [default] [JOB 97] Level-0 flush table #155: 427060 bytes OK
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.660668) [db/memtable_list.cc:519] [default] Level-0 commit table #155 started
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.765096) [db/memtable_list.cc:722] [default] Level-0 commit table #155: memtable #1 done
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.765143) EVENT_LOG_v1 {"time_micros": 1769849687765126, "job": 97, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.765175) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 97] Try to delete WAL files size 927899, prev total WAL file size 927899, number of live WAL files 2.
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000151.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.765630) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032353034' seq:72057594037927935, type:22 .. '6D6772737461740032373537' seq:0, type:0; will stop at (end)
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 98] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 97 Base level 0, inputs: [155(417KB)], [153(13MB)]
Jan 31 03:54:47 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849687765675, "job": 98, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [155], "files_L6": [153], "score": -1, "input_data_size": 14682781, "oldest_snapshot_seqno": -1}
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 98] Generated table #156: 9502 keys, 10991988 bytes, temperature: kUnknown
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849688274933, "cf_name": "default", "job": 98, "event": "table_file_creation", "file_number": 156, "file_size": 10991988, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10932829, "index_size": 34278, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23813, "raw_key_size": 251187, "raw_average_key_size": 26, "raw_value_size": 10768820, "raw_average_value_size": 1133, "num_data_blocks": 1299, "num_entries": 9502, "num_filter_entries": 9502, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 156, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.275218) [db/compaction/compaction_job.cc:1663] [default] [JOB 98] Compacted 1@0 + 1@6 files to L6 => 10991988 bytes
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.295601) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 28.8 rd, 21.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 13.6 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(60.1) write-amplify(25.7) OK, records in: 10003, records dropped: 501 output_compression: NoCompression
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.295662) EVENT_LOG_v1 {"time_micros": 1769849688295638, "job": 98, "event": "compaction_finished", "compaction_time_micros": 509337, "compaction_time_cpu_micros": 23233, "output_level": 6, "num_output_files": 1, "total_output_size": 10991988, "num_input_records": 10003, "num_output_records": 9502, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000155.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849688296011, "job": 98, "event": "table_file_deletion", "file_number": 155}
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000153.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849688298725, "job": 98, "event": "table_file_deletion", "file_number": 153}
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:47.765557) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.298857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.298865) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.298868) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.298871) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:48 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:54:48.298874) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:54:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:48.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:49.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:49 np0005603622 nova_compute[221301]: 2026-01-31 08:54:49.826 221324 DEBUG nova.network.neutron [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updated VIF entry in instance network info cache for port 004785c1-d0dc-4842-b644-9171e9d5b931. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:54:49 np0005603622 nova_compute[221301]: 2026-01-31 08:54:49.827 221324 DEBUG nova.network.neutron [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:54:49 np0005603622 nova_compute[221301]: 2026-01-31 08:54:49.952 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:49 np0005603622 nova_compute[221301]: 2026-01-31 08:54:49.964 221324 DEBUG oslo_concurrency.lockutils [req-514c45ec-48a4-463b-8a6e-3e646f71cb5c req-26df7f63-b4b3-464b-acd6-d6ff6b085603 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:54:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:50.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:51 np0005603622 nova_compute[221301]: 2026-01-31 08:54:51.591 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:54:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:51.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:54:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:52.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:52 np0005603622 nova_compute[221301]: 2026-01-31 08:54:52.481 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:53.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:53Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:41:c4 10.100.0.13
Jan 31 03:54:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:54:53Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:41:c4 10.100.0.13
Jan 31 03:54:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:54:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:54.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:54:54 np0005603622 nova_compute[221301]: 2026-01-31 08:54:54.955 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:54:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:55.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 e374: 3 total, 3 up, 3 in
Jan 31 03:54:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:56.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:57 np0005603622 nova_compute[221301]: 2026-01-31 08:54:57.484 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:57.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:58 np0005603622 nova_compute[221301]: 2026-01-31 08:54:58.362 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:54:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:54:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:54:58.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:54:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:54:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:54:59 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:54:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:54:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:54:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:54:59.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:54:59 np0005603622 nova_compute[221301]: 2026-01-31 08:54:59.957 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:00.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:55:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:01.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:55:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:02.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:02 np0005603622 nova_compute[221301]: 2026-01-31 08:55:02.490 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:03.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.041 221324 DEBUG oslo_concurrency.lockutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.042 221324 DEBUG oslo_concurrency.lockutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.066 221324 DEBUG nova.objects.instance [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.160 221324 DEBUG oslo_concurrency.lockutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:55:04 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:04.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.706 221324 DEBUG oslo_concurrency.lockutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.707 221324 DEBUG oslo_concurrency.lockutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.707 221324 INFO nova.compute.manager [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Attaching volume 95b6d139-2444-4c12-8219-f46fb97d8a4d to /dev/vdb#033[00m
Jan 31 03:55:04 np0005603622 nova_compute[221301]: 2026-01-31 08:55:04.959 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.132 221324 DEBUG os_brick.utils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.134 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.145 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.145 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[b551fb9e-c8c1-4e89-b199-4c56dd388113]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.147 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.153 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.153 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[4079ae7c-66bf-4df3-9af6-1e83876d5f84]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.154 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.161 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.161 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a8fff9-4ae8-424c-b424-e66275921c7e]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.162 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccb27da-0672-48de-8899-64016761bd6c]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.163 221324 DEBUG oslo_concurrency.processutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.185 221324 DEBUG oslo_concurrency.processutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.187 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.187 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.187 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.187 221324 DEBUG os_brick.utils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] <== get_connector_properties: return (55ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:55:05 np0005603622 nova_compute[221301]: 2026-01-31 08:55:05.188 221324 DEBUG nova.virt.block_device [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating existing volume attachment record: 89694797-7609-4300-be9a-9184bf1e68fc _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:55:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:55:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:05.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:55:06 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:06Z|00824|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.108 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:06.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.716 221324 DEBUG nova.objects.instance [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.784 221324 DEBUG nova.virt.libvirt.driver [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Attempting to attach volume 95b6d139-2444-4c12-8219-f46fb97d8a4d with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.787 221324 DEBUG nova.virt.libvirt.guest [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:55:06 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-95b6d139-2444-4c12-8219-f46fb97d8a4d">
Jan 31 03:55:06 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:55:06 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:55:06 np0005603622 nova_compute[221301]:  <serial>95b6d139-2444-4c12-8219-f46fb97d8a4d</serial>
Jan 31 03:55:06 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:55:06 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.924 221324 DEBUG nova.virt.libvirt.driver [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.925 221324 DEBUG nova.virt.libvirt.driver [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.925 221324 DEBUG nova.virt.libvirt.driver [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:06 np0005603622 nova_compute[221301]: 2026-01-31 08:55:06.925 221324 DEBUG nova.virt.libvirt.driver [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] No VIF found with MAC fa:16:3e:ba:41:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:55:07 np0005603622 nova_compute[221301]: 2026-01-31 08:55:07.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:07 np0005603622 nova_compute[221301]: 2026-01-31 08:55:07.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:07 np0005603622 nova_compute[221301]: 2026-01-31 08:55:07.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:07 np0005603622 nova_compute[221301]: 2026-01-31 08:55:07.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:55:07 np0005603622 nova_compute[221301]: 2026-01-31 08:55:07.493 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:07 np0005603622 nova_compute[221301]: 2026-01-31 08:55:07.696 221324 DEBUG oslo_concurrency.lockutils [None req-3a9ced27-2899-4859-9a08-0def58452d1e 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:07.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:08 np0005603622 nova_compute[221301]: 2026-01-31 08:55:08.207 221324 DEBUG oslo_concurrency.lockutils [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:08 np0005603622 nova_compute[221301]: 2026-01-31 08:55:08.208 221324 DEBUG oslo_concurrency.lockutils [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:08 np0005603622 nova_compute[221301]: 2026-01-31 08:55:08.208 221324 DEBUG nova.compute.manager [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:08 np0005603622 nova_compute[221301]: 2026-01-31 08:55:08.212 221324 DEBUG nova.compute.manager [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:55:08 np0005603622 nova_compute[221301]: 2026-01-31 08:55:08.213 221324 DEBUG nova.objects.instance [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:08 np0005603622 nova_compute[221301]: 2026-01-31 08:55:08.261 221324 DEBUG nova.virt.libvirt.driver [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:55:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:08.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:09.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:09 np0005603622 nova_compute[221301]: 2026-01-31 08:55:09.960 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:55:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:10.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.428 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.428 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.428 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:10 np0005603622 kernel: tap004785c1-d0 (unregistering): left promiscuous mode
Jan 31 03:55:10 np0005603622 NetworkManager[49080]: <info>  [1769849710.5544] device (tap004785c1-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:55:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:10Z|00825|binding|INFO|Releasing lport 004785c1-d0dc-4842-b644-9171e9d5b931 from this chassis (sb_readonly=0)
Jan 31 03:55:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:10Z|00826|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 down in Southbound
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.558 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:10Z|00827|binding|INFO|Removing iface tap004785c1-d0 ovn-installed in OVS
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.560 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.567 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:41:c4 10.100.0.13'], port_security=['fa:16:3e:ba:41:c4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c4e77744-5254-4a68-8cdf-49fce00d7b32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f1d32906-efe8-4331-baa1-63435daca045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=004785c1-d0dc-4842-b644-9171e9d5b931) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.569 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 004785c1-d0dc-4842-b644-9171e9d5b931 in datapath a1b5345e-f6dc-4309-b059-80678428d42d unbound from our chassis#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.570 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.571 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1b5345e-f6dc-4309-b059-80678428d42d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.572 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[99f7dab0-f0ec-4777-989c-5b560d1cbcbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.573 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace which is not needed anymore#033[00m
Jan 31 03:55:10 np0005603622 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Jan 31 03:55:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:10 np0005603622 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b0.scope: Consumed 14.662s CPU time.
Jan 31 03:55:10 np0005603622 systemd-machined[190406]: Machine qemu-85-instance-000000b0 terminated.
Jan 31 03:55:10 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293203]: [NOTICE]   (293207) : haproxy version is 2.8.14-c23fe91
Jan 31 03:55:10 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293203]: [NOTICE]   (293207) : path to executable is /usr/sbin/haproxy
Jan 31 03:55:10 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293203]: [WARNING]  (293207) : Exiting Master process...
Jan 31 03:55:10 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293203]: [ALERT]    (293207) : Current worker (293209) exited with code 143 (Terminated)
Jan 31 03:55:10 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293203]: [WARNING]  (293207) : All workers exited. Exiting... (0)
Jan 31 03:55:10 np0005603622 systemd[1]: libpod-26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620.scope: Deactivated successfully.
Jan 31 03:55:10 np0005603622 podman[293495]: 2026-01-31 08:55:10.707109812 +0000 UTC m=+0.045265003 container died 26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 03:55:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620-userdata-shm.mount: Deactivated successfully.
Jan 31 03:55:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay-ae23282f617be58dde87454f23a8b3289be0a6d86688491afed6bb05729b2908-merged.mount: Deactivated successfully.
Jan 31 03:55:10 np0005603622 podman[293495]: 2026-01-31 08:55:10.745377817 +0000 UTC m=+0.083533008 container cleanup 26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 03:55:10 np0005603622 systemd[1]: libpod-conmon-26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620.scope: Deactivated successfully.
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.780 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 podman[293525]: 2026-01-31 08:55:10.804648423 +0000 UTC m=+0.045889459 container remove 26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.808 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[484c9645-9626-4084-9597-5857f98c4db2]: (4, ('Sat Jan 31 08:55:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620)\n26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620\nSat Jan 31 08:55:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620)\n26e384eac010adfc8ff72c74b21fc0a16691ac57fe1bca250d705251a8ee3620\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.810 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb98b57-0002-4538-b7ca-76e1394c8d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.811 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.813 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 kernel: tapa1b5345e-f0: left promiscuous mode
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.820 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 nova_compute[221301]: 2026-01-31 08:55:10.823 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.825 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[746592e1-dcc0-4e76-87fc-86aeefa9125a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.847 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1b92b1-655d-4a57-a617-5f22a2cabea4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.849 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e2568afd-3ba3-4bca-a624-fe1a3da79f0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.862 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[237a9e41-f400-4b1e-9780-b9ba117a4a2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 893055, 'reachable_time': 22370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293553, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.865 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:55:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:10.865 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[1684f9a8-71ea-421c-8ced-510ede1d50df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:10 np0005603622 systemd[1]: run-netns-ovnmeta\x2da1b5345e\x2df6dc\x2d4309\x2db059\x2d80678428d42d.mount: Deactivated successfully.
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.233 221324 DEBUG nova.compute.manager [req-6140236f-a45e-4219-aca0-f3fa54ee0686 req-4f948d5c-0593-44dc-b911-806007ff3c24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.234 221324 DEBUG oslo_concurrency.lockutils [req-6140236f-a45e-4219-aca0-f3fa54ee0686 req-4f948d5c-0593-44dc-b911-806007ff3c24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.234 221324 DEBUG oslo_concurrency.lockutils [req-6140236f-a45e-4219-aca0-f3fa54ee0686 req-4f948d5c-0593-44dc-b911-806007ff3c24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.234 221324 DEBUG oslo_concurrency.lockutils [req-6140236f-a45e-4219-aca0-f3fa54ee0686 req-4f948d5c-0593-44dc-b911-806007ff3c24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.235 221324 DEBUG nova.compute.manager [req-6140236f-a45e-4219-aca0-f3fa54ee0686 req-4f948d5c-0593-44dc-b911-806007ff3c24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.235 221324 WARNING nova.compute.manager [req-6140236f-a45e-4219-aca0-f3fa54ee0686 req-4f948d5c-0593-44dc-b911-806007ff3c24 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state active and task_state powering-off.#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.278 221324 INFO nova.virt.libvirt.driver [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.284 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance destroyed successfully.#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.285 221324 DEBUG nova.objects.instance [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'numa_topology' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.314 221324 DEBUG nova.compute.manager [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:11 np0005603622 nova_compute[221301]: 2026-01-31 08:55:11.472 221324 DEBUG oslo_concurrency.lockutils [None req-fa0bd66b-54bc-4868-b9a5-60c1b0b59b94 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:12.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:12 np0005603622 nova_compute[221301]: 2026-01-31 08:55:12.497 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:13 np0005603622 podman[293555]: 2026-01-31 08:55:13.121941649 +0000 UTC m=+0.045554721 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 31 03:55:13 np0005603622 podman[293554]: 2026-01-31 08:55:13.146982379 +0000 UTC m=+0.073653343 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:55:13 np0005603622 nova_compute[221301]: 2026-01-31 08:55:13.597 221324 DEBUG nova.compute.manager [req-d26931ee-9638-457b-bb6e-cee22fe809ce req-aad36a67-3750-4891-bb9e-c41f388d54a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:13 np0005603622 nova_compute[221301]: 2026-01-31 08:55:13.597 221324 DEBUG oslo_concurrency.lockutils [req-d26931ee-9638-457b-bb6e-cee22fe809ce req-aad36a67-3750-4891-bb9e-c41f388d54a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:13 np0005603622 nova_compute[221301]: 2026-01-31 08:55:13.598 221324 DEBUG oslo_concurrency.lockutils [req-d26931ee-9638-457b-bb6e-cee22fe809ce req-aad36a67-3750-4891-bb9e-c41f388d54a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:13 np0005603622 nova_compute[221301]: 2026-01-31 08:55:13.598 221324 DEBUG oslo_concurrency.lockutils [req-d26931ee-9638-457b-bb6e-cee22fe809ce req-aad36a67-3750-4891-bb9e-c41f388d54a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:13 np0005603622 nova_compute[221301]: 2026-01-31 08:55:13.598 221324 DEBUG nova.compute.manager [req-d26931ee-9638-457b-bb6e-cee22fe809ce req-aad36a67-3750-4891-bb9e-c41f388d54a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:13 np0005603622 nova_compute[221301]: 2026-01-31 08:55:13.598 221324 WARNING nova.compute.manager [req-d26931ee-9638-457b-bb6e-cee22fe809ce req-aad36a67-3750-4891-bb9e-c41f388d54a4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:55:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:13.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:14.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.768 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.825 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.825 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.826 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.826 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.860 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.861 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.861 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.861 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.862 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:14 np0005603622 nova_compute[221301]: 2026-01-31 08:55:14.962 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:15 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/526314432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.367 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.512 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.512 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.512 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.626 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.627 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4302MB free_disk=20.8839111328125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.627 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.627 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:15.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.803 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance c4e77744-5254-4a68-8cdf-49fce00d7b32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.804 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:55:15 np0005603622 nova_compute[221301]: 2026-01-31 08:55:15.804 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.218 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.416 221324 DEBUG nova.objects.instance [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:16.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.439 221324 DEBUG oslo_concurrency.lockutils [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.440 221324 DEBUG oslo_concurrency.lockutils [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquired lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.440 221324 DEBUG nova.network.neutron [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.440 221324 DEBUG nova.objects.instance [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'info_cache' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2734651748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.645 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.651 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.706 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.758 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:55:16 np0005603622 nova_compute[221301]: 2026-01-31 08:55:16.759 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:17 np0005603622 nova_compute[221301]: 2026-01-31 08:55:17.329 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:17 np0005603622 nova_compute[221301]: 2026-01-31 08:55:17.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:17.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:18 np0005603622 nova_compute[221301]: 2026-01-31 08:55:18.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:18.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:18 np0005603622 nova_compute[221301]: 2026-01-31 08:55:18.991 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:19 np0005603622 nova_compute[221301]: 2026-01-31 08:55:19.084 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:19.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:19 np0005603622 nova_compute[221301]: 2026-01-31 08:55:19.857 221324 DEBUG nova.network.neutron [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.010 221324 DEBUG oslo_concurrency.lockutils [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Releasing lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.058 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance destroyed successfully.#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.058 221324 DEBUG nova.objects.instance [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'numa_topology' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.109 221324 DEBUG nova.objects.instance [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'resources' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.329 221324 DEBUG nova.virt.libvirt.vif [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.330 221324 DEBUG nova.network.os_vif_util [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.332 221324 DEBUG nova.network.os_vif_util [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.332 221324 DEBUG os_vif [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.335 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.335 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap004785c1-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.337 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.338 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.339 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.342 221324 INFO os_vif [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0')#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.354 221324 DEBUG nova.virt.libvirt.driver [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Start _get_guest_xml network_info=[{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': None, 'attachment_id': '89694797-7609-4300-be9a-9184bf1e68fc', 'mount_device': '/dev/vdb', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-95b6d139-2444-4c12-8219-f46fb97d8a4d', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '95b6d139-2444-4c12-8219-f46fb97d8a4d', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c4e77744-5254-4a68-8cdf-49fce00d7b32', 'attached_at': '', 'detached_at': '', 'volume_id': '95b6d139-2444-4c12-8219-f46fb97d8a4d', 'serial': '95b6d139-2444-4c12-8219-f46fb97d8a4d'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.359 221324 WARNING nova.virt.libvirt.driver [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.402 221324 DEBUG nova.virt.libvirt.host [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.403 221324 DEBUG nova.virt.libvirt.host [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.408 221324 DEBUG nova.virt.libvirt.host [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.408 221324 DEBUG nova.virt.libvirt.host [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.410 221324 DEBUG nova.virt.libvirt.driver [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.410 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.410 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.411 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.411 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.411 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.411 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.412 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.412 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.412 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.412 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.412 221324 DEBUG nova.virt.hardware [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.413 221324 DEBUG nova.objects.instance [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'vcpu_model' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 03:55:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:20.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.476 221324 DEBUG oslo_concurrency.processutils [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3303057521' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.907 221324 DEBUG oslo_concurrency.processutils [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:20 np0005603622 nova_compute[221301]: 2026-01-31 08:55:20.982 221324 DEBUG oslo_concurrency.processutils [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1386775361' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.427 221324 DEBUG oslo_concurrency.processutils [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.478 221324 DEBUG nova.virt.libvirt.vif [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.478 221324 DEBUG nova.network.os_vif_util [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.479 221324 DEBUG nova.network.os_vif_util [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.480 221324 DEBUG nova.objects.instance [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'pci_devices' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.517 221324 DEBUG nova.virt.libvirt.driver [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <uuid>c4e77744-5254-4a68-8cdf-49fce00d7b32</uuid>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <name>instance-000000b0</name>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachVolumeTestJSON-server-1705915819</nova:name>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:55:20</nova:creationTime>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:user uuid="53804fd0f3a14f95a4955e3bc6dcc8cb">tempest-AttachVolumeTestJSON-1437067745-project-member</nova:user>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:project uuid="ab2d642eb03c4bda84a9a23e86f1fa4d">tempest-AttachVolumeTestJSON-1437067745</nova:project>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <nova:port uuid="004785c1-d0dc-4842-b644-9171e9d5b931">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <entry name="serial">c4e77744-5254-4a68-8cdf-49fce00d7b32</entry>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <entry name="uuid">c4e77744-5254-4a68-8cdf-49fce00d7b32</entry>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c4e77744-5254-4a68-8cdf-49fce00d7b32_disk">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-95b6d139-2444-4c12-8219-f46fb97d8a4d">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <target dev="vdb" bus="virtio"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <serial>95b6d139-2444-4c12-8219-f46fb97d8a4d</serial>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:ba:41:c4"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <target dev="tap004785c1-d0"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/console.log" append="off"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:55:21 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:55:21 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:55:21 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:55:21 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.519 221324 DEBUG nova.virt.libvirt.driver [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.519 221324 DEBUG nova.virt.libvirt.driver [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.520 221324 DEBUG nova.virt.libvirt.driver [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.521 221324 DEBUG nova.virt.libvirt.vif [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.521 221324 DEBUG nova.network.os_vif_util [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.522 221324 DEBUG nova.network.os_vif_util [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.522 221324 DEBUG os_vif [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.523 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.523 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.523 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.526 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap004785c1-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.526 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap004785c1-d0, col_values=(('external_ids', {'iface-id': '004785c1-d0dc-4842-b644-9171e9d5b931', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:41:c4', 'vm-uuid': 'c4e77744-5254-4a68-8cdf-49fce00d7b32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.528 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.5295] manager: (tap004785c1-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.532 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.533 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.534 221324 INFO os_vif [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0')#033[00m
Jan 31 03:55:21 np0005603622 kernel: tap004785c1-d0: entered promiscuous mode
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.5950] manager: (tap004785c1-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.595 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:21Z|00828|binding|INFO|Claiming lport 004785c1-d0dc-4842-b644-9171e9d5b931 for this chassis.
Jan 31 03:55:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:21Z|00829|binding|INFO|004785c1-d0dc-4842-b644-9171e9d5b931: Claiming fa:16:3e:ba:41:c4 10.100.0.13
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.609 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.6105] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.6116] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Jan 31 03:55:21 np0005603622 systemd-udevd[293718]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:55:21 np0005603622 systemd-machined[190406]: New machine qemu-86-instance-000000b0.
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.6310] device (tap004785c1-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.6317] device (tap004785c1-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.635 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:41:c4 10.100.0.13'], port_security=['fa:16:3e:ba:41:c4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c4e77744-5254-4a68-8cdf-49fce00d7b32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f1d32906-efe8-4331-baa1-63435daca045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=004785c1-d0dc-4842-b644-9171e9d5b931) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.636 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 004785c1-d0dc-4842-b644-9171e9d5b931 in datapath a1b5345e-f6dc-4309-b059-80678428d42d bound to our chassis#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.638 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1b5345e-f6dc-4309-b059-80678428d42d#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.644 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[00da7b5d-377f-4f8c-acd9-b37e600b5555]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.644 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1b5345e-f1 in ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.646 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1b5345e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.646 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ac26b8fb-a04b-416f-9205-4b2af02f6594]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.647 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ffaae559-7609-4104-b8d9-ed2fef539c78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 systemd[1]: Started Virtual Machine qemu-86-instance-000000b0.
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.654 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf89ee7-7a44-4574-a7ae-36d811dd57c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.669 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.675 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b0e5f3-fc18-41d6-9b39-62ab20cc6abf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.697 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:21Z|00830|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 ovn-installed in OVS
Jan 31 03:55:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:21Z|00831|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 up in Southbound
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.700 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.699 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[43170f55-4c75-402f-b5ee-d5fb9f580ed5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.706 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[176ab0f9-2e05-491a-8e63-89b0bad4395f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.7068] manager: (tapa1b5345e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/366)
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.730 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3b97c236-38f6-4214-b74c-5ef75b77c89c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.735 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f821c36c-a01e-409c-ba8b-12321caa5d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.7552] device (tapa1b5345e-f0): carrier: link connected
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.759 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0ddc2a-4fec-4a07-943e-2c983ee9f391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.773 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b55f4d63-3a33-4586-b234-5db105903890]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 250], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 897309, 'reachable_time': 35833, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293754, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.786 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d7291ccb-dad5-4933-b445-8c3c4ac9a8e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:43a3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 897309, 'tstamp': 897309}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 293755, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.799 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8084ee63-94cf-4e38-bf73-8ab4d4a3ba1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 250], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 897309, 'reachable_time': 35833, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 293756, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:55:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:21.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.823 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7a4c0b0b-5fb5-4c5e-a099-25dc48b9200f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.868 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[53dd1597-9423-49ec-ad90-fee504ec6624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.870 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.870 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.871 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1b5345e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 kernel: tapa1b5345e-f0: entered promiscuous mode
Jan 31 03:55:21 np0005603622 NetworkManager[49080]: <info>  [1769849721.8755] manager: (tapa1b5345e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.878 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1b5345e-f0, col_values=(('external_ids', {'iface-id': '7a59c286-57bd-4dc4-87e7-a6bfcee69c68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:21Z|00832|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.885 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:55:21 np0005603622 nova_compute[221301]: 2026-01-31 08:55:21.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.885 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[36587e12-3083-416e-8262-c0cfabb81e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.886 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:55:21 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:21.887 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'env', 'PROCESS_TAG=haproxy-a1b5345e-f6dc-4309-b059-80678428d42d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1b5345e-f6dc-4309-b059-80678428d42d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.185 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for c4e77744-5254-4a68-8cdf-49fce00d7b32 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.185 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849722.1846728, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.185 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.189 221324 DEBUG nova.compute.manager [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.192 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance rebooted successfully.#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.192 221324 DEBUG nova.compute.manager [None req-b8a0a046-6c54-422f-9fd8-d6b64c485867 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.220 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:22 np0005603622 podman[293847]: 2026-01-31 08:55:22.223826671 +0000 UTC m=+0.059112433 container create daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.223 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.251 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.252 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849722.186088, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.252 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Started (Lifecycle Event)#033[00m
Jan 31 03:55:22 np0005603622 systemd[1]: Started libpod-conmon-daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d.scope.
Jan 31 03:55:22 np0005603622 podman[293847]: 2026-01-31 08:55:22.187209111 +0000 UTC m=+0.022494893 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:55:22 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:55:22 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ff9fd3df085fb741b20f7d827e9cbbbbc22cbf4ac7384bf197af6dae3a8cd7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:55:22 np0005603622 podman[293847]: 2026-01-31 08:55:22.30033184 +0000 UTC m=+0.135617602 container init daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:55:22 np0005603622 podman[293847]: 2026-01-31 08:55:22.308021955 +0000 UTC m=+0.143307727 container start daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:55:22 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [NOTICE]   (293865) : New worker (293867) forked
Jan 31 03:55:22 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [NOTICE]   (293865) : Loading success.
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.395 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.397 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:22.417 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:22.418 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:55:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:22.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:22 np0005603622 nova_compute[221301]: 2026-01-31 08:55:22.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.312 221324 DEBUG nova.compute.manager [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.312 221324 DEBUG oslo_concurrency.lockutils [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.313 221324 DEBUG oslo_concurrency.lockutils [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.313 221324 DEBUG oslo_concurrency.lockutils [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.314 221324 DEBUG nova.compute.manager [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.314 221324 WARNING nova.compute.manager [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.315 221324 DEBUG nova.compute.manager [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.315 221324 DEBUG oslo_concurrency.lockutils [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.315 221324 DEBUG oslo_concurrency.lockutils [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.316 221324 DEBUG oslo_concurrency.lockutils [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.316 221324 DEBUG nova.compute.manager [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.317 221324 WARNING nova.compute.manager [req-9d67e3c1-21af-4cda-975b-005a0b679dc5 req-d0e36bc0-0fc1-405d-b807-891f5ac30d5c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:55:23 np0005603622 nova_compute[221301]: 2026-01-31 08:55:23.495 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:23.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:24.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:25 np0005603622 nova_compute[221301]: 2026-01-31 08:55:25.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:25.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:26.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:26 np0005603622 nova_compute[221301]: 2026-01-31 08:55:26.529 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:27.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:28.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:29.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:30 np0005603622 nova_compute[221301]: 2026-01-31 08:55:30.014 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:30.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:30.914 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:30.914 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:30.915 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:31.421 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:31 np0005603622 nova_compute[221301]: 2026-01-31 08:55:31.531 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:55:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:31.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:55:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:32.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:32 np0005603622 nova_compute[221301]: 2026-01-31 08:55:32.729 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:33.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:34Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:41:c4 10.100.0.13
Jan 31 03:55:34 np0005603622 nova_compute[221301]: 2026-01-31 08:55:34.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:55:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:34.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:35 np0005603622 nova_compute[221301]: 2026-01-31 08:55:35.017 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:35 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:35Z|00833|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:55:35 np0005603622 nova_compute[221301]: 2026-01-31 08:55:35.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:35.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:36.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:36 np0005603622 nova_compute[221301]: 2026-01-31 08:55:36.533 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:37.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:38.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:39.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:40 np0005603622 nova_compute[221301]: 2026-01-31 08:55:40.059 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:40.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.106 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.106 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.110 221324 DEBUG oslo_concurrency.lockutils [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.111 221324 DEBUG oslo_concurrency.lockutils [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.201 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.202 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.203 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.213 221324 INFO nova.compute.manager [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Detaching volume 95b6d139-2444-4c12-8219-f46fb97d8a4d#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.271 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.403 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.403 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.413 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.413 221324 INFO nova.compute.claims [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.449 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.541 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.557 221324 INFO nova.virt.block_device [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Attempting to driver detach volume 95b6d139-2444-4c12-8219-f46fb97d8a4d from mountpoint /dev/vdb#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.570 221324 DEBUG nova.virt.libvirt.driver [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Attempting to detach device vdb from instance c4e77744-5254-4a68-8cdf-49fce00d7b32 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.571 221324 DEBUG nova.virt.libvirt.guest [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-95b6d139-2444-4c12-8219-f46fb97d8a4d">
Jan 31 03:55:41 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <serial>95b6d139-2444-4c12-8219-f46fb97d8a4d</serial>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:55:41 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.581 221324 INFO nova.virt.libvirt.driver [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully detached device vdb from instance c4e77744-5254-4a68-8cdf-49fce00d7b32 from the persistent domain config.#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.582 221324 DEBUG nova.virt.libvirt.driver [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance c4e77744-5254-4a68-8cdf-49fce00d7b32 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.583 221324 DEBUG nova.virt.libvirt.guest [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-95b6d139-2444-4c12-8219-f46fb97d8a4d">
Jan 31 03:55:41 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <serial>95b6d139-2444-4c12-8219-f46fb97d8a4d</serial>
Jan 31 03:55:41 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 03:55:41 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:55:41 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.669 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.699 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769849741.698995, c4e77744-5254-4a68-8cdf-49fce00d7b32 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.701 221324 DEBUG nova.virt.libvirt.driver [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance c4e77744-5254-4a68-8cdf-49fce00d7b32 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:55:41 np0005603622 nova_compute[221301]: 2026-01-31 08:55:41.705 221324 INFO nova.virt.libvirt.driver [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully detached device vdb from instance c4e77744-5254-4a68-8cdf-49fce00d7b32 from the live domain config.#033[00m
Jan 31 03:55:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:41.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4202413044' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.067 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.075 221324 DEBUG nova.compute.provider_tree [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.087 221324 DEBUG nova.objects.instance [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.145 221324 DEBUG nova.scheduler.client.report [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.175 221324 DEBUG oslo_concurrency.lockutils [None req-9a8d521f-fc46-47a7-9d4c-15766f75b0fe 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.177 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.178 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.180 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.185 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.185 221324 INFO nova.compute.claims [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.278 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.279 221324 DEBUG nova.network.neutron [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.307 221324 INFO nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.333 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.389 221324 INFO nova.virt.block_device [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Booting with volume a6407347-cd1c-4e32-a08c-1836a1f51dd6 at /dev/vda#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.470 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:42.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.654 221324 DEBUG os_brick.utils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.656 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.663 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.663 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[ed62bf63-29b4-40a5-a42f-113a8a53a091]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.665 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.669 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.670 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[de75aaf6-608c-47f6-85e7-c0b7bf96bf1a]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.672 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.676 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.004s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.677 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[6251ffc7-29df-4a5d-9036-b74ec7c25027]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.679 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f2ba46-016d-4ee8-9791-4c97c65dd720]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.680 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.704 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.706 221324 DEBUG os_brick.initiator.connectors.lightos [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.707 221324 DEBUG os_brick.initiator.connectors.lightos [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.707 221324 DEBUG os_brick.initiator.connectors.lightos [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.707 221324 DEBUG os_brick.utils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] <== get_connector_properties: return (52ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.708 221324 DEBUG nova.virt.block_device [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating existing volume attachment record: 5aeae977-62f2-4ef6-a868-d5612d941b40 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.830 221324 DEBUG oslo_concurrency.lockutils [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.830 221324 DEBUG oslo_concurrency.lockutils [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.830 221324 DEBUG nova.compute.manager [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.833 221324 DEBUG nova.compute.manager [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.833 221324 DEBUG nova.objects.instance [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.896 221324 DEBUG nova.virt.libvirt.driver [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 31 03:55:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:55:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2841515218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.939 221324 DEBUG nova.policy [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cfaebb011a374541b083e772a6c83f25', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '06b5fc9cfd4c49abb2d8b9f2f8a82c1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.954 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.961 221324 DEBUG nova.compute.provider_tree [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:55:42 np0005603622 nova_compute[221301]: 2026-01-31 08:55:42.987 221324 DEBUG nova.scheduler.client.report [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.039 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.040 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.129 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.130 221324 DEBUG nova.network.neutron [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.189 221324 INFO nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.224 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.379 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.380 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.380 221324 INFO nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Creating image(s)#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.414 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.448 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.478 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.482 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.544 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.545 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.546 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.546 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.574 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.577 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 272230f9-fca3-4297-95c2-904c72ad5988_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:43.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.875 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 272230f9-fca3-4297-95c2-904c72ad5988_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.298s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:43 np0005603622 nova_compute[221301]: 2026-01-31 08:55:43.945 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] resizing rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.047 221324 DEBUG nova.objects.instance [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'migration_context' on Instance uuid 272230f9-fca3-4297-95c2-904c72ad5988 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.062 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.062 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Ensure instance console log exists: /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.063 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.063 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.063 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.119 221324 DEBUG nova.policy [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1c6e7eff11b435a81429826a682b32f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0bfe11bd9d694684b527666e2c378eed', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:55:44 np0005603622 podman[294095]: 2026-01-31 08:55:44.148069096 +0000 UTC m=+0.072059240 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:55:44 np0005603622 podman[294096]: 2026-01-31 08:55:44.152937867 +0000 UTC m=+0.079768817 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.359 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.361 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.361 221324 INFO nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Creating image(s)#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.362 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.362 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Ensure instance console log exists: /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.362 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.363 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:44 np0005603622 nova_compute[221301]: 2026-01-31 08:55:44.363 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:44.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.062 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 kernel: tap004785c1-d0 (unregistering): left promiscuous mode
Jan 31 03:55:45 np0005603622 NetworkManager[49080]: <info>  [1769849745.2857] device (tap004785c1-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.285 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:45Z|00834|binding|INFO|Releasing lport 004785c1-d0dc-4842-b644-9171e9d5b931 from this chassis (sb_readonly=0)
Jan 31 03:55:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:45Z|00835|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 down in Southbound
Jan 31 03:55:45 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:45Z|00836|binding|INFO|Removing iface tap004785c1-d0 ovn-installed in OVS
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.291 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.292 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.299 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.304 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:41:c4 10.100.0.13'], port_security=['fa:16:3e:ba:41:c4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c4e77744-5254-4a68-8cdf-49fce00d7b32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f1d32906-efe8-4331-baa1-63435daca045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=004785c1-d0dc-4842-b644-9171e9d5b931) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.305 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 004785c1-d0dc-4842-b644-9171e9d5b931 in datapath a1b5345e-f6dc-4309-b059-80678428d42d unbound from our chassis#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.307 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1b5345e-f6dc-4309-b059-80678428d42d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.308 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3e762c4b-9a39-4c26-b7a7-e04cb2198f0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.309 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace which is not needed anymore#033[00m
Jan 31 03:55:45 np0005603622 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Jan 31 03:55:45 np0005603622 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b0.scope: Consumed 13.870s CPU time.
Jan 31 03:55:45 np0005603622 systemd-machined[190406]: Machine qemu-86-instance-000000b0 terminated.
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.397 221324 DEBUG nova.network.neutron [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Successfully created port: 71b92e93-ebb3-4664-a6af-06e2b936453f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:55:45 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [NOTICE]   (293865) : haproxy version is 2.8.14-c23fe91
Jan 31 03:55:45 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [NOTICE]   (293865) : path to executable is /usr/sbin/haproxy
Jan 31 03:55:45 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [WARNING]  (293865) : Exiting Master process...
Jan 31 03:55:45 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [WARNING]  (293865) : Exiting Master process...
Jan 31 03:55:45 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [ALERT]    (293865) : Current worker (293867) exited with code 143 (Terminated)
Jan 31 03:55:45 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[293861]: [WARNING]  (293865) : All workers exited. Exiting... (0)
Jan 31 03:55:45 np0005603622 systemd[1]: libpod-daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d.scope: Deactivated successfully.
Jan 31 03:55:45 np0005603622 podman[294163]: 2026-01-31 08:55:45.429868371 +0000 UTC m=+0.049853826 container died daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 03:55:45 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d-userdata-shm.mount: Deactivated successfully.
Jan 31 03:55:45 np0005603622 systemd[1]: var-lib-containers-storage-overlay-0ff9fd3df085fb741b20f7d827e9cbbbbc22cbf4ac7384bf197af6dae3a8cd7d-merged.mount: Deactivated successfully.
Jan 31 03:55:45 np0005603622 podman[294163]: 2026-01-31 08:55:45.48064928 +0000 UTC m=+0.100634715 container cleanup daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:55:45 np0005603622 systemd[1]: libpod-conmon-daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d.scope: Deactivated successfully.
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.506 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 podman[294194]: 2026-01-31 08:55:45.547651244 +0000 UTC m=+0.049238629 container remove daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.552 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4ce7a4-3010-4cc3-adb0-9723c050c056]: (4, ('Sat Jan 31 08:55:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d)\ndaab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d\nSat Jan 31 08:55:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (daab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d)\ndaab4ec7b80fd6eca9205738fe19e7f1f69133739f177dfa1a0c572f9d606c7d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.554 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f43c1193-8082-4f2d-bb2e-428add612046]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.555 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.557 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 kernel: tapa1b5345e-f0: left promiscuous mode
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.565 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.569 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[41d70a22-3b41-493e-9a4c-05604decbdca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.583 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[31abe7a6-d592-41ab-bc8c-9edc5555d958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.585 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4064e84c-55c5-4b88-b4b6-f831066f0883]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.597 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0feaa0-9721-4f4e-8432-5e888bd3b911]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 897303, 'reachable_time': 27659, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294223, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 systemd[1]: run-netns-ovnmeta\x2da1b5345e\x2df6dc\x2d4309\x2db059\x2d80678428d42d.mount: Deactivated successfully.
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.601 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:55:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:45.602 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[241deda7-e8a4-4151-9dde-38c6e93a1362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.739 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:55:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:45.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.990 221324 INFO nova.virt.libvirt.driver [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance shutdown successfully after 3 seconds.#033[00m
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.996 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance destroyed successfully.#033[00m
Jan 31 03:55:45 np0005603622 nova_compute[221301]: 2026-01-31 08:55:45.996 221324 DEBUG nova.objects.instance [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'numa_topology' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.022 221324 DEBUG nova.compute.manager [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.100 221324 DEBUG oslo_concurrency.lockutils [None req-91abb322-b84c-45ef-8377-2c0b2f265467 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.424 221324 DEBUG nova.compute.manager [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.424 221324 DEBUG oslo_concurrency.lockutils [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.425 221324 DEBUG oslo_concurrency.lockutils [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.425 221324 DEBUG oslo_concurrency.lockutils [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.426 221324 DEBUG nova.compute.manager [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.426 221324 WARNING nova.compute.manager [req-43d09744-b752-4df6-9a13-d881b79565d7 req-0b5f91b4-bf87-496b-817f-0c2095c7e88b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state stopped and task_state None.#033[00m
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.469 221324 DEBUG nova.network.neutron [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Successfully created port: 477ad4f0-2518-4735-b590-76e12526b832 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:55:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:46.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:46 np0005603622 nova_compute[221301]: 2026-01-31 08:55:46.543 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:47.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.370 221324 DEBUG nova.objects.instance [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'flavor' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.372 221324 DEBUG nova.network.neutron [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Successfully updated port: 71b92e93-ebb3-4664-a6af-06e2b936453f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.409 221324 DEBUG oslo_concurrency.lockutils [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.410 221324 DEBUG oslo_concurrency.lockutils [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquired lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.410 221324 DEBUG nova.network.neutron [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.410 221324 DEBUG nova.objects.instance [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'info_cache' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.412 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.413 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.413 221324 DEBUG nova.network.neutron [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:55:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:48.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.681 221324 DEBUG nova.compute.manager [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.681 221324 DEBUG oslo_concurrency.lockutils [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.681 221324 DEBUG oslo_concurrency.lockutils [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.682 221324 DEBUG oslo_concurrency.lockutils [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.682 221324 DEBUG nova.compute.manager [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.682 221324 WARNING nova.compute.manager [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.683 221324 DEBUG nova.compute.manager [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.683 221324 DEBUG nova.compute.manager [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing instance network info cache due to event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.683 221324 DEBUG oslo_concurrency.lockutils [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:48 np0005603622 nova_compute[221301]: 2026-01-31 08:55:48.819 221324 DEBUG nova.network.neutron [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:55:49 np0005603622 nova_compute[221301]: 2026-01-31 08:55:49.052 221324 DEBUG nova.network.neutron [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Successfully updated port: 477ad4f0-2518-4735-b590-76e12526b832 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:55:49 np0005603622 nova_compute[221301]: 2026-01-31 08:55:49.086 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:49 np0005603622 nova_compute[221301]: 2026-01-31 08:55:49.087 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:49 np0005603622 nova_compute[221301]: 2026-01-31 08:55:49.087 221324 DEBUG nova.network.neutron [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:55:49 np0005603622 nova_compute[221301]: 2026-01-31 08:55:49.404 221324 DEBUG nova.network.neutron [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:55:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:49.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.065 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2194975475' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:50.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.692 221324 DEBUG nova.network.neutron [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.739 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.739 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Instance network_info: |[{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.740 221324 DEBUG oslo_concurrency.lockutils [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.740 221324 DEBUG nova.network.neutron [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.745 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Start _get_guest_xml network_info=[{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': 0, 'attachment_id': '5aeae977-62f2-4ef6-a868-d5612d941b40', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-a6407347-cd1c-4e32-a08c-1836a1f51dd6', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'a6407347-cd1c-4e32-a08c-1836a1f51dd6', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '13eea30b-7972-4da3-a3dd-8d068b401d22', 'attached_at': '', 'detached_at': '', 'volume_id': 'a6407347-cd1c-4e32-a08c-1836a1f51dd6', 'serial': 'a6407347-cd1c-4e32-a08c-1836a1f51dd6'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.751 221324 WARNING nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.757 221324 DEBUG nova.virt.libvirt.host [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.758 221324 DEBUG nova.virt.libvirt.host [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.769 221324 DEBUG nova.virt.libvirt.host [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.770 221324 DEBUG nova.virt.libvirt.host [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.771 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.771 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.772 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.772 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.773 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.773 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.773 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.774 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.774 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.775 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.775 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.775 221324 DEBUG nova.virt.hardware [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.812 221324 DEBUG nova.storage.rbd_utils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] rbd image 13eea30b-7972-4da3-a3dd-8d068b401d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.817 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.856 221324 DEBUG nova.compute.manager [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-changed-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.856 221324 DEBUG nova.compute.manager [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Refreshing instance network info cache due to event network-changed-477ad4f0-2518-4735-b590-76e12526b832. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:55:50 np0005603622 nova_compute[221301]: 2026-01-31 08:55:50.857 221324 DEBUG oslo_concurrency.lockutils [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:55:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1730221209' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.272 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.335 221324 DEBUG nova.virt.libvirt.vif [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:55:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-2129503496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-2129503496',id=180,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEXzK6zUN8P2oqgqYwcegkodZ7bCeyyyhmYXIteBKXOhNEu+drS3qyKalg8BzkpjD3Rc/+FviAhlBApTbimNmOyPmM7IztIR2VGri6qDWFeRA0jXOdg2vS/Kgt0ALKH9cg==',key_name='tempest-TestInstancesWithCinderVolumes-176277168',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='06b5fc9cfd4c49abb2d8b9f2f8a82c1f',ramdisk_id='',reservation_id='r-9xt0sci9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-2132464628',owner_user_name='tempest-TestInstancesWithCinderVolumes-2132464628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:55:42Z,user_data=None,user_id='cfaebb011a374541b083e772a6c83f25',uuid=13eea30b-7972-4da3-a3dd-8d068b401d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.336 221324 DEBUG nova.network.os_vif_util [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converting VIF {"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.337 221324 DEBUG nova.network.os_vif_util [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:3c:68,bridge_name='br-int',has_traffic_filtering=True,id=71b92e93-ebb3-4664-a6af-06e2b936453f,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b92e93-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.339 221324 DEBUG nova.objects.instance [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.362 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <uuid>13eea30b-7972-4da3-a3dd-8d068b401d22</uuid>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <name>instance-000000b4</name>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-2129503496</nova:name>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:55:50</nova:creationTime>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:user uuid="cfaebb011a374541b083e772a6c83f25">tempest-TestInstancesWithCinderVolumes-2132464628-project-member</nova:user>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:project uuid="06b5fc9cfd4c49abb2d8b9f2f8a82c1f">tempest-TestInstancesWithCinderVolumes-2132464628</nova:project>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <nova:port uuid="71b92e93-ebb3-4664-a6af-06e2b936453f">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <entry name="serial">13eea30b-7972-4da3-a3dd-8d068b401d22</entry>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <entry name="uuid">13eea30b-7972-4da3-a3dd-8d068b401d22</entry>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/13eea30b-7972-4da3-a3dd-8d068b401d22_disk.config">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-a6407347-cd1c-4e32-a08c-1836a1f51dd6">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <serial>a6407347-cd1c-4e32-a08c-1836a1f51dd6</serial>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:6d:3c:68"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <target dev="tap71b92e93-eb"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/console.log" append="off"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:55:51 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:55:51 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:55:51 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:55:51 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.363 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Preparing to wait for external event network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.364 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.364 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.364 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.365 221324 DEBUG nova.virt.libvirt.vif [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:55:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-2129503496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-2129503496',id=180,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEXzK6zUN8P2oqgqYwcegkodZ7bCeyyyhmYXIteBKXOhNEu+drS3qyKalg8BzkpjD3Rc/+FviAhlBApTbimNmOyPmM7IztIR2VGri6qDWFeRA0jXOdg2vS/Kgt0ALKH9cg==',key_name='tempest-TestInstancesWithCinderVolumes-176277168',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='06b5fc9cfd4c49abb2d8b9f2f8a82c1f',ramdisk_id='',reservation_id='r-9xt0sci9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True
',owner_project_name='tempest-TestInstancesWithCinderVolumes-2132464628',owner_user_name='tempest-TestInstancesWithCinderVolumes-2132464628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:55:42Z,user_data=None,user_id='cfaebb011a374541b083e772a6c83f25',uuid=13eea30b-7972-4da3-a3dd-8d068b401d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.365 221324 DEBUG nova.network.os_vif_util [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converting VIF {"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.366 221324 DEBUG nova.network.os_vif_util [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:3c:68,bridge_name='br-int',has_traffic_filtering=True,id=71b92e93-ebb3-4664-a6af-06e2b936453f,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b92e93-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.366 221324 DEBUG os_vif [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:3c:68,bridge_name='br-int',has_traffic_filtering=True,id=71b92e93-ebb3-4664-a6af-06e2b936453f,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b92e93-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.367 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.367 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.367 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.371 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.371 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71b92e93-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.372 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71b92e93-eb, col_values=(('external_ids', {'iface-id': '71b92e93-ebb3-4664-a6af-06e2b936453f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:3c:68', 'vm-uuid': '13eea30b-7972-4da3-a3dd-8d068b401d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.373 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.375 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:55:51 np0005603622 NetworkManager[49080]: <info>  [1769849751.3760] manager: (tap71b92e93-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/368)
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.381 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.383 221324 INFO os_vif [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:3c:68,bridge_name='br-int',has_traffic_filtering=True,id=71b92e93-ebb3-4664-a6af-06e2b936453f,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b92e93-eb')#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.500 221324 DEBUG nova.network.neutron [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.541 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.541 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.541 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No VIF found with MAC fa:16:3e:6d:3c:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.542 221324 INFO nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Using config drive#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.583 221324 DEBUG nova.storage.rbd_utils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] rbd image 13eea30b-7972-4da3-a3dd-8d068b401d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.595 221324 DEBUG oslo_concurrency.lockutils [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Releasing lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.655 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance destroyed successfully.#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.656 221324 DEBUG nova.objects.instance [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'numa_topology' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.677 221324 DEBUG nova.objects.instance [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'resources' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.705 221324 DEBUG nova.virt.libvirt.vif [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.706 221324 DEBUG nova.network.os_vif_util [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.708 221324 DEBUG nova.network.os_vif_util [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.709 221324 DEBUG os_vif [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.711 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.712 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap004785c1-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.714 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.717 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.718 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.720 221324 INFO os_vif [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0')#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.727 221324 DEBUG nova.virt.libvirt.driver [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Start _get_guest_xml network_info=[{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.733 221324 WARNING nova.virt.libvirt.driver [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.739 221324 DEBUG nova.virt.libvirt.host [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.740 221324 DEBUG nova.virt.libvirt.host [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.747 221324 DEBUG nova.virt.libvirt.host [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.748 221324 DEBUG nova.virt.libvirt.host [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.750 221324 DEBUG nova.virt.libvirt.driver [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.750 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.750 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.751 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.751 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.751 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.751 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.752 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.752 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.752 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.753 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.753 221324 DEBUG nova.virt.hardware [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.753 221324 DEBUG nova.objects.instance [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'vcpu_model' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.798 221324 DEBUG oslo_concurrency.processutils [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:51.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.943 221324 DEBUG nova.network.neutron [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updating instance_info_cache with network_info: [{"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.979 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.979 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Instance network_info: |[{"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.980 221324 DEBUG oslo_concurrency.lockutils [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.981 221324 DEBUG nova.network.neutron [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Refreshing network info cache for port 477ad4f0-2518-4735-b590-76e12526b832 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.984 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Start _get_guest_xml network_info=[{"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.989 221324 WARNING nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.996 221324 DEBUG nova.virt.libvirt.host [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:55:51 np0005603622 nova_compute[221301]: 2026-01-31 08:55:51.996 221324 DEBUG nova.virt.libvirt.host [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.000 221324 DEBUG nova.virt.libvirt.host [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.000 221324 DEBUG nova.virt.libvirt.host [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.001 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.002 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.002 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.002 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.002 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.003 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.003 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.003 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.003 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.004 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.004 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.004 221324 DEBUG nova.virt.hardware [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.008 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3224189961' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.240 221324 DEBUG oslo_concurrency.processutils [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.273 221324 DEBUG oslo_concurrency.processutils [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/940501411' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.434 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.457 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.460 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.487 221324 INFO nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Creating config drive at /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/disk.config#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.492 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnh54dw2t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:52.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.620 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpnh54dw2t" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.645 221324 DEBUG nova.storage.rbd_utils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] rbd image 13eea30b-7972-4da3-a3dd-8d068b401d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.650 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/disk.config 13eea30b-7972-4da3-a3dd-8d068b401d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2755211088' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.709 221324 DEBUG oslo_concurrency.processutils [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.711 221324 DEBUG nova.virt.libvirt.vif [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.712 221324 DEBUG nova.network.os_vif_util [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.714 221324 DEBUG nova.network.os_vif_util [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.715 221324 DEBUG nova.objects.instance [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'pci_devices' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.745 221324 DEBUG nova.virt.libvirt.driver [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <uuid>c4e77744-5254-4a68-8cdf-49fce00d7b32</uuid>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <name>instance-000000b0</name>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:name>tempest-AttachVolumeTestJSON-server-1705915819</nova:name>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:55:51</nova:creationTime>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:user uuid="53804fd0f3a14f95a4955e3bc6dcc8cb">tempest-AttachVolumeTestJSON-1437067745-project-member</nova:user>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:project uuid="ab2d642eb03c4bda84a9a23e86f1fa4d">tempest-AttachVolumeTestJSON-1437067745</nova:project>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:port uuid="004785c1-d0dc-4842-b644-9171e9d5b931">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="serial">c4e77744-5254-4a68-8cdf-49fce00d7b32</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="uuid">c4e77744-5254-4a68-8cdf-49fce00d7b32</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c4e77744-5254-4a68-8cdf-49fce00d7b32_disk">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/c4e77744-5254-4a68-8cdf-49fce00d7b32_disk.config">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:ba:41:c4"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <target dev="tap004785c1-d0"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32/console.log" append="off"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:55:52 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:55:52 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.746 221324 DEBUG nova.virt.libvirt.driver [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.747 221324 DEBUG nova.virt.libvirt.driver [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.747 221324 DEBUG nova.virt.libvirt.vif [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.748 221324 DEBUG nova.network.os_vif_util [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.748 221324 DEBUG nova.network.os_vif_util [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.748 221324 DEBUG os_vif [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.749 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.750 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.750 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.752 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.753 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap004785c1-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.753 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap004785c1-d0, col_values=(('external_ids', {'iface-id': '004785c1-d0dc-4842-b644-9171e9d5b931', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:41:c4', 'vm-uuid': 'c4e77744-5254-4a68-8cdf-49fce00d7b32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:55:52 np0005603622 NetworkManager[49080]: <info>  [1769849752.7558] manager: (tap004785c1-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.757 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.761 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.763 221324 INFO os_vif [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0')
Jan 31 03:55:52 np0005603622 NetworkManager[49080]: <info>  [1769849752.8183] manager: (tap004785c1-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/370)
Jan 31 03:55:52 np0005603622 kernel: tap004785c1-d0: entered promiscuous mode
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.823 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:52Z|00837|binding|INFO|Claiming lport 004785c1-d0dc-4842-b644-9171e9d5b931 for this chassis.
Jan 31 03:55:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:52Z|00838|binding|INFO|004785c1-d0dc-4842-b644-9171e9d5b931: Claiming fa:16:3e:ba:41:c4 10.100.0.13
Jan 31 03:55:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:52Z|00839|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 ovn-installed in OVS
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.838 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.839 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603622 systemd-machined[190406]: New machine qemu-87-instance-000000b0.
Jan 31 03:55:52 np0005603622 systemd[1]: Started Virtual Machine qemu-87-instance-000000b0.
Jan 31 03:55:52 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:52Z|00840|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 up in Southbound
Jan 31 03:55:52 np0005603622 systemd-udevd[294469]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.874 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:41:c4 10.100.0.13'], port_security=['fa:16:3e:ba:41:c4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c4e77744-5254-4a68-8cdf-49fce00d7b32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'f1d32906-efe8-4331-baa1-63435daca045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=004785c1-d0dc-4842-b644-9171e9d5b931) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.875 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 004785c1-d0dc-4842-b644-9171e9d5b931 in datapath a1b5345e-f6dc-4309-b059-80678428d42d bound to our chassis#033[00m
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:55:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3643448685' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.877 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1b5345e-f6dc-4309-b059-80678428d42d#033[00m
Jan 31 03:55:52 np0005603622 NetworkManager[49080]: <info>  [1769849752.8837] device (tap004785c1-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:55:52 np0005603622 NetworkManager[49080]: <info>  [1769849752.8849] device (tap004785c1-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.887 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f2029495-11e3-4667-9486-a8d6d57f8a63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.889 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1b5345e-f1 in ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.891 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1b5345e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.891 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[860de93b-d013-4159-86c0-632acb4bfdee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.892 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[976d7d44-122d-4665-afc9-760c7da59b23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.898 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.900 221324 DEBUG nova.virt.libvirt.vif [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:55:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1192767468',display_name='tempest-TestNetworkAdvancedServerOps-server-1192767468',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1192767468',id=181,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAh8eNOSEtA28nIoBnpGfGncUlsTJ6dGjCeR78o088HDWvpGieZrHJaQD2ZkpnLaDHq3NsBcKEYQLjURxwKqNK7xwxG7upclFXm84aV7n3hI2gb38w18glCQIaEF/q/95A==',key_name='tempest-TestNetworkAdvancedServerOps-1417118072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-gakt29bk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:55:43Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=272230f9-fca3-4297-95c2-904c72ad5988,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.900 221324 DEBUG nova.network.os_vif_util [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.900 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[abdc37a0-cd95-4078-b781-2770c8da96ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.901 221324 DEBUG nova.network.os_vif_util [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.902 221324 DEBUG nova.objects.instance [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'pci_devices' on Instance uuid 272230f9-fca3-4297-95c2-904c72ad5988 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.909 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[49b3090c-65c1-411e-973c-09ac5f03eb8d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.927 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <uuid>272230f9-fca3-4297-95c2-904c72ad5988</uuid>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <name>instance-000000b5</name>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1192767468</nova:name>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:55:51</nova:creationTime>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:user uuid="f1c6e7eff11b435a81429826a682b32f">tempest-TestNetworkAdvancedServerOps-840410497-project-member</nova:user>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:project uuid="0bfe11bd9d694684b527666e2c378eed">tempest-TestNetworkAdvancedServerOps-840410497</nova:project>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <nova:port uuid="477ad4f0-2518-4735-b590-76e12526b832">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="serial">272230f9-fca3-4297-95c2-904c72ad5988</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="uuid">272230f9-fca3-4297-95c2-904c72ad5988</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/272230f9-fca3-4297-95c2-904c72ad5988_disk">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/272230f9-fca3-4297-95c2-904c72ad5988_disk.config">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:20:74:a3"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <target dev="tap477ad4f0-25"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/console.log" append="off"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:55:52 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:55:52 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:55:52 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:55:52 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.928 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Preparing to wait for external event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.928 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.929 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.929 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.928 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[813e640e-6bf1-4d84-a522-7f3bb515e310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.930 221324 DEBUG nova.virt.libvirt.vif [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:55:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1192767468',display_name='tempest-TestNetworkAdvancedServerOps-server-1192767468',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1192767468',id=181,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAh8eNOSEtA28nIoBnpGfGncUlsTJ6dGjCeR78o088HDWvpGieZrHJaQD2ZkpnLaDHq3NsBcKEYQLjURxwKqNK7xwxG7upclFXm84aV7n3hI2gb38w18glCQIaEF/q/95A==',key_name='tempest-TestNetworkAdvancedServerOps-1417118072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-gakt29bk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:55:43Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=272230f9-fca3-4297-95c2-904c72ad5988,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.930 221324 DEBUG nova.network.os_vif_util [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.930 221324 DEBUG nova.network.os_vif_util [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.931 221324 DEBUG os_vif [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.931 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.931 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.932 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.933 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5551f563-cf62-410c-a26b-99808aa92806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 NetworkManager[49080]: <info>  [1769849752.9347] manager: (tapa1b5345e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/371)
Jan 31 03:55:52 np0005603622 systemd-udevd[294477]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.935 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.935 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap477ad4f0-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.936 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap477ad4f0-25, col_values=(('external_ids', {'iface-id': '477ad4f0-2518-4735-b590-76e12526b832', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:74:a3', 'vm-uuid': '272230f9-fca3-4297-95c2-904c72ad5988'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:52 np0005603622 NetworkManager[49080]: <info>  [1769849752.9378] manager: (tap477ad4f0-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.954 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:52 np0005603622 nova_compute[221301]: 2026-01-31 08:55:52.955 221324 INFO os_vif [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25')#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.968 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[685c0869-3a24-4fbe-a261-1913989f9bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.971 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[957c458d-0fc1-4ec0-acb8-c06fc0ecba7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:52 np0005603622 NetworkManager[49080]: <info>  [1769849752.9862] device (tapa1b5345e-f0): carrier: link connected
Jan 31 03:55:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:52.991 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[996b4f54-b408-45ff-aff7-d4e9e42ae682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.004 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f81333d2-9428-4052-9e44-c3434dcac1dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900432, 'reachable_time': 30439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294510, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.018 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[443a8db8-b42d-42da-9f5e-33d66932a073]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:43a3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 900432, 'tstamp': 900432}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294511, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.031 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9c8c89f7-6acf-4a6d-96fe-4a79f27d92ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1b5345e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:43:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900432, 'reachable_time': 30439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294519, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.053 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1704bc12-0de0-4754-a50b-831986c16d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.066 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.066 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.067 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] No VIF found with MAC fa:16:3e:20:74:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.067 221324 INFO nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Using config drive#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.098 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.101 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6e2ca1f7-5d8c-404c-9acc-42ffea77b621]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.103 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.104 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.104 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1b5345e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.107 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 NetworkManager[49080]: <info>  [1769849753.1082] manager: (tapa1b5345e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/373)
Jan 31 03:55:53 np0005603622 kernel: tapa1b5345e-f0: entered promiscuous mode
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.115 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.116 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1b5345e-f0, col_values=(('external_ids', {'iface-id': '7a59c286-57bd-4dc4-87e7-a6bfcee69c68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.117 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:53Z|00841|binding|INFO|Releasing lport 7a59c286-57bd-4dc4-87e7-a6bfcee69c68 from this chassis (sb_readonly=0)
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.125 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.129 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.130 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.131 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[65ae3200-fe40-41c0-87f7-ddcd5419a822]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.131 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/a1b5345e-f6dc-4309-b059-80678428d42d.pid.haproxy
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID a1b5345e-f6dc-4309-b059-80678428d42d
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.132 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'env', 'PROCESS_TAG=haproxy-a1b5345e-f6dc-4309-b059-80678428d42d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1b5345e-f6dc-4309-b059-80678428d42d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:55:53 np0005603622 podman[294604]: 2026-01-31 08:55:53.478966168 +0000 UTC m=+0.072859152 container create ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.492 221324 DEBUG oslo_concurrency.processutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/disk.config 13eea30b-7972-4da3-a3dd-8d068b401d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.842s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.493 221324 INFO nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Deleting local config drive /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22/disk.config because it was imported into RBD.#033[00m
Jan 31 03:55:53 np0005603622 systemd[1]: Started libpod-conmon-ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e.scope.
Jan 31 03:55:53 np0005603622 podman[294604]: 2026-01-31 08:55:53.424299204 +0000 UTC m=+0.018192208 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:55:53 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:55:53 np0005603622 NetworkManager[49080]: <info>  [1769849753.5344] manager: (tap71b92e93-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/374)
Jan 31 03:55:53 np0005603622 kernel: tap71b92e93-eb: entered promiscuous mode
Jan 31 03:55:53 np0005603622 systemd-udevd[294495]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:55:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:53Z|00842|binding|INFO|Claiming lport 71b92e93-ebb3-4664-a6af-06e2b936453f for this chassis.
Jan 31 03:55:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:53Z|00843|binding|INFO|71b92e93-ebb3-4664-a6af-06e2b936453f: Claiming fa:16:3e:6d:3c:68 10.100.0.9
Jan 31 03:55:53 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/823846a325f7d3393dd9d322bfe3dec7dc3d9e48557ffdd9f0114eab0e245322/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.537 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 NetworkManager[49080]: <info>  [1769849753.5460] device (tap71b92e93-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:55:53 np0005603622 NetworkManager[49080]: <info>  [1769849753.5465] device (tap71b92e93-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:55:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:53Z|00844|binding|INFO|Setting lport 71b92e93-ebb3-4664-a6af-06e2b936453f ovn-installed in OVS
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.548 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 podman[294604]: 2026-01-31 08:55:53.553855683 +0000 UTC m=+0.147748657 container init ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.553 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 podman[294604]: 2026-01-31 08:55:53.557935092 +0000 UTC m=+0.151828066 container start ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:55:53 np0005603622 systemd-machined[190406]: New machine qemu-88-instance-000000b4.
Jan 31 03:55:53 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[294622]: [NOTICE]   (294639) : New worker (294641) forked
Jan 31 03:55:53 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[294622]: [NOTICE]   (294639) : Loading success.
Jan 31 03:55:53 np0005603622 systemd[1]: Started Virtual Machine qemu-88-instance-000000b4.
Jan 31 03:55:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:53Z|00845|binding|INFO|Setting lport 71b92e93-ebb3-4664-a6af-06e2b936453f up in Southbound
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.600 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:3c:68 10.100.0.9'], port_security=['fa:16:3e:6d:3c:68 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '13eea30b-7972-4da3-a3dd-8d068b401d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '06b5fc9cfd4c49abb2d8b9f2f8a82c1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d7b4c6b-30ca-4a01-b275-d4aa9d87b845', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe6e8b31-5a27-4e0f-b157-3b33899fa37b, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=71b92e93-ebb3-4664-a6af-06e2b936453f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.614 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 71b92e93-ebb3-4664-a6af-06e2b936453f in datapath 405bd95c-1bad-49fb-83bf-a97a0c66786e bound to our chassis#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.616 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405bd95c-1bad-49fb-83bf-a97a0c66786e#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.626 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a8150fec-3e13-4f4e-aafb-2286d7fec284]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.627 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405bd95c-11 in ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.629 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405bd95c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.629 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac21dd7-05fc-407c-b3d4-adf7cb553d72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.630 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6be9c2b1-ff35-4110-b625-ed4f6a76682e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.640 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc3e135-8627-4826-9962-cfedcf9ef112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.653 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7a9fdd-87cc-41bc-8503-e7d9175b6628]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.678 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec926ed-84e2-4683-9de4-effcac54cd0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.682 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[adfb0b6f-5d3a-4f2d-9a68-2c84d62b9ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 NetworkManager[49080]: <info>  [1769849753.6843] manager: (tap405bd95c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/375)
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.711 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b333f640-d370-490a-91dd-ac3f953a1115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.715 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[94cb2947-bc40-45e9-b562-c6fca3ff64eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 NetworkManager[49080]: <info>  [1769849753.7327] device (tap405bd95c-10): carrier: link connected
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.735 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[04e9fe07-4fd2-4cfb-8b4e-662c0002d743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.748 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[56d049aa-0c99-4941-b8d3-cb46efc362c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405bd95c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:d8:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 255], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900507, 'reachable_time': 37040, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294676, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.760 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[369b3050-3005-4d58-9eb9-0291dc31c6fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:d880'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 900507, 'tstamp': 900507}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294677, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.773 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d6fa3f79-7a88-48ef-933e-638f3cda3232]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405bd95c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:d8:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 255], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900507, 'reachable_time': 37040, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294678, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.792 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.800 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for c4e77744-5254-4a68-8cdf-49fce00d7b32 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.801 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849753.8002374, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.801 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.803 221324 DEBUG nova.compute.manager [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.806 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance rebooted successfully.#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.807 221324 DEBUG nova.compute.manager [None req-99dc3ac4-0605-4cec-aad4-c0c8c071a6c5 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.806 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88331a43-0285-4df5-ab03-02e4469a9cfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.843 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9596ff-466c-424c-801b-87f0663262f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.845 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405bd95c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.845 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.846 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405bd95c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:53.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:53 np0005603622 NetworkManager[49080]: <info>  [1769849753.8957] manager: (tap405bd95c-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Jan 31 03:55:53 np0005603622 kernel: tap405bd95c-10: entered promiscuous mode
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.898 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.898 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.900 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405bd95c-10, col_values=(('external_ids', {'iface-id': '5a0136e3-84ab-4495-80ff-8006a0a74934'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.902 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:53Z|00846|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.908 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.910 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.910 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405bd95c-1bad-49fb-83bf-a97a0c66786e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405bd95c-1bad-49fb-83bf-a97a0c66786e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.910 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.911 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2d5a61-04b3-4050-9e48-c8efdc104582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.912 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-405bd95c-1bad-49fb-83bf-a97a0c66786e
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/405bd95c-1bad-49fb-83bf-a97a0c66786e.pid.haproxy
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 405bd95c-1bad-49fb-83bf-a97a0c66786e
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:55:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:53.913 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'env', 'PROCESS_TAG=haproxy-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405bd95c-1bad-49fb-83bf-a97a0c66786e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.936 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849753.8014958, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.936 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Started (Lifecycle Event)#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.964 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:53 np0005603622 nova_compute[221301]: 2026-01-31 08:55:53.967 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.224 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849754.2236252, 13eea30b-7972-4da3-a3dd-8d068b401d22 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.225 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] VM Started (Lifecycle Event)#033[00m
Jan 31 03:55:54 np0005603622 podman[294757]: 2026-01-31 08:55:54.241933223 +0000 UTC m=+0.050676537 container create 980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.260 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.265 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849754.2237928, 13eea30b-7972-4da3-a3dd-8d068b401d22 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.265 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:55:54 np0005603622 systemd[1]: Started libpod-conmon-980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1.scope.
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.298 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.302 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:54 np0005603622 podman[294757]: 2026-01-31 08:55:54.212908756 +0000 UTC m=+0.021652100 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:55:54 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:55:54 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/919da9e74d20990373f80da3e5e4b8056da418ab3a1f32f19e6999fca16181f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:55:54 np0005603622 podman[294757]: 2026-01-31 08:55:54.333599077 +0000 UTC m=+0.142342411 container init 980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 03:55:54 np0005603622 podman[294757]: 2026-01-31 08:55:54.339258849 +0000 UTC m=+0.148002173 container start 980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.342 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:55:54 np0005603622 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[294774]: [NOTICE]   (294778) : New worker (294780) forked
Jan 31 03:55:54 np0005603622 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[294774]: [NOTICE]   (294778) : Loading success.
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.405 221324 DEBUG nova.compute.manager [req-0bd552cc-b5b5-4288-91f3-1e7755a1e397 req-4d3b67c7-e6e9-45be-95b0-75ade3d34c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.405 221324 DEBUG oslo_concurrency.lockutils [req-0bd552cc-b5b5-4288-91f3-1e7755a1e397 req-4d3b67c7-e6e9-45be-95b0-75ade3d34c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.405 221324 DEBUG oslo_concurrency.lockutils [req-0bd552cc-b5b5-4288-91f3-1e7755a1e397 req-4d3b67c7-e6e9-45be-95b0-75ade3d34c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.406 221324 DEBUG oslo_concurrency.lockutils [req-0bd552cc-b5b5-4288-91f3-1e7755a1e397 req-4d3b67c7-e6e9-45be-95b0-75ade3d34c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.406 221324 DEBUG nova.compute.manager [req-0bd552cc-b5b5-4288-91f3-1e7755a1e397 req-4d3b67c7-e6e9-45be-95b0-75ade3d34c09 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Processing event network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.406 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.411 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849754.4111025, 13eea30b-7972-4da3-a3dd-8d068b401d22 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.411 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.413 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.418 221324 INFO nova.virt.libvirt.driver [-] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Instance spawned successfully.#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.420 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.473 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.481 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.484 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.484 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.485 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.485 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.486 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.486 221324 DEBUG nova.virt.libvirt.driver [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:55:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:55:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:54.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.511 221324 INFO nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Creating config drive at /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/disk.config#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.516 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptdrky_qv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.570 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.631 221324 DEBUG nova.network.neutron [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated VIF entry in instance network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.631 221324 DEBUG nova.network.neutron [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.652 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptdrky_qv" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.684 221324 DEBUG nova.storage.rbd_utils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] rbd image 272230f9-fca3-4297-95c2-904c72ad5988_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.688 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/disk.config 272230f9-fca3-4297-95c2-904c72ad5988_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.715 221324 INFO nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Took 10.36 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.716 221324 DEBUG nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.717 221324 DEBUG oslo_concurrency.lockutils [req-acef544f-1cc8-4e22-a6f2-2966fc159e9b req-1ec87040-78bc-4d99-8711-64b5ebf22fd4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.837 221324 INFO nova.compute.manager [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Took 13.49 seconds to build instance.#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.842 221324 DEBUG oslo_concurrency.processutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/disk.config 272230f9-fca3-4297-95c2-904c72ad5988_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.842 221324 INFO nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Deleting local config drive /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988/disk.config because it was imported into RBD.#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.878 221324 DEBUG oslo_concurrency.lockutils [None req-9e572327-0636-4d92-ab4f-4e60823e6082 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:54 np0005603622 NetworkManager[49080]: <info>  [1769849754.8826] manager: (tap477ad4f0-25): new Tun device (/org/freedesktop/NetworkManager/Devices/377)
Jan 31 03:55:54 np0005603622 kernel: tap477ad4f0-25: entered promiscuous mode
Jan 31 03:55:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:54Z|00847|binding|INFO|Claiming lport 477ad4f0-2518-4735-b590-76e12526b832 for this chassis.
Jan 31 03:55:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:54Z|00848|binding|INFO|477ad4f0-2518-4735-b590-76e12526b832: Claiming fa:16:3e:20:74:a3 10.100.0.10
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.891 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:54 np0005603622 NetworkManager[49080]: <info>  [1769849754.8963] device (tap477ad4f0-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:55:54 np0005603622 NetworkManager[49080]: <info>  [1769849754.8971] device (tap477ad4f0-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:55:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:54Z|00849|binding|INFO|Setting lport 477ad4f0-2518-4735-b590-76e12526b832 ovn-installed in OVS
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.898 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:54 np0005603622 nova_compute[221301]: 2026-01-31 08:55:54.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:54 np0005603622 systemd-machined[190406]: New machine qemu-89-instance-000000b5.
Jan 31 03:55:54 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:54Z|00850|binding|INFO|Setting lport 477ad4f0-2518-4735-b590-76e12526b832 up in Southbound
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.930 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:74:a3 10.100.0.10'], port_security=['fa:16:3e:20:74:a3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '272230f9-fca3-4297-95c2-904c72ad5988', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ea26f6f-6cf1-4f02-9a47-cac3e89cac48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5925fbf-71fd-4427-a43f-27f10b46d74c, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=477ad4f0-2518-4735-b590-76e12526b832) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.932 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 477ad4f0-2518-4735-b590-76e12526b832 in datapath cc3d0342-b89d-4e6c-9aa7-8542822eeddd bound to our chassis#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.934 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc3d0342-b89d-4e6c-9aa7-8542822eeddd#033[00m
Jan 31 03:55:54 np0005603622 systemd[1]: Started Virtual Machine qemu-89-instance-000000b5.
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.945 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ebafe8b1-0c5a-4418-ba9c-4d8d43f07b41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.946 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc3d0342-b1 in ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.948 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc3d0342-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.948 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9a66970b-bad5-4803-89a1-58014b207388]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.949 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[55da0e0e-b531-4b2c-9291-7bf61c9e384e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.957 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[620fe386-c172-4d2f-ac59-c7ee6d170ee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:54.967 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[49e6c83f-85d9-42cf-a5f7-d5895bc2cb48]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.000 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3152452d-77e4-495d-b96c-f648defecbc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.004 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[950d4151-69ac-4dc8-b413-cc39c9f82f82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 NetworkManager[49080]: <info>  [1769849755.0071] manager: (tapcc3d0342-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/378)
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.026 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[98f0e9c8-3d40-4567-af54-3d05e74f770f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.030 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[52db7854-c7e5-4e26-a1e3-02177472085e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 NetworkManager[49080]: <info>  [1769849755.0480] device (tapcc3d0342-b0): carrier: link connected
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.053 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[57158a0c-3de4-441d-b4b8-0c9f3e6396e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.077 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9e372b0d-b85e-4c51-bc7b-7aa47a7a6605]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3d0342-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:23:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900638, 'reachable_time': 33943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294859, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.092 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2444515c-f067-4d21-bfb8-dd732a095ed1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:2369'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 900638, 'tstamp': 900638}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294860, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.107 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[47e21a6b-9f4f-469b-b9b0-9cbb9c07eecf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3d0342-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:23:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900638, 'reachable_time': 33943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294861, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.141 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f053c8da-5972-42bc-bee9-7e22961ac7c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.194 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[68c391e1-7d7f-42a6-b629-1d3da9ff345f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.198 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3d0342-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.198 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.199 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc3d0342-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.200 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:55 np0005603622 kernel: tapcc3d0342-b0: entered promiscuous mode
Jan 31 03:55:55 np0005603622 NetworkManager[49080]: <info>  [1769849755.2012] manager: (tapcc3d0342-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.202 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.204 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc3d0342-b0, col_values=(('external_ids', {'iface-id': '3fd8b9be-ff5e-4117-a28f-60f54d1fb938'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.205 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.206 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:55 np0005603622 ovn_controller[130109]: 2026-01-31T08:55:55Z|00851|binding|INFO|Releasing lport 3fd8b9be-ff5e-4117-a28f-60f54d1fb938 from this chassis (sb_readonly=0)
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.207 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.207 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9912e4ca-04f7-45dc-94c2-d9ecb336cb19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.208 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-cc3d0342-b89d-4e6c-9aa7-8542822eeddd
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID cc3d0342-b89d-4e6c-9aa7-8542822eeddd
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:55:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:55:55.209 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'env', 'PROCESS_TAG=haproxy-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.211 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.486 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849755.4856722, 272230f9-fca3-4297-95c2-904c72ad5988 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.486 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] VM Started (Lifecycle Event)#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.510 221324 DEBUG nova.network.neutron [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updated VIF entry in instance network info cache for port 477ad4f0-2518-4735-b590-76e12526b832. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.511 221324 DEBUG nova.network.neutron [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updating instance_info_cache with network_info: [{"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.513 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.516 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849755.4868195, 272230f9-fca3-4297-95c2-904c72ad5988 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.517 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.531 221324 DEBUG oslo_concurrency.lockutils [req-c523a82c-2bcd-4225-854c-c9c63f4dfbd6 req-da7bca04-7ec1-4b59-853f-1936f3de2a59 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:55:55 np0005603622 podman[294935]: 2026-01-31 08:55:55.537671701 +0000 UTC m=+0.048273943 container create 06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.547 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.552 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:55:55 np0005603622 systemd[1]: Started libpod-conmon-06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8.scope.
Jan 31 03:55:55 np0005603622 nova_compute[221301]: 2026-01-31 08:55:55.597 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:55:55 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:55:55 np0005603622 podman[294935]: 2026-01-31 08:55:55.509258781 +0000 UTC m=+0.019861043 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:55:55 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b44ef2a833769a2c39df461bff6ba8f8b665a26c591ef1a443073983f0b83c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:55:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:55:55 np0005603622 podman[294935]: 2026-01-31 08:55:55.623038857 +0000 UTC m=+0.133641129 container init 06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:55:55 np0005603622 podman[294935]: 2026-01-31 08:55:55.628493883 +0000 UTC m=+0.139096135 container start 06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:55:55 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[294950]: [NOTICE]   (294954) : New worker (294956) forked
Jan 31 03:55:55 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[294950]: [NOTICE]   (294954) : Loading success.
Jan 31 03:55:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:55.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:56.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.547 221324 DEBUG nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.548 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.548 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.548 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.549 221324 DEBUG nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] No waiting events found dispatching network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.549 221324 WARNING nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received unexpected event network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f for instance with vm_state active and task_state None.#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.549 221324 DEBUG nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.549 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.549 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.550 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.550 221324 DEBUG nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Processing event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.550 221324 DEBUG nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.550 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.550 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.551 221324 DEBUG oslo_concurrency.lockutils [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.551 221324 DEBUG nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] No waiting events found dispatching network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.551 221324 WARNING nova.compute.manager [req-f01f7f76-6cdc-4f98-8b89-c794a3590dcf req-4b34fb04-8783-4039-a21c-313788a5a510 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received unexpected event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 for instance with vm_state building and task_state spawning.#033[00m
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.552 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.555 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849756.5549667, 272230f9-fca3-4297-95c2-904c72ad5988 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.555 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] VM Resumed (Lifecycle Event)
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.557 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.560 221324 INFO nova.virt.libvirt.driver [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Instance spawned successfully.
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.560 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.589 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.594 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.597 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.597 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.598 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.598 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.599 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.599 221324 DEBUG nova.virt.libvirt.driver [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:55:56 np0005603622 nova_compute[221301]: 2026-01-31 08:55:56.728 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:55:57 np0005603622 nova_compute[221301]: 2026-01-31 08:55:57.019 221324 INFO nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Took 13.64 seconds to spawn the instance on the hypervisor.
Jan 31 03:55:57 np0005603622 nova_compute[221301]: 2026-01-31 08:55:57.019 221324 DEBUG nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:55:57 np0005603622 nova_compute[221301]: 2026-01-31 08:55:57.166 221324 INFO nova.compute.manager [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Took 15.78 seconds to build instance.
Jan 31 03:55:57 np0005603622 nova_compute[221301]: 2026-01-31 08:55:57.222 221324 DEBUG oslo_concurrency.lockutils [None req-282d206b-a262-4b1d-8e39-b9540d74dcb7 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:55:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:57.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:57 np0005603622 nova_compute[221301]: 2026-01-31 08:55:57.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:55:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:55:58.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:55:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:55:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:55:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:55:59.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:00 np0005603622 nova_compute[221301]: 2026-01-31 08:56:00.069 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:00.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:01 np0005603622 nova_compute[221301]: 2026-01-31 08:56:01.086 221324 DEBUG nova.compute.manager [req-af5c8505-2725-4412-978c-57beedd8cda4 req-af3e4660-732d-4ab1-a3e9-e1593483b1db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:56:01 np0005603622 nova_compute[221301]: 2026-01-31 08:56:01.086 221324 DEBUG oslo_concurrency.lockutils [req-af5c8505-2725-4412-978c-57beedd8cda4 req-af3e4660-732d-4ab1-a3e9-e1593483b1db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:56:01 np0005603622 nova_compute[221301]: 2026-01-31 08:56:01.087 221324 DEBUG oslo_concurrency.lockutils [req-af5c8505-2725-4412-978c-57beedd8cda4 req-af3e4660-732d-4ab1-a3e9-e1593483b1db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:56:01 np0005603622 nova_compute[221301]: 2026-01-31 08:56:01.087 221324 DEBUG oslo_concurrency.lockutils [req-af5c8505-2725-4412-978c-57beedd8cda4 req-af3e4660-732d-4ab1-a3e9-e1593483b1db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:01 np0005603622 nova_compute[221301]: 2026-01-31 08:56:01.087 221324 DEBUG nova.compute.manager [req-af5c8505-2725-4412-978c-57beedd8cda4 req-af3e4660-732d-4ab1-a3e9-e1593483b1db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:56:01 np0005603622 nova_compute[221301]: 2026-01-31 08:56:01.087 221324 WARNING nova.compute.manager [req-af5c8505-2725-4412-978c-57beedd8cda4 req-af3e4660-732d-4ab1-a3e9-e1593483b1db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state active and task_state None.
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #157. Immutable memtables: 0.
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.546196) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 99] Flushing memtable with next log file: 157
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761546233, "job": 99, "event": "flush_started", "num_memtables": 1, "num_entries": 1044, "num_deletes": 251, "total_data_size": 1974459, "memory_usage": 2004680, "flush_reason": "Manual Compaction"}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 99] Level-0 flush table #158: started
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761552197, "cf_name": "default", "job": 99, "event": "table_file_creation", "file_number": 158, "file_size": 1301000, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 76089, "largest_seqno": 77128, "table_properties": {"data_size": 1296511, "index_size": 2076, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10572, "raw_average_key_size": 19, "raw_value_size": 1287175, "raw_average_value_size": 2419, "num_data_blocks": 92, "num_entries": 532, "num_filter_entries": 532, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849688, "oldest_key_time": 1769849688, "file_creation_time": 1769849761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 158, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 99] Flush lasted 6045 microseconds, and 2795 cpu microseconds.
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.552234) [db/flush_job.cc:967] [default] [JOB 99] Level-0 flush table #158: 1301000 bytes OK
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.552254) [db/memtable_list.cc:519] [default] Level-0 commit table #158 started
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.554456) [db/memtable_list.cc:722] [default] Level-0 commit table #158: memtable #1 done
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.554472) EVENT_LOG_v1 {"time_micros": 1769849761554466, "job": 99, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.554490) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 99] Try to delete WAL files size 1969348, prev total WAL file size 1969348, number of live WAL files 2.
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000154.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.554977) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036353236' seq:72057594037927935, type:22 .. '7061786F730036373738' seq:0, type:0; will stop at (end)
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 100] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 99 Base level 0, inputs: [158(1270KB)], [156(10MB)]
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761555041, "job": 100, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [158], "files_L6": [156], "score": -1, "input_data_size": 12292988, "oldest_snapshot_seqno": -1}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 100] Generated table #159: 9515 keys, 10423848 bytes, temperature: kUnknown
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761622927, "cf_name": "default", "job": 100, "event": "table_file_creation", "file_number": 159, "file_size": 10423848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10365155, "index_size": 33812, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23813, "raw_key_size": 252189, "raw_average_key_size": 26, "raw_value_size": 10201348, "raw_average_value_size": 1072, "num_data_blocks": 1274, "num_entries": 9515, "num_filter_entries": 9515, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 159, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.623202) [db/compaction/compaction_job.cc:1663] [default] [JOB 100] Compacted 1@0 + 1@6 files to L6 => 10423848 bytes
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.624839) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.8 rd, 153.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 10.5 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(17.5) write-amplify(8.0) OK, records in: 10034, records dropped: 519 output_compression: NoCompression
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.624857) EVENT_LOG_v1 {"time_micros": 1769849761624849, "job": 100, "event": "compaction_finished", "compaction_time_micros": 67980, "compaction_time_cpu_micros": 26951, "output_level": 6, "num_output_files": 1, "total_output_size": 10423848, "num_input_records": 10034, "num_output_records": 9515, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000158.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761625070, "job": 100, "event": "table_file_deletion", "file_number": 158}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000156.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849761626021, "job": 100, "event": "table_file_deletion", "file_number": 156}
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.554875) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.626049) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.626053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.626055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.626057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:56:01.626059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:56:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:01.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:02.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:02 np0005603622 nova_compute[221301]: 2026-01-31 08:56:02.977 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:03 np0005603622 nova_compute[221301]: 2026-01-31 08:56:03.417 221324 DEBUG nova.compute.manager [req-632915b0-04b1-4bf3-857c-1718c4208240 req-025382ca-934c-4f58-9a28-0bf02bf890d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:56:03 np0005603622 nova_compute[221301]: 2026-01-31 08:56:03.417 221324 DEBUG oslo_concurrency.lockutils [req-632915b0-04b1-4bf3-857c-1718c4208240 req-025382ca-934c-4f58-9a28-0bf02bf890d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:56:03 np0005603622 nova_compute[221301]: 2026-01-31 08:56:03.417 221324 DEBUG oslo_concurrency.lockutils [req-632915b0-04b1-4bf3-857c-1718c4208240 req-025382ca-934c-4f58-9a28-0bf02bf890d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:56:03 np0005603622 nova_compute[221301]: 2026-01-31 08:56:03.417 221324 DEBUG oslo_concurrency.lockutils [req-632915b0-04b1-4bf3-857c-1718c4208240 req-025382ca-934c-4f58-9a28-0bf02bf890d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:03 np0005603622 nova_compute[221301]: 2026-01-31 08:56:03.418 221324 DEBUG nova.compute.manager [req-632915b0-04b1-4bf3-857c-1718c4208240 req-025382ca-934c-4f58-9a28-0bf02bf890d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:56:03 np0005603622 nova_compute[221301]: 2026-01-31 08:56:03.418 221324 WARNING nova.compute.manager [req-632915b0-04b1-4bf3-857c-1718c4208240 req-025382ca-934c-4f58-9a28-0bf02bf890d3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state active and task_state None.
Jan 31 03:56:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:03.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:04.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:05 np0005603622 nova_compute[221301]: 2026-01-31 08:56:05.111 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:05 np0005603622 nova_compute[221301]: 2026-01-31 08:56:05.416 221324 DEBUG nova.compute.manager [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-changed-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:56:05 np0005603622 nova_compute[221301]: 2026-01-31 08:56:05.417 221324 DEBUG nova.compute.manager [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Refreshing instance network info cache due to event network-changed-477ad4f0-2518-4735-b590-76e12526b832. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:56:05 np0005603622 nova_compute[221301]: 2026-01-31 08:56:05.418 221324 DEBUG oslo_concurrency.lockutils [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:56:05 np0005603622 nova_compute[221301]: 2026-01-31 08:56:05.418 221324 DEBUG oslo_concurrency.lockutils [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:56:05 np0005603622 nova_compute[221301]: 2026-01-31 08:56:05.418 221324 DEBUG nova.network.neutron [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Refreshing network info cache for port 477ad4f0-2518-4735-b590-76e12526b832 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:56:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:05 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:05.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:06 np0005603622 nova_compute[221301]: 2026-01-31 08:56:06.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:56:06 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Jan 31 03:56:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:06.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:56:07 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:07Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:41:c4 10.100.0.13
Jan 31 03:56:07 np0005603622 nova_compute[221301]: 2026-01-31 08:56:07.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:07 np0005603622 nova_compute[221301]: 2026-01-31 08:56:07.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:07 np0005603622 nova_compute[221301]: 2026-01-31 08:56:07.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:56:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:07.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:07 np0005603622 nova_compute[221301]: 2026-01-31 08:56:07.978 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:08.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:08 np0005603622 nova_compute[221301]: 2026-01-31 08:56:08.587 221324 DEBUG nova.network.neutron [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updated VIF entry in instance network info cache for port 477ad4f0-2518-4735-b590-76e12526b832. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:56:08 np0005603622 nova_compute[221301]: 2026-01-31 08:56:08.588 221324 DEBUG nova.network.neutron [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updating instance_info_cache with network_info: [{"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:08 np0005603622 nova_compute[221301]: 2026-01-31 08:56:08.625 221324 DEBUG oslo_concurrency.lockutils [req-a0971e5c-b325-49ff-93d7-519d79c9f83d req-449c29c6-56c1-43b6-9b33-47dc85fa395a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:56:09 np0005603622 nova_compute[221301]: 2026-01-31 08:56:09.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:09.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.113 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:56:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:10.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.792 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.792 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.792 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:56:10 np0005603622 nova_compute[221301]: 2026-01-31 08:56:10.793 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:11Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:3c:68 10.100.0.9
Jan 31 03:56:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:11Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:3c:68 10.100.0.9
Jan 31 03:56:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:11Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:20:74:a3 10.100.0.10
Jan 31 03:56:11 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:11Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:20:74:a3 10.100.0.10
Jan 31 03:56:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:11.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:12.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.676 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [{"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.784 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-c4e77744-5254-4a68-8cdf-49fce00d7b32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.785 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.785 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.812 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.812 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.813 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.813 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.814 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:12 np0005603622 nova_compute[221301]: 2026-01-31 08:56:12.980 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:56:13 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2449940771' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:56:13 np0005603622 nova_compute[221301]: 2026-01-31 08:56:13.256 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:13.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:13 np0005603622 nova_compute[221301]: 2026-01-31 08:56:13.975 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:13 np0005603622 nova_compute[221301]: 2026-01-31 08:56:13.976 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:13 np0005603622 nova_compute[221301]: 2026-01-31 08:56:13.981 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:13 np0005603622 nova_compute[221301]: 2026-01-31 08:56:13.982 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:13 np0005603622 nova_compute[221301]: 2026-01-31 08:56:13.987 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:13 np0005603622 nova_compute[221301]: 2026-01-31 08:56:13.987 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.152 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.153 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3687MB free_disk=20.887706756591797GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.153 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.153 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.216 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.217 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.218 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.218 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.218 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.219 221324 INFO nova.compute.manager [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Terminating instance#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.220 221324 DEBUG nova.compute.manager [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:56:14 np0005603622 kernel: tap004785c1-d0 (unregistering): left promiscuous mode
Jan 31 03:56:14 np0005603622 NetworkManager[49080]: <info>  [1769849774.3466] device (tap004785c1-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:56:14 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:14Z|00852|binding|INFO|Releasing lport 004785c1-d0dc-4842-b644-9171e9d5b931 from this chassis (sb_readonly=0)
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.353 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:14 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:14Z|00853|binding|INFO|Setting lport 004785c1-d0dc-4842-b644-9171e9d5b931 down in Southbound
Jan 31 03:56:14 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:14Z|00854|binding|INFO|Removing iface tap004785c1-d0 ovn-installed in OVS
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.359 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.368 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:14 np0005603622 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Jan 31 03:56:14 np0005603622 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b0.scope: Consumed 13.223s CPU time.
Jan 31 03:56:14 np0005603622 systemd-machined[190406]: Machine qemu-87-instance-000000b0 terminated.
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.401 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance c4e77744-5254-4a68-8cdf-49fce00d7b32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.401 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 13eea30b-7972-4da3-a3dd-8d068b401d22 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.401 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 272230f9-fca3-4297-95c2-904c72ad5988 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.402 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.402 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:56:14 np0005603622 podman[295290]: 2026-01-31 08:56:14.42703955 +0000 UTC m=+0.047798560 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.439 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.448 221324 INFO nova.virt.libvirt.driver [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Instance destroyed successfully.#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.449 221324 DEBUG nova.objects.instance [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lazy-loading 'resources' on Instance uuid c4e77744-5254-4a68-8cdf-49fce00d7b32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:14 np0005603622 podman[295287]: 2026-01-31 08:56:14.452097411 +0000 UTC m=+0.082665803 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:56:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:14.487 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:41:c4 10.100.0.13'], port_security=['fa:16:3e:ba:41:c4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c4e77744-5254-4a68-8cdf-49fce00d7b32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1b5345e-f6dc-4309-b059-80678428d42d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab2d642eb03c4bda84a9a23e86f1fa4d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f1d32906-efe8-4331-baa1-63435daca045', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a496b319-a305-495c-a6a2-a324cd91f494, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=004785c1-d0dc-4842-b644-9171e9d5b931) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:14.488 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 004785c1-d0dc-4842-b644-9171e9d5b931 in datapath a1b5345e-f6dc-4309-b059-80678428d42d unbound from our chassis#033[00m
Jan 31 03:56:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:14.489 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1b5345e-f6dc-4309-b059-80678428d42d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:56:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:14.490 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9abf05-2e8a-42e9-abd4-00b7191264ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:14 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:14.491 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d namespace which is not needed anymore#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.510 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:14.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1036450406' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1036450406' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.757 221324 DEBUG nova.virt.libvirt.vif [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:54:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1705915819',display_name='tempest-AttachVolumeTestJSON-server-1705915819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1705915819',id=176,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLXALskIBvR8mjqSucneACaN6SAmNedlrxb0Dd0NcM4wwsoB6wxeLcuOjyvqb/HAykhEoHbt5I2YXGA4qhIeyI88PasvHMQeXFyb1fzm1IjR9i+9aAFT+n9h9ckpCcKUBw==',key_name='tempest-keypair-1731535286',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:54:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ab2d642eb03c4bda84a9a23e86f1fa4d',ramdisk_id='',reservation_id='r-87fcpzwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-1437067745',owner_user_name='tempest-AttachVolumeTestJSON-1437067745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53804fd0f3a14f95a4955e3bc6dcc8cb',uuid=c4e77744-5254-4a68-8cdf-49fce00d7b32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.759 221324 DEBUG nova.network.os_vif_util [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converting VIF {"id": "004785c1-d0dc-4842-b644-9171e9d5b931", "address": "fa:16:3e:ba:41:c4", "network": {"id": "a1b5345e-f6dc-4309-b059-80678428d42d", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-557627509-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab2d642eb03c4bda84a9a23e86f1fa4d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap004785c1-d0", "ovs_interfaceid": "004785c1-d0dc-4842-b644-9171e9d5b931", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.760 221324 DEBUG nova.network.os_vif_util [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.761 221324 DEBUG os_vif [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:56:14 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[294622]: [NOTICE]   (294639) : haproxy version is 2.8.14-c23fe91
Jan 31 03:56:14 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[294622]: [NOTICE]   (294639) : path to executable is /usr/sbin/haproxy
Jan 31 03:56:14 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[294622]: [WARNING]  (294639) : Exiting Master process...
Jan 31 03:56:14 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[294622]: [ALERT]    (294639) : Current worker (294641) exited with code 143 (Terminated)
Jan 31 03:56:14 np0005603622 neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d[294622]: [WARNING]  (294639) : All workers exited. Exiting... (0)
Jan 31 03:56:14 np0005603622 systemd[1]: libpod-ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e.scope: Deactivated successfully.
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.765 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.765 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap004785c1-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.768 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:14 np0005603622 podman[295366]: 2026-01-31 08:56:14.772595091 +0000 UTC m=+0.211946915 container died ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.774 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.777 221324 INFO os_vif [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:41:c4,bridge_name='br-int',has_traffic_filtering=True,id=004785c1-d0dc-4842-b644-9171e9d5b931,network=Network(a1b5345e-f6dc-4309-b059-80678428d42d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap004785c1-d0')#033[00m
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:56:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3376089257' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.991 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:14 np0005603622 nova_compute[221301]: 2026-01-31 08:56:14.996 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:56:15 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e-userdata-shm.mount: Deactivated successfully.
Jan 31 03:56:15 np0005603622 systemd[1]: var-lib-containers-storage-overlay-823846a325f7d3393dd9d322bfe3dec7dc3d9e48557ffdd9f0114eab0e245322-merged.mount: Deactivated successfully.
Jan 31 03:56:15 np0005603622 nova_compute[221301]: 2026-01-31 08:56:15.116 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:15 np0005603622 nova_compute[221301]: 2026-01-31 08:56:15.245 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:56:15 np0005603622 podman[295366]: 2026-01-31 08:56:15.420756903 +0000 UTC m=+0.860108747 container cleanup ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:56:15 np0005603622 systemd[1]: libpod-conmon-ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e.scope: Deactivated successfully.
Jan 31 03:56:15 np0005603622 podman[295437]: 2026-01-31 08:56:15.604605024 +0000 UTC m=+0.168736628 container remove ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.609 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[657c876f-3f55-4a7e-994d-dc674ce694c4]: (4, ('Sat Jan 31 08:56:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e)\nace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e\nSat Jan 31 08:56:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d (ace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e)\nace9f0dcb5d25e1b448c2e98b61cb507b7c0ce1fed4f4223fa59aecadcad940e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.611 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4cfe81e7-76cf-4710-b919-a6d8a504a37f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.611 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1b5345e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:15 np0005603622 nova_compute[221301]: 2026-01-31 08:56:15.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:15 np0005603622 kernel: tapa1b5345e-f0: left promiscuous mode
Jan 31 03:56:15 np0005603622 nova_compute[221301]: 2026-01-31 08:56:15.676 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.678 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c5822a8b-375d-4afa-a3b0-23bdbdb40a27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.698 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2d6ca1-56b7-4a0d-9509-fd5f7c6f243a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.700 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f9919066-e390-4c71-9f34-cc67f32df56d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:15 np0005603622 nova_compute[221301]: 2026-01-31 08:56:15.704 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:56:15 np0005603622 nova_compute[221301]: 2026-01-31 08:56:15.704 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.719 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[12a8e63f-6e10-4650-8229-040bf3d7b202]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900426, 'reachable_time': 15947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295452, 'error': None, 'target': 'ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:15 np0005603622 systemd[1]: run-netns-ovnmeta\x2da1b5345e\x2df6dc\x2d4309\x2db059\x2d80678428d42d.mount: Deactivated successfully.
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.722 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1b5345e-f6dc-4309-b059-80678428d42d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:56:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:15.722 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d5bc047d-2028-4c4c-959e-a424db7993f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:15.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:16 np0005603622 nova_compute[221301]: 2026-01-31 08:56:16.314 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:16 np0005603622 nova_compute[221301]: 2026-01-31 08:56:16.314 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:16.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:16 np0005603622 nova_compute[221301]: 2026-01-31 08:56:16.665 221324 INFO nova.virt.libvirt.driver [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Deleting instance files /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32_del#033[00m
Jan 31 03:56:16 np0005603622 nova_compute[221301]: 2026-01-31 08:56:16.666 221324 INFO nova.virt.libvirt.driver [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Deletion of /var/lib/nova/instances/c4e77744-5254-4a68-8cdf-49fce00d7b32_del complete#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.257 221324 INFO nova.compute.manager [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Took 3.04 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.258 221324 DEBUG oslo.service.loopingcall [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.259 221324 DEBUG nova.compute.manager [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.260 221324 DEBUG nova.network.neutron [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.267 221324 DEBUG nova.compute.manager [req-49692e4e-cbd9-4984-9736-e25fd371d6a5 req-df3bb1c2-3dbe-41f9-baf7-e934a60f89a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.268 221324 DEBUG oslo_concurrency.lockutils [req-49692e4e-cbd9-4984-9736-e25fd371d6a5 req-df3bb1c2-3dbe-41f9-baf7-e934a60f89a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.269 221324 DEBUG oslo_concurrency.lockutils [req-49692e4e-cbd9-4984-9736-e25fd371d6a5 req-df3bb1c2-3dbe-41f9-baf7-e934a60f89a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.269 221324 DEBUG oslo_concurrency.lockutils [req-49692e4e-cbd9-4984-9736-e25fd371d6a5 req-df3bb1c2-3dbe-41f9-baf7-e934a60f89a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.269 221324 DEBUG nova.compute.manager [req-49692e4e-cbd9-4984-9736-e25fd371d6a5 req-df3bb1c2-3dbe-41f9-baf7-e934a60f89a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.270 221324 DEBUG nova.compute.manager [req-49692e4e-cbd9-4984-9736-e25fd371d6a5 req-df3bb1c2-3dbe-41f9-baf7-e934a60f89a0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-unplugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.478 221324 INFO nova.compute.manager [None req-6fdc2053-8b36-4101-9d1e-7ff9cf1ef1b0 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Get console output#033[00m
Jan 31 03:56:17 np0005603622 nova_compute[221301]: 2026-01-31 08:56:17.485 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:56:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:17.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:18 np0005603622 nova_compute[221301]: 2026-01-31 08:56:18.242 221324 DEBUG nova.objects.instance [None req-ec9af17e-0016-4e93-bfff-4d6ea6da7947 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'pci_devices' on Instance uuid 272230f9-fca3-4297-95c2-904c72ad5988 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:18 np0005603622 nova_compute[221301]: 2026-01-31 08:56:18.384 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849778.383978, 272230f9-fca3-4297-95c2-904c72ad5988 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:56:18 np0005603622 nova_compute[221301]: 2026-01-31 08:56:18.384 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:56:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:18.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:18 np0005603622 nova_compute[221301]: 2026-01-31 08:56:18.579 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:18 np0005603622 nova_compute[221301]: 2026-01-31 08:56:18.586 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:56:18 np0005603622 nova_compute[221301]: 2026-01-31 08:56:18.728 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 31 03:56:19 np0005603622 kernel: tap477ad4f0-25 (unregistering): left promiscuous mode
Jan 31 03:56:19 np0005603622 NetworkManager[49080]: <info>  [1769849779.0978] device (tap477ad4f0-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00855|binding|INFO|Releasing lport 477ad4f0-2518-4735-b590-76e12526b832 from this chassis (sb_readonly=0)
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.109 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00856|binding|INFO|Setting lport 477ad4f0-2518-4735-b590-76e12526b832 down in Southbound
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00857|binding|INFO|Removing iface tap477ad4f0-25 ovn-installed in OVS
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.112 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.118 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.138 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:74:a3 10.100.0.10'], port_security=['fa:16:3e:20:74:a3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '272230f9-fca3-4297-95c2-904c72ad5988', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ea26f6f-6cf1-4f02-9a47-cac3e89cac48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5925fbf-71fd-4427-a43f-27f10b46d74c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=477ad4f0-2518-4735-b590-76e12526b832) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.139 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 477ad4f0-2518-4735-b590-76e12526b832 in datapath cc3d0342-b89d-4e6c-9aa7-8542822eeddd unbound from our chassis#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.140 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3d0342-b89d-4e6c-9aa7-8542822eeddd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.141 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7e3cb8-60d5-4f4e-9000-d50f32591c87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.142 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd namespace which is not needed anymore#033[00m
Jan 31 03:56:19 np0005603622 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Jan 31 03:56:19 np0005603622 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b5.scope: Consumed 14.423s CPU time.
Jan 31 03:56:19 np0005603622 systemd-machined[190406]: Machine qemu-89-instance-000000b5 terminated.
Jan 31 03:56:19 np0005603622 kernel: tap477ad4f0-25: entered promiscuous mode
Jan 31 03:56:19 np0005603622 NetworkManager[49080]: <info>  [1769849779.2599] manager: (tap477ad4f0-25): new Tun device (/org/freedesktop/NetworkManager/Devices/380)
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00858|binding|INFO|Claiming lport 477ad4f0-2518-4735-b590-76e12526b832 for this chassis.
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00859|binding|INFO|477ad4f0-2518-4735-b590-76e12526b832: Claiming fa:16:3e:20:74:a3 10.100.0.10
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.260 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 kernel: tap477ad4f0-25 (unregistering): left promiscuous mode
Jan 31 03:56:19 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[294950]: [NOTICE]   (294954) : haproxy version is 2.8.14-c23fe91
Jan 31 03:56:19 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[294950]: [NOTICE]   (294954) : path to executable is /usr/sbin/haproxy
Jan 31 03:56:19 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[294950]: [WARNING]  (294954) : Exiting Master process...
Jan 31 03:56:19 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[294950]: [ALERT]    (294954) : Current worker (294956) exited with code 143 (Terminated)
Jan 31 03:56:19 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[294950]: [WARNING]  (294954) : All workers exited. Exiting... (0)
Jan 31 03:56:19 np0005603622 systemd[1]: libpod-06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8.scope: Deactivated successfully.
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00860|binding|INFO|Setting lport 477ad4f0-2518-4735-b590-76e12526b832 ovn-installed in OVS
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00861|if_status|INFO|Dropped 2 log messages in last 561 seconds (most recently, 561 seconds ago) due to excessive rate
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00862|if_status|INFO|Not setting lport 477ad4f0-2518-4735-b590-76e12526b832 down as sb is readonly
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.274 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 podman[295482]: 2026-01-31 08:56:19.277400798 +0000 UTC m=+0.052316532 container died 06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.278 221324 DEBUG nova.compute.manager [None req-ec9af17e-0016-4e93-bfff-4d6ea6da7947 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:19 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8-userdata-shm.mount: Deactivated successfully.
Jan 31 03:56:19 np0005603622 systemd[1]: var-lib-containers-storage-overlay-5b44ef2a833769a2c39df461bff6ba8f8b665a26c591ef1a443073983f0b83c6-merged.mount: Deactivated successfully.
Jan 31 03:56:19 np0005603622 podman[295482]: 2026-01-31 08:56:19.318635362 +0000 UTC m=+0.093551096 container cleanup 06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 31 03:56:19 np0005603622 systemd[1]: libpod-conmon-06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8.scope: Deactivated successfully.
Jan 31 03:56:19 np0005603622 podman[295525]: 2026-01-31 08:56:19.401585222 +0000 UTC m=+0.061915558 container remove 06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.406 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[33a8e8b0-12d0-41b8-8a2e-d79903945813]: (4, ('Sat Jan 31 08:56:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd (06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8)\n06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8\nSat Jan 31 08:56:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd (06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8)\n06494d8631d94a445e6c3a38c07b9bc9e1c79ecd467f4b5498ae36e8bca7acb8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.409 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[87ca7c5c-760f-41b1-aa44-4117507978c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.410 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3d0342-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.413 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 kernel: tapcc3d0342-b0: left promiscuous mode
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.422 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.425 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6639981a-8f5e-48f8-b0c1-1ea668a5de5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.445 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4c42cf9c-57c4-436f-b771-55ffea825c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.446 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2726c49d-ac49-4691-af97-265ddb7960b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.464 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2e68b28f-a293-475e-b5d9-c2f8b71797eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900633, 'reachable_time': 25526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295543, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.467 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.467 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[b5489019-c614-48b0-a0a2-fbc6e59509f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 systemd[1]: run-netns-ovnmeta\x2dcc3d0342\x2db89d\x2d4e6c\x2d9aa7\x2d8542822eeddd.mount: Deactivated successfully.
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00863|binding|INFO|Releasing lport 477ad4f0-2518-4735-b590-76e12526b832 from this chassis (sb_readonly=0)
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.591 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:74:a3 10.100.0.10'], port_security=['fa:16:3e:20:74:a3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '272230f9-fca3-4297-95c2-904c72ad5988', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ea26f6f-6cf1-4f02-9a47-cac3e89cac48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5925fbf-71fd-4427-a43f-27f10b46d74c, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=477ad4f0-2518-4735-b590-76e12526b832) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.593 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 477ad4f0-2518-4735-b590-76e12526b832 in datapath cc3d0342-b89d-4e6c-9aa7-8542822eeddd bound to our chassis#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.594 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc3d0342-b89d-4e6c-9aa7-8542822eeddd#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.594 221324 DEBUG nova.compute.manager [req-01b3d6f7-8109-425b-a881-c6ce4d075c36 req-4ce03d94-9a83-4fbf-bd93-8ff947361e97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.595 221324 DEBUG oslo_concurrency.lockutils [req-01b3d6f7-8109-425b-a881-c6ce4d075c36 req-4ce03d94-9a83-4fbf-bd93-8ff947361e97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.597 221324 DEBUG oslo_concurrency.lockutils [req-01b3d6f7-8109-425b-a881-c6ce4d075c36 req-4ce03d94-9a83-4fbf-bd93-8ff947361e97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.597 221324 DEBUG oslo_concurrency.lockutils [req-01b3d6f7-8109-425b-a881-c6ce4d075c36 req-4ce03d94-9a83-4fbf-bd93-8ff947361e97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.598 221324 DEBUG nova.compute.manager [req-01b3d6f7-8109-425b-a881-c6ce4d075c36 req-4ce03d94-9a83-4fbf-bd93-8ff947361e97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] No waiting events found dispatching network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.599 221324 WARNING nova.compute.manager [req-01b3d6f7-8109-425b-a881-c6ce4d075c36 req-4ce03d94-9a83-4fbf-bd93-8ff947361e97 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received unexpected event network-vif-plugged-004785c1-d0dc-4842-b644-9171e9d5b931 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.599 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.606 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1944f150-55ac-4964-b2af-f31ebbf74f50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.607 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc3d0342-b1 in ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.609 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc3d0342-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.609 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3f704424-7401-4d95-960a-5f3cdde96ef1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.610 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0251ff4f-9717-42d8-af5b-491ba343e917]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.620 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[24b79a67-1d07-435d-9228-a01208e22de5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.632 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cbacb82a-68c4-48d6-8045-5d9e1a475461]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.651 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d3141e-24a8-402d-9cad-24f7ec1ee043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.656 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e9874785-a236-4dc7-9c57-27a945392c34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 systemd-udevd[295461]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:56:19 np0005603622 NetworkManager[49080]: <info>  [1769849779.6588] manager: (tapcc3d0342-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/381)
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.660 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:74:a3 10.100.0.10'], port_security=['fa:16:3e:20:74:a3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '272230f9-fca3-4297-95c2-904c72ad5988', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ea26f6f-6cf1-4f02-9a47-cac3e89cac48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5925fbf-71fd-4427-a43f-27f10b46d74c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=477ad4f0-2518-4735-b590-76e12526b832) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.684 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4b493d3b-8625-47a3-a418-30bd08043b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.688 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[eca454e1-965b-4e8d-b833-fbc0200280a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 NetworkManager[49080]: <info>  [1769849779.7033] device (tapcc3d0342-b0): carrier: link connected
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.705 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[053870e3-604a-4320-bab2-e0dbf2cf6b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.719 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bcdc9686-ad68-4f4a-9dcf-02f636ad0283]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3d0342-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:23:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903104, 'reachable_time': 17402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295570, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.740 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cb53a660-0271-4f10-b5cb-f1c4ee42f9e9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:2369'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 903104, 'tstamp': 903104}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295571, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.754 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7f771838-a252-42fe-905d-c92b92143673]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3d0342-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:23:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903104, 'reachable_time': 17402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 295572, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.768 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.784 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b163e7a1-94cb-4414-b1f7-7907005cf91f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.834 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c61c6f24-5017-4de2-a149-63cecbbac83a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.836 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3d0342-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.836 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.837 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc3d0342-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.839 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 NetworkManager[49080]: <info>  [1769849779.8406] manager: (tapcc3d0342-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/382)
Jan 31 03:56:19 np0005603622 kernel: tapcc3d0342-b0: entered promiscuous mode
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.843 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.844 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc3d0342-b0, col_values=(('external_ids', {'iface-id': '3fd8b9be-ff5e-4117-a28f-60f54d1fb938'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.846 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:19Z|00864|binding|INFO|Releasing lport 3fd8b9be-ff5e-4117-a28f-60f54d1fb938 from this chassis (sb_readonly=0)
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.853 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 nova_compute[221301]: 2026-01-31 08:56:19.857 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.858 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.861 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8e604159-89df-464e-a887-7d4eb15e978e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.861 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-cc3d0342-b89d-4e6c-9aa7-8542822eeddd
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID cc3d0342-b89d-4e6c-9aa7-8542822eeddd
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:56:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:19.862 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'env', 'PROCESS_TAG=haproxy-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:56:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:19.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:20 np0005603622 nova_compute[221301]: 2026-01-31 08:56:20.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:20 np0005603622 podman[295606]: 2026-01-31 08:56:20.246499371 +0000 UTC m=+0.042537959 container create e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:56:20 np0005603622 systemd[1]: Started libpod-conmon-e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10.scope.
Jan 31 03:56:20 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:56:20 np0005603622 podman[295606]: 2026-01-31 08:56:20.223398913 +0000 UTC m=+0.019437521 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:56:20 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ae2ebab7dcb16d856d1e5e6e4c72b4ed6ae3e453e0440a71c49e98af7e6bdc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:56:20 np0005603622 podman[295606]: 2026-01-31 08:56:20.338577716 +0000 UTC m=+0.134616414 container init e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:56:20 np0005603622 podman[295606]: 2026-01-31 08:56:20.346056417 +0000 UTC m=+0.142095025 container start e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:56:20 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295621]: [NOTICE]   (295625) : New worker (295627) forked
Jan 31 03:56:20 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295621]: [NOTICE]   (295625) : Loading success.
Jan 31 03:56:20 np0005603622 nova_compute[221301]: 2026-01-31 08:56:20.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:20 np0005603622 nova_compute[221301]: 2026-01-31 08:56:20.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.400 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 477ad4f0-2518-4735-b590-76e12526b832 in datapath cc3d0342-b89d-4e6c-9aa7-8542822eeddd unbound from our chassis#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.402 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3d0342-b89d-4e6c-9aa7-8542822eeddd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.403 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b715da-c2ee-4e2b-8d95-80c443c5dd78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.403 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd namespace which is not needed anymore#033[00m
Jan 31 03:56:20 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295621]: [NOTICE]   (295625) : haproxy version is 2.8.14-c23fe91
Jan 31 03:56:20 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295621]: [NOTICE]   (295625) : path to executable is /usr/sbin/haproxy
Jan 31 03:56:20 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295621]: [WARNING]  (295625) : Exiting Master process...
Jan 31 03:56:20 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295621]: [ALERT]    (295625) : Current worker (295627) exited with code 143 (Terminated)
Jan 31 03:56:20 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295621]: [WARNING]  (295625) : All workers exited. Exiting... (0)
Jan 31 03:56:20 np0005603622 systemd[1]: libpod-e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10.scope: Deactivated successfully.
Jan 31 03:56:20 np0005603622 podman[295655]: 2026-01-31 08:56:20.524316279 +0000 UTC m=+0.044947745 container died e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:56:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:20.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:20 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10-userdata-shm.mount: Deactivated successfully.
Jan 31 03:56:20 np0005603622 systemd[1]: var-lib-containers-storage-overlay-9ae2ebab7dcb16d856d1e5e6e4c72b4ed6ae3e453e0440a71c49e98af7e6bdc3-merged.mount: Deactivated successfully.
Jan 31 03:56:20 np0005603622 podman[295655]: 2026-01-31 08:56:20.564271998 +0000 UTC m=+0.084903454 container cleanup e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:56:20 np0005603622 systemd[1]: libpod-conmon-e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10.scope: Deactivated successfully.
Jan 31 03:56:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:20 np0005603622 podman[295684]: 2026-01-31 08:56:20.618813248 +0000 UTC m=+0.038181303 container remove e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.623 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[28e6f45a-6a2d-442e-ae2f-97aaa7ec37c7]: (4, ('Sat Jan 31 08:56:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd (e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10)\ne40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10\nSat Jan 31 08:56:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd (e40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10)\ne40f70ac2e7f8f32f03ae583de43470aa6d8677a0881ef0fedd88190bd810c10\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.626 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[46aa0c33-8771-4dbb-b624-bebc204a6d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.627 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3d0342-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:20 np0005603622 nova_compute[221301]: 2026-01-31 08:56:20.630 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:20 np0005603622 kernel: tapcc3d0342-b0: left promiscuous mode
Jan 31 03:56:20 np0005603622 nova_compute[221301]: 2026-01-31 08:56:20.642 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.646 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[431dce32-7942-4b30-94cf-b473b89821c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.666 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c11c283d-f921-4fda-9941-b08b5ace1ac8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.669 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ea6604-aea8-463e-a741-1ec4fe65f617]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.682 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[08255586-ae9b-4eb9-b0ff-422ff42a818b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903098, 'reachable_time': 23427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295703, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.685 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:56:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:20.685 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[2d94ac1b-68f0-42e5-bc97-ae56c9e81c14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:20 np0005603622 systemd[1]: run-netns-ovnmeta\x2dcc3d0342\x2db89d\x2d4e6c\x2d9aa7\x2d8542822eeddd.mount: Deactivated successfully.
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.338 221324 DEBUG nova.network.neutron [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.387 221324 DEBUG nova.compute.manager [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Received event network-vif-deleted-004785c1-d0dc-4842-b644-9171e9d5b931 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.388 221324 INFO nova.compute.manager [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Neutron deleted interface 004785c1-d0dc-4842-b644-9171e9d5b931; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.388 221324 DEBUG nova.network.neutron [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.470 221324 INFO nova.compute.manager [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Took 4.21 seconds to deallocate network for instance.#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.823 221324 DEBUG nova.compute.manager [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Detach interface failed, port_id=004785c1-d0dc-4842-b644-9171e9d5b931, reason: Instance c4e77744-5254-4a68-8cdf-49fce00d7b32 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.824 221324 DEBUG nova.compute.manager [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-unplugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.824 221324 DEBUG oslo_concurrency.lockutils [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.825 221324 DEBUG oslo_concurrency.lockutils [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.825 221324 DEBUG oslo_concurrency.lockutils [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.825 221324 DEBUG nova.compute.manager [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] No waiting events found dispatching network-vif-unplugged-477ad4f0-2518-4735-b590-76e12526b832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.825 221324 WARNING nova.compute.manager [req-0a55ac74-6ee8-4dd5-9f1c-3317531f75cc req-fc8754d6-8ea4-4e0d-b279-32e33a71c1e6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received unexpected event network-vif-unplugged-477ad4f0-2518-4735-b590-76e12526b832 for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.916 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:21 np0005603622 nova_compute[221301]: 2026-01-31 08:56:21.917 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:21.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:22 np0005603622 nova_compute[221301]: 2026-01-31 08:56:22.056 221324 DEBUG oslo_concurrency.processutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:22.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:56:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3921566771' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:56:22 np0005603622 nova_compute[221301]: 2026-01-31 08:56:22.587 221324 DEBUG oslo_concurrency.processutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:22 np0005603622 nova_compute[221301]: 2026-01-31 08:56:22.595 221324 DEBUG nova.compute.provider_tree [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:56:22 np0005603622 nova_compute[221301]: 2026-01-31 08:56:22.631 221324 DEBUG nova.scheduler.client.report [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:56:22 np0005603622 nova_compute[221301]: 2026-01-31 08:56:22.796 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:22 np0005603622 nova_compute[221301]: 2026-01-31 08:56:22.882 221324 INFO nova.scheduler.client.report [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Deleted allocations for instance c4e77744-5254-4a68-8cdf-49fce00d7b32#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.007 221324 DEBUG oslo_concurrency.lockutils [None req-af5df0b6-8ddc-48b1-b629-4e684d91ff60 53804fd0f3a14f95a4955e3bc6dcc8cb ab2d642eb03c4bda84a9a23e86f1fa4d - - default default] Lock "c4e77744-5254-4a68-8cdf-49fce00d7b32" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.544 221324 DEBUG nova.compute.manager [req-36c14d13-7a9b-45ad-a357-b65fffe48f41 req-e18ce3e9-86fd-4265-bacc-b0280053cd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.545 221324 DEBUG oslo_concurrency.lockutils [req-36c14d13-7a9b-45ad-a357-b65fffe48f41 req-e18ce3e9-86fd-4265-bacc-b0280053cd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.545 221324 DEBUG oslo_concurrency.lockutils [req-36c14d13-7a9b-45ad-a357-b65fffe48f41 req-e18ce3e9-86fd-4265-bacc-b0280053cd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.546 221324 DEBUG oslo_concurrency.lockutils [req-36c14d13-7a9b-45ad-a357-b65fffe48f41 req-e18ce3e9-86fd-4265-bacc-b0280053cd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.546 221324 DEBUG nova.compute.manager [req-36c14d13-7a9b-45ad-a357-b65fffe48f41 req-e18ce3e9-86fd-4265-bacc-b0280053cd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] No waiting events found dispatching network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.546 221324 WARNING nova.compute.manager [req-36c14d13-7a9b-45ad-a357-b65fffe48f41 req-e18ce3e9-86fd-4265-bacc-b0280053cd4f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received unexpected event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.613 221324 INFO nova.compute.manager [None req-3f80a8d9-bf2e-4add-8f7e-db326d03561b f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Get console output#033[00m
Jan 31 03:56:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:23.855 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:23 np0005603622 nova_compute[221301]: 2026-01-31 08:56:23.856 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:23 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:23.856 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:56:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:23.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:24 np0005603622 nova_compute[221301]: 2026-01-31 08:56:24.169 221324 INFO nova.compute.manager [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Resuming#033[00m
Jan 31 03:56:24 np0005603622 nova_compute[221301]: 2026-01-31 08:56:24.170 221324 DEBUG nova.objects.instance [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'flavor' on Instance uuid 272230f9-fca3-4297-95c2-904c72ad5988 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:24 np0005603622 nova_compute[221301]: 2026-01-31 08:56:24.260 221324 DEBUG oslo_concurrency.lockutils [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:56:24 np0005603622 nova_compute[221301]: 2026-01-31 08:56:24.261 221324 DEBUG oslo_concurrency.lockutils [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquired lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:56:24 np0005603622 nova_compute[221301]: 2026-01-31 08:56:24.261 221324 DEBUG nova.network.neutron [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:56:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:56:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/104284511' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:56:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:56:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/104284511' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:56:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:24.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:24 np0005603622 nova_compute[221301]: 2026-01-31 08:56:24.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:25 np0005603622 nova_compute[221301]: 2026-01-31 08:56:25.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:25.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:26.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.047 221324 DEBUG nova.network.neutron [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updating instance_info_cache with network_info: [{"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.243 221324 DEBUG oslo_concurrency.lockutils [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Releasing lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.249 221324 DEBUG nova.virt.libvirt.vif [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:55:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1192767468',display_name='tempest-TestNetworkAdvancedServerOps-server-1192767468',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1192767468',id=181,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAh8eNOSEtA28nIoBnpGfGncUlsTJ6dGjCeR78o088HDWvpGieZrHJaQD2ZkpnLaDHq3NsBcKEYQLjURxwKqNK7xwxG7upclFXm84aV7n3hI2gb38w18glCQIaEF/q/95A==',key_name='tempest-TestNetworkAdvancedServerOps-1417118072',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:55:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-gakt29bk',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:56:19Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=272230f9-fca3-4297-95c2-904c72ad5988,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.250 221324 DEBUG nova.network.os_vif_util [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.250 221324 DEBUG nova.network.os_vif_util [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.251 221324 DEBUG os_vif [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.251 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.251 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.252 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.254 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.254 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap477ad4f0-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.254 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap477ad4f0-25, col_values=(('external_ids', {'iface-id': '477ad4f0-2518-4735-b590-76e12526b832', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:74:a3', 'vm-uuid': '272230f9-fca3-4297-95c2-904c72ad5988'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.254 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.255 221324 INFO os_vif [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25')#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.280 221324 DEBUG nova.objects.instance [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'numa_topology' on Instance uuid 272230f9-fca3-4297-95c2-904c72ad5988 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:27 np0005603622 kernel: tap477ad4f0-25: entered promiscuous mode
Jan 31 03:56:27 np0005603622 NetworkManager[49080]: <info>  [1769849787.4687] manager: (tap477ad4f0-25): new Tun device (/org/freedesktop/NetworkManager/Devices/383)
Jan 31 03:56:27 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:27Z|00865|binding|INFO|Claiming lport 477ad4f0-2518-4735-b590-76e12526b832 for this chassis.
Jan 31 03:56:27 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:27Z|00866|binding|INFO|477ad4f0-2518-4735-b590-76e12526b832: Claiming fa:16:3e:20:74:a3 10.100.0.10
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.469 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:27 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:27Z|00867|binding|INFO|Removing lport 477ad4f0-2518-4735-b590-76e12526b832 ovn-installed in OVS
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:27 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:27Z|00868|binding|INFO|Setting lport 477ad4f0-2518-4735-b590-76e12526b832 ovn-installed in OVS
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.477 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:27 np0005603622 nova_compute[221301]: 2026-01-31 08:56:27.482 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:27 np0005603622 systemd-udevd[295739]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:56:27 np0005603622 systemd-machined[190406]: New machine qemu-90-instance-000000b5.
Jan 31 03:56:27 np0005603622 NetworkManager[49080]: <info>  [1769849787.5237] device (tap477ad4f0-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:56:27 np0005603622 NetworkManager[49080]: <info>  [1769849787.5244] device (tap477ad4f0-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:56:27 np0005603622 systemd[1]: Started Virtual Machine qemu-90-instance-000000b5.
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.768 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:74:a3 10.100.0.10'], port_security=['fa:16:3e:20:74:a3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '272230f9-fca3-4297-95c2-904c72ad5988', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7ea26f6f-6cf1-4f02-9a47-cac3e89cac48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5925fbf-71fd-4427-a43f-27f10b46d74c, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=477ad4f0-2518-4735-b590-76e12526b832) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:27 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:27Z|00869|binding|INFO|Setting lport 477ad4f0-2518-4735-b590-76e12526b832 up in Southbound
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.770 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 477ad4f0-2518-4735-b590-76e12526b832 in datapath cc3d0342-b89d-4e6c-9aa7-8542822eeddd bound to our chassis#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.772 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc3d0342-b89d-4e6c-9aa7-8542822eeddd#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.780 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[69fefd4c-86dd-4641-beed-9a7483eee23c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.781 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc3d0342-b1 in ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.784 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc3d0342-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.784 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74b56e75-8069-4942-a6d7-c5c58f3584cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.785 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe450f5-01fc-486c-880a-d6cb84f16ebc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.794 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[93bc47c9-a279-4bb6-8374-6507fff62beb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.806 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b758f63f-1409-440b-ba74-c36fe9848788]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.830 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1bfd19-a3fd-43ea-a47f-569470b7dd12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.836 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f8e751-8c12-4f7c-a287-d00746c68b18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 systemd-udevd[295742]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:56:27 np0005603622 NetworkManager[49080]: <info>  [1769849787.8397] manager: (tapcc3d0342-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/384)
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.866 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[73a6f88b-5a64-491c-8ef6-4cf364b28946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.876 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1c03abbf-ab3e-4c92-be41-3c0d824f29d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 NetworkManager[49080]: <info>  [1769849787.8929] device (tapcc3d0342-b0): carrier: link connected
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.898 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a81a736c-df0c-4c59-bdfb-752952665346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.909 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5776cb47-4252-45b4-859b-d6a743e5985d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3d0342-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:23:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903923, 'reachable_time': 43853, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295807, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.926 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[69cab436-621b-4bbe-aaf1-6eaf49117886]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:2369'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 903923, 'tstamp': 903923}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295810, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.938 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[16a33003-a7a6-4539-afb9-397c6211096f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3d0342-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:23:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903923, 'reachable_time': 43853, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 295812, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:27.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:27.971 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[177e726b-587c-4bd3-bc4e-91e69d8f9281]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.023 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[947cb324-c82f-4904-8f95-87f87359ffe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.028 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3d0342-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.028 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.028 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc3d0342-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.031 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:28 np0005603622 NetworkManager[49080]: <info>  [1769849788.0315] manager: (tapcc3d0342-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Jan 31 03:56:28 np0005603622 kernel: tapcc3d0342-b0: entered promiscuous mode
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.034 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.035 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc3d0342-b0, col_values=(('external_ids', {'iface-id': '3fd8b9be-ff5e-4117-a28f-60f54d1fb938'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:28 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:28Z|00870|binding|INFO|Releasing lport 3fd8b9be-ff5e-4117-a28f-60f54d1fb938 from this chassis (sb_readonly=0)
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.041 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.042 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.043 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39d8f328-ebe6-4074-9178-a6f12a0db3af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.044 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-cc3d0342-b89d-4e6c-9aa7-8542822eeddd
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.pid.haproxy
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID cc3d0342-b89d-4e6c-9aa7-8542822eeddd
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:56:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:28.045 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'env', 'PROCESS_TAG=haproxy-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc3d0342-b89d-4e6c-9aa7-8542822eeddd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.103 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 272230f9-fca3-4297-95c2-904c72ad5988 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.103 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849788.102518, 272230f9-fca3-4297-95c2-904c72ad5988 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.104 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] VM Started (Lifecycle Event)#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.130 221324 DEBUG nova.compute.manager [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.130 221324 DEBUG nova.objects.instance [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'pci_devices' on Instance uuid 272230f9-fca3-4297-95c2-904c72ad5988 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.231 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.234 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.263 221324 INFO nova.virt.libvirt.driver [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Instance running successfully.#033[00m
Jan 31 03:56:28 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.265 221324 DEBUG nova.virt.libvirt.guest [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.265 221324 DEBUG nova.compute.manager [None req-8448379a-86da-4e99-b27e-de3f070a4a31 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.376 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 31 03:56:28 np0005603622 podman[295848]: 2026-01-31 08:56:28.376834354 +0000 UTC m=+0.050658558 container create 341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.377 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849788.105392, 272230f9-fca3-4297-95c2-904c72ad5988 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.377 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:56:28 np0005603622 systemd[1]: Started libpod-conmon-341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144.scope.
Jan 31 03:56:28 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:56:28 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82baaa3e682fab7573c01c13eb3852316b003a89133cd739f966ff6dd4ea0e04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:56:28 np0005603622 podman[295848]: 2026-01-31 08:56:28.346962364 +0000 UTC m=+0.020786598 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:56:28 np0005603622 podman[295848]: 2026-01-31 08:56:28.444219747 +0000 UTC m=+0.118043961 container init 341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:56:28 np0005603622 podman[295848]: 2026-01-31 08:56:28.448147423 +0000 UTC m=+0.121971617 container start 341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 03:56:28 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295863]: [NOTICE]   (295867) : New worker (295869) forked
Jan 31 03:56:28 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295863]: [NOTICE]   (295867) : Loading success.
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.534 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:28 np0005603622 nova_compute[221301]: 2026-01-31 08:56:28.538 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:56:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:28.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:29 np0005603622 nova_compute[221301]: 2026-01-31 08:56:29.450 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849774.447725, c4e77744-5254-4a68-8cdf-49fce00d7b32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:56:29 np0005603622 nova_compute[221301]: 2026-01-31 08:56:29.451 221324 INFO nova.compute.manager [-] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:56:29 np0005603622 nova_compute[221301]: 2026-01-31 08:56:29.503 221324 DEBUG nova.compute.manager [None req-1499d7bc-8c96-461f-be4e-e5a50d5eb9fb - - - - - -] [instance: c4e77744-5254-4a68-8cdf-49fce00d7b32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:56:29 np0005603622 nova_compute[221301]: 2026-01-31 08:56:29.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:29.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:30 np0005603622 nova_compute[221301]: 2026-01-31 08:56:30.169 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:30 np0005603622 nova_compute[221301]: 2026-01-31 08:56:30.512 221324 DEBUG nova.compute.manager [req-b8008870-6129-4f66-858f-7fe8b2b3f10b req-89f4ab17-d3fe-4b58-9977-294c576e3ae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:30 np0005603622 nova_compute[221301]: 2026-01-31 08:56:30.512 221324 DEBUG oslo_concurrency.lockutils [req-b8008870-6129-4f66-858f-7fe8b2b3f10b req-89f4ab17-d3fe-4b58-9977-294c576e3ae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:30 np0005603622 nova_compute[221301]: 2026-01-31 08:56:30.513 221324 DEBUG oslo_concurrency.lockutils [req-b8008870-6129-4f66-858f-7fe8b2b3f10b req-89f4ab17-d3fe-4b58-9977-294c576e3ae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:30 np0005603622 nova_compute[221301]: 2026-01-31 08:56:30.513 221324 DEBUG oslo_concurrency.lockutils [req-b8008870-6129-4f66-858f-7fe8b2b3f10b req-89f4ab17-d3fe-4b58-9977-294c576e3ae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:30 np0005603622 nova_compute[221301]: 2026-01-31 08:56:30.514 221324 DEBUG nova.compute.manager [req-b8008870-6129-4f66-858f-7fe8b2b3f10b req-89f4ab17-d3fe-4b58-9977-294c576e3ae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] No waiting events found dispatching network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:56:30 np0005603622 nova_compute[221301]: 2026-01-31 08:56:30.514 221324 WARNING nova.compute.manager [req-b8008870-6129-4f66-858f-7fe8b2b3f10b req-89f4ab17-d3fe-4b58-9977-294c576e3ae0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received unexpected event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:56:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:30.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:30.858 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:30.914 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:30.915 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:30.916 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:31.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:32 np0005603622 nova_compute[221301]: 2026-01-31 08:56:32.170 221324 INFO nova.compute.manager [None req-e564d243-85e9-4201-a51a-f7074e7b3da6 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Get console output#033[00m
Jan 31 03:56:32 np0005603622 nova_compute[221301]: 2026-01-31 08:56:32.176 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 03:56:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:32.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:33 np0005603622 nova_compute[221301]: 2026-01-31 08:56:33.533 221324 DEBUG nova.compute.manager [req-2a4fc627-607d-4edd-bf5c-3b3b4f1af680 req-6baff581-8547-4f52-9b68-7f94d7cfd471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:33 np0005603622 nova_compute[221301]: 2026-01-31 08:56:33.533 221324 DEBUG oslo_concurrency.lockutils [req-2a4fc627-607d-4edd-bf5c-3b3b4f1af680 req-6baff581-8547-4f52-9b68-7f94d7cfd471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:33 np0005603622 nova_compute[221301]: 2026-01-31 08:56:33.533 221324 DEBUG oslo_concurrency.lockutils [req-2a4fc627-607d-4edd-bf5c-3b3b4f1af680 req-6baff581-8547-4f52-9b68-7f94d7cfd471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:33 np0005603622 nova_compute[221301]: 2026-01-31 08:56:33.533 221324 DEBUG oslo_concurrency.lockutils [req-2a4fc627-607d-4edd-bf5c-3b3b4f1af680 req-6baff581-8547-4f52-9b68-7f94d7cfd471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:33 np0005603622 nova_compute[221301]: 2026-01-31 08:56:33.533 221324 DEBUG nova.compute.manager [req-2a4fc627-607d-4edd-bf5c-3b3b4f1af680 req-6baff581-8547-4f52-9b68-7f94d7cfd471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] No waiting events found dispatching network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:56:33 np0005603622 nova_compute[221301]: 2026-01-31 08:56:33.534 221324 WARNING nova.compute.manager [req-2a4fc627-607d-4edd-bf5c-3b3b4f1af680 req-6baff581-8547-4f52-9b68-7f94d7cfd471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received unexpected event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:56:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:33.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:34.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.758 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.759 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.761 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.761 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.762 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.763 221324 INFO nova.compute.manager [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Terminating instance#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.765 221324 DEBUG nova.compute.manager [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.816 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:34 np0005603622 kernel: tap477ad4f0-25 (unregistering): left promiscuous mode
Jan 31 03:56:34 np0005603622 NetworkManager[49080]: <info>  [1769849794.8611] device (tap477ad4f0-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:56:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:34Z|00871|binding|INFO|Releasing lport 477ad4f0-2518-4735-b590-76e12526b832 from this chassis (sb_readonly=0)
Jan 31 03:56:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:34Z|00872|binding|INFO|Setting lport 477ad4f0-2518-4735-b590-76e12526b832 down in Southbound
Jan 31 03:56:34 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:34Z|00873|binding|INFO|Removing iface tap477ad4f0-25 ovn-installed in OVS
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.871 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.887 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:34 np0005603622 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Jan 31 03:56:34 np0005603622 systemd-machined[190406]: Machine qemu-90-instance-000000b5 terminated.
Jan 31 03:56:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:34.939 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:74:a3 10.100.0.10'], port_security=['fa:16:3e:20:74:a3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '272230f9-fca3-4297-95c2-904c72ad5988', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0bfe11bd9d694684b527666e2c378eed', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7ea26f6f-6cf1-4f02-9a47-cac3e89cac48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5925fbf-71fd-4427-a43f-27f10b46d74c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=477ad4f0-2518-4735-b590-76e12526b832) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:56:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:34.940 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 477ad4f0-2518-4735-b590-76e12526b832 in datapath cc3d0342-b89d-4e6c-9aa7-8542822eeddd unbound from our chassis#033[00m
Jan 31 03:56:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:34.941 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3d0342-b89d-4e6c-9aa7-8542822eeddd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:56:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:34.942 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[23ff8f67-431d-4013-b692-307bd7641f6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:34.943 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd namespace which is not needed anymore#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.995 221324 INFO nova.virt.libvirt.driver [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Instance destroyed successfully.#033[00m
Jan 31 03:56:34 np0005603622 nova_compute[221301]: 2026-01-31 08:56:34.996 221324 DEBUG nova.objects.instance [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lazy-loading 'resources' on Instance uuid 272230f9-fca3-4297-95c2-904c72ad5988 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.021 221324 DEBUG nova.virt.libvirt.vif [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:55:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1192767468',display_name='tempest-TestNetworkAdvancedServerOps-server-1192767468',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1192767468',id=181,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAh8eNOSEtA28nIoBnpGfGncUlsTJ6dGjCeR78o088HDWvpGieZrHJaQD2ZkpnLaDHq3NsBcKEYQLjURxwKqNK7xwxG7upclFXm84aV7n3hI2gb38w18glCQIaEF/q/95A==',key_name='tempest-TestNetworkAdvancedServerOps-1417118072',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:55:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0bfe11bd9d694684b527666e2c378eed',ramdisk_id='',reservation_id='r-gakt29bk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-840410497',owner_user_name='tempest-TestNetworkAdvancedServerOps-840410497-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:56:28Z,user_data=None,user_id='f1c6e7eff11b435a81429826a682b32f',uuid=272230f9-fca3-4297-95c2-904c72ad5988,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.021 221324 DEBUG nova.network.os_vif_util [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converting VIF {"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.022 221324 DEBUG nova.network.os_vif_util [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.022 221324 DEBUG os_vif [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.023 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.024 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap477ad4f0-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.025 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.026 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.029 221324 INFO os_vif [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:74:a3,bridge_name='br-int',has_traffic_filtering=True,id=477ad4f0-2518-4735-b590-76e12526b832,network=Network(cc3d0342-b89d-4e6c-9aa7-8542822eeddd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap477ad4f0-25')#033[00m
Jan 31 03:56:35 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295863]: [NOTICE]   (295867) : haproxy version is 2.8.14-c23fe91
Jan 31 03:56:35 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295863]: [NOTICE]   (295867) : path to executable is /usr/sbin/haproxy
Jan 31 03:56:35 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295863]: [WARNING]  (295867) : Exiting Master process...
Jan 31 03:56:35 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295863]: [ALERT]    (295867) : Current worker (295869) exited with code 143 (Terminated)
Jan 31 03:56:35 np0005603622 neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd[295863]: [WARNING]  (295867) : All workers exited. Exiting... (0)
Jan 31 03:56:35 np0005603622 systemd[1]: libpod-341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144.scope: Deactivated successfully.
Jan 31 03:56:35 np0005603622 conmon[295863]: conmon 341d93fe51548a3434f2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144.scope/container/memory.events
Jan 31 03:56:35 np0005603622 podman[295914]: 2026-01-31 08:56:35.061348791 +0000 UTC m=+0.044777219 container died 341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:56:35 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144-userdata-shm.mount: Deactivated successfully.
Jan 31 03:56:35 np0005603622 systemd[1]: var-lib-containers-storage-overlay-82baaa3e682fab7573c01c13eb3852316b003a89133cd739f966ff6dd4ea0e04-merged.mount: Deactivated successfully.
Jan 31 03:56:35 np0005603622 podman[295914]: 2026-01-31 08:56:35.105498573 +0000 UTC m=+0.088926991 container cleanup 341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 03:56:35 np0005603622 systemd[1]: libpod-conmon-341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144.scope: Deactivated successfully.
Jan 31 03:56:35 np0005603622 podman[295963]: 2026-01-31 08:56:35.158614545 +0000 UTC m=+0.038521552 container remove 341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.162 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[88adcd05-b43f-4ee7-b051-9b674c4e8caa]: (4, ('Sat Jan 31 08:56:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd (341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144)\n341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144\nSat Jan 31 08:56:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd (341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144)\n341d93fe51548a3434f2eed3ec6a990f60c602b30cb6cf0430a80791fadb6144\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.164 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b77ea47d-76e7-4900-b71c-f72ab8ca2910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.165 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3d0342-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.167 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:35 np0005603622 kernel: tapcc3d0342-b0: left promiscuous mode
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.174 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.177 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bec37134-11d5-40f2-a79b-87b921c36bd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.192 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9a50fbd2-7704-49e5-9570-6c34a8ed9c73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.193 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf149f1-a708-4a72-b941-49116242c282]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.205 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39b2c76d-0364-453e-bb7d-5dc265476c50]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903916, 'reachable_time': 16423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295977, 'error': None, 'target': 'ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.208 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc3d0342-b89d-4e6c-9aa7-8542822eeddd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:56:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:56:35.208 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd0239b-c957-4a47-9f87-275b6215e7ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:56:35 np0005603622 systemd[1]: run-netns-ovnmeta\x2dcc3d0342\x2db89d\x2d4e6c\x2d9aa7\x2d8542822eeddd.mount: Deactivated successfully.
Jan 31 03:56:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:56:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/880902348' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.496 221324 INFO nova.virt.libvirt.driver [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Deleting instance files /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988_del#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.497 221324 INFO nova.virt.libvirt.driver [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Deletion of /var/lib/nova/instances/272230f9-fca3-4297-95c2-904c72ad5988_del complete#033[00m
Jan 31 03:56:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.679 221324 INFO nova.compute.manager [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.680 221324 DEBUG oslo.service.loopingcall [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.680 221324 DEBUG nova.compute.manager [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.681 221324 DEBUG nova.network.neutron [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.695 221324 DEBUG nova.compute.manager [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-changed-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.696 221324 DEBUG nova.compute.manager [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Refreshing instance network info cache due to event network-changed-477ad4f0-2518-4735-b590-76e12526b832. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.696 221324 DEBUG oslo_concurrency.lockutils [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.696 221324 DEBUG oslo_concurrency.lockutils [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:56:35 np0005603622 nova_compute[221301]: 2026-01-31 08:56:35.696 221324 DEBUG nova.network.neutron [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Refreshing network info cache for port 477ad4f0-2518-4735-b590-76e12526b832 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:56:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:35.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:36.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:36 np0005603622 nova_compute[221301]: 2026-01-31 08:56:36.860 221324 DEBUG nova.network.neutron [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:56:36 np0005603622 nova_compute[221301]: 2026-01-31 08:56:36.972 221324 INFO nova.compute.manager [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Took 1.29 seconds to deallocate network for instance.#033[00m
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.114 221324 DEBUG nova.compute.manager [req-014222b3-18cd-47a6-87d5-bdb2bd8bc4bd req-60606eaa-2797-413c-a9d5-2af2a5ec7409 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-deleted-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.224 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.225 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.330 221324 DEBUG oslo_concurrency.processutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:56:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:56:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3584579284' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.733 221324 DEBUG oslo_concurrency.processutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.740 221324 DEBUG nova.compute.provider_tree [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.849 221324 DEBUG nova.scheduler.client.report [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.871 221324 DEBUG nova.compute.manager [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-unplugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.871 221324 DEBUG oslo_concurrency.lockutils [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.872 221324 DEBUG oslo_concurrency.lockutils [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.872 221324 DEBUG oslo_concurrency.lockutils [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.872 221324 DEBUG nova.compute.manager [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] No waiting events found dispatching network-vif-unplugged-477ad4f0-2518-4735-b590-76e12526b832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.872 221324 WARNING nova.compute.manager [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received unexpected event network-vif-unplugged-477ad4f0-2518-4735-b590-76e12526b832 for instance with vm_state deleted and task_state None.
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.872 221324 DEBUG nova.compute.manager [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.873 221324 DEBUG oslo_concurrency.lockutils [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "272230f9-fca3-4297-95c2-904c72ad5988-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.873 221324 DEBUG oslo_concurrency.lockutils [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.873 221324 DEBUG oslo_concurrency.lockutils [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.873 221324 DEBUG nova.compute.manager [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] No waiting events found dispatching network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.873 221324 WARNING nova.compute.manager [req-af8d7e3c-546f-458f-9af3-1a60bb801349 req-67528122-e084-407d-bf98-4aa0cc13b956 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Received unexpected event network-vif-plugged-477ad4f0-2518-4735-b590-76e12526b832 for instance with vm_state deleted and task_state None.
Jan 31 03:56:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:37.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:37 np0005603622 nova_compute[221301]: 2026-01-31 08:56:37.975 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:38 np0005603622 nova_compute[221301]: 2026-01-31 08:56:38.168 221324 INFO nova.scheduler.client.report [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Deleted allocations for instance 272230f9-fca3-4297-95c2-904c72ad5988
Jan 31 03:56:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:38.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:38 np0005603622 nova_compute[221301]: 2026-01-31 08:56:38.589 221324 DEBUG nova.network.neutron [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updated VIF entry in instance network info cache for port 477ad4f0-2518-4735-b590-76e12526b832. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:56:38 np0005603622 nova_compute[221301]: 2026-01-31 08:56:38.589 221324 DEBUG nova.network.neutron [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Updating instance_info_cache with network_info: [{"id": "477ad4f0-2518-4735-b590-76e12526b832", "address": "fa:16:3e:20:74:a3", "network": {"id": "cc3d0342-b89d-4e6c-9aa7-8542822eeddd", "bridge": "br-int", "label": "tempest-network-smoke--357612236", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0bfe11bd9d694684b527666e2c378eed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap477ad4f0-25", "ovs_interfaceid": "477ad4f0-2518-4735-b590-76e12526b832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:56:39 np0005603622 nova_compute[221301]: 2026-01-31 08:56:39.208 221324 DEBUG oslo_concurrency.lockutils [None req-1ab385a7-7e5d-4d30-bd23-c582474807d8 f1c6e7eff11b435a81429826a682b32f 0bfe11bd9d694684b527666e2c378eed - - default default] Lock "272230f9-fca3-4297-95c2-904c72ad5988" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:56:39 np0005603622 nova_compute[221301]: 2026-01-31 08:56:39.218 221324 DEBUG oslo_concurrency.lockutils [req-010635d1-c7b0-4fad-aa2c-c67ba970934e req-3861469e-a6ce-4c68-a09e-6395fc4062d0 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-272230f9-fca3-4297-95c2-904c72ad5988" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:56:39 np0005603622 nova_compute[221301]: 2026-01-31 08:56:39.561 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:56:39 np0005603622 nova_compute[221301]: 2026-01-31 08:56:39.562 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 31 03:56:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:39.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:40 np0005603622 nova_compute[221301]: 2026-01-31 08:56:40.027 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:40 np0005603622 nova_compute[221301]: 2026-01-31 08:56:40.178 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:40.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:41.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:42.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:43.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:44.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:45 np0005603622 nova_compute[221301]: 2026-01-31 08:56:45.031 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:45 np0005603622 podman[296001]: 2026-01-31 08:56:45.155153929 +0000 UTC m=+0.081273767 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:56:45 np0005603622 podman[296002]: 2026-01-31 08:56:45.15516829 +0000 UTC m=+0.080257090 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 03:56:45 np0005603622 nova_compute[221301]: 2026-01-31 08:56:45.178 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:45.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:46Z|00874|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:56:46 np0005603622 nova_compute[221301]: 2026-01-31 08:56:46.420 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:46 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:46Z|00875|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:56:46 np0005603622 nova_compute[221301]: 2026-01-31 08:56:46.504 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:46.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:47 np0005603622 nova_compute[221301]: 2026-01-31 08:56:47.127 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:47 np0005603622 NetworkManager[49080]: <info>  [1769849807.1279] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Jan 31 03:56:47 np0005603622 NetworkManager[49080]: <info>  [1769849807.1287] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/387)
Jan 31 03:56:47 np0005603622 nova_compute[221301]: 2026-01-31 08:56:47.199 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:47 np0005603622 ovn_controller[130109]: 2026-01-31T08:56:47Z|00876|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:56:47 np0005603622 nova_compute[221301]: 2026-01-31 08:56:47.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:56:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:47.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:56:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:48.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:49.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:49 np0005603622 nova_compute[221301]: 2026-01-31 08:56:49.995 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849794.9941444, 272230f9-fca3-4297-95c2-904c72ad5988 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:56:49 np0005603622 nova_compute[221301]: 2026-01-31 08:56:49.995 221324 INFO nova.compute.manager [-] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] VM Stopped (Lifecycle Event)
Jan 31 03:56:50 np0005603622 nova_compute[221301]: 2026-01-31 08:56:50.035 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:50 np0005603622 nova_compute[221301]: 2026-01-31 08:56:50.056 221324 DEBUG nova.compute.manager [None req-66cf390d-6272-49bb-97fb-664426e01d92 - - - - - -] [instance: 272230f9-fca3-4297-95c2-904c72ad5988] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:56:50 np0005603622 nova_compute[221301]: 2026-01-31 08:56:50.181 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:50.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:51.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:52.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:53.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:54 np0005603622 nova_compute[221301]: 2026-01-31 08:56:54.545 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 03:56:54 np0005603622 nova_compute[221301]: 2026-01-31 08:56:54.545 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 31 03:56:54 np0005603622 nova_compute[221301]: 2026-01-31 08:56:54.582 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 31 03:56:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:54.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:55 np0005603622 nova_compute[221301]: 2026-01-31 08:56:55.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:55 np0005603622 nova_compute[221301]: 2026-01-31 08:56:55.183 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:56:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:56:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:56:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:56.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:56:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:57.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:56:58.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:56:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:56:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:56:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:56:59.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:00 np0005603622 nova_compute[221301]: 2026-01-31 08:57:00.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:00 np0005603622 nova_compute[221301]: 2026-01-31 08:57:00.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:00.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:01.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:02.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:03.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:04.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:05 np0005603622 nova_compute[221301]: 2026-01-31 08:57:05.047 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:05 np0005603622 nova_compute[221301]: 2026-01-31 08:57:05.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:05.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:06 np0005603622 nova_compute[221301]: 2026-01-31 08:57:06.269 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:06.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:07 np0005603622 nova_compute[221301]: 2026-01-31 08:57:07.434 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:07 np0005603622 nova_compute[221301]: 2026-01-31 08:57:07.434 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:57:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:08.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:08 np0005603622 nova_compute[221301]: 2026-01-31 08:57:08.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:08.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:09 np0005603622 nova_compute[221301]: 2026-01-31 08:57:09.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:10.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:10 np0005603622 nova_compute[221301]: 2026-01-31 08:57:10.051 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:10 np0005603622 nova_compute[221301]: 2026-01-31 08:57:10.188 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:57:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:10.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:57:11 np0005603622 nova_compute[221301]: 2026-01-31 08:57:11.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:11 np0005603622 nova_compute[221301]: 2026-01-31 08:57:11.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:11 np0005603622 nova_compute[221301]: 2026-01-31 08:57:11.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:57:11 np0005603622 nova_compute[221301]: 2026-01-31 08:57:11.750 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:11 np0005603622 nova_compute[221301]: 2026-01-31 08:57:11.750 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:11 np0005603622 nova_compute[221301]: 2026-01-31 08:57:11.751 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:57:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:12.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:57:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.265 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.518 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:14.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.650 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.651 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.651 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.652 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.680 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.680 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.681 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.681 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:57:14 np0005603622 nova_compute[221301]: 2026-01-31 08:57:14.682 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:15 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4161333691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.131 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.203 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.355 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.355 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000b4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.519 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.520 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4022MB free_disk=20.896812438964844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.521 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.521 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.603 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 13eea30b-7972-4da3-a3dd-8d068b401d22 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.603 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.604 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:57:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.624 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.651 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.652 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.693 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 03:57:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:57:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:57:15 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.727 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 03:57:15 np0005603622 nova_compute[221301]: 2026-01-31 08:57:15.776 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:16 np0005603622 podman[296219]: 2026-01-31 08:57:16.15370382 +0000 UTC m=+0.076552131 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:57:16 np0005603622 podman[296220]: 2026-01-31 08:57:16.154196153 +0000 UTC m=+0.075438091 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 03:57:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2055147154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:16 np0005603622 nova_compute[221301]: 2026-01-31 08:57:16.364 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:16 np0005603622 nova_compute[221301]: 2026-01-31 08:57:16.370 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:57:16 np0005603622 nova_compute[221301]: 2026-01-31 08:57:16.395 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:57:16 np0005603622 nova_compute[221301]: 2026-01-31 08:57:16.420 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:57:16 np0005603622 nova_compute[221301]: 2026-01-31 08:57:16.420 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:16.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:57:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:18.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:57:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:18.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:19 np0005603622 nova_compute[221301]: 2026-01-31 08:57:19.165 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:20.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:20 np0005603622 nova_compute[221301]: 2026-01-31 08:57:20.061 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:20 np0005603622 nova_compute[221301]: 2026-01-31 08:57:20.206 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:20 np0005603622 nova_compute[221301]: 2026-01-31 08:57:20.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:57:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:57:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:20.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:20 np0005603622 nova_compute[221301]: 2026-01-31 08:57:20.977 221324 DEBUG oslo_concurrency.lockutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:20 np0005603622 nova_compute[221301]: 2026-01-31 08:57:20.978 221324 DEBUG oslo_concurrency.lockutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.016 221324 DEBUG nova.objects.instance [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.081 221324 DEBUG oslo_concurrency.lockutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.484 221324 DEBUG oslo_concurrency.lockutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.485 221324 DEBUG oslo_concurrency.lockutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.486 221324 INFO nova.compute.manager [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Attaching volume a91dfbd1-af86-4b71-b911-f9af43fa9cd7 to /dev/vdb#033[00m
Jan 31 03:57:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:21Z|00877|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.541 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:21 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:21Z|00878|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.755 221324 DEBUG os_brick.utils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.757 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.767 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.767 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd65649-c658-4d8d-b02a-6a9c06f1739b]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.769 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.775 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.776 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[953a1826-a85e-474e-bf33-b407c7c28ce0]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.777 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.786 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.787 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[df686588-85ff-49ea-a40c-5465ece83dac]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.789 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[4326897c-c4e4-40bc-9552-f2dee1a177f1]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.790 221324 DEBUG oslo_concurrency.processutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.811 221324 DEBUG oslo_concurrency.processutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.813 221324 DEBUG os_brick.initiator.connectors.lightos [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.813 221324 DEBUG os_brick.initiator.connectors.lightos [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.814 221324 DEBUG os_brick.initiator.connectors.lightos [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.814 221324 DEBUG os_brick.utils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] <== get_connector_properties: return (57ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:57:21 np0005603622 nova_compute[221301]: 2026-01-31 08:57:21.814 221324 DEBUG nova.virt.block_device [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating existing volume attachment record: aa1d962e-7bc0-433c-97e5-8f3d078d07b8 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:57:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:22.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:22.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:22 np0005603622 nova_compute[221301]: 2026-01-31 08:57:22.834 221324 DEBUG nova.objects.instance [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:22 np0005603622 nova_compute[221301]: 2026-01-31 08:57:22.874 221324 DEBUG nova.virt.libvirt.driver [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Attempting to attach volume a91dfbd1-af86-4b71-b911-f9af43fa9cd7 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:57:22 np0005603622 nova_compute[221301]: 2026-01-31 08:57:22.879 221324 DEBUG nova.virt.libvirt.guest [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:57:22 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-a91dfbd1-af86-4b71-b911-f9af43fa9cd7">
Jan 31 03:57:22 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:57:22 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:22 np0005603622 nova_compute[221301]:  <serial>a91dfbd1-af86-4b71-b911-f9af43fa9cd7</serial>
Jan 31 03:57:22 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:57:22 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:57:23 np0005603622 nova_compute[221301]: 2026-01-31 08:57:23.071 221324 DEBUG nova.virt.libvirt.driver [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:23 np0005603622 nova_compute[221301]: 2026-01-31 08:57:23.071 221324 DEBUG nova.virt.libvirt.driver [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:23 np0005603622 nova_compute[221301]: 2026-01-31 08:57:23.072 221324 DEBUG nova.virt.libvirt.driver [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:23 np0005603622 nova_compute[221301]: 2026-01-31 08:57:23.072 221324 DEBUG nova.virt.libvirt.driver [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No VIF found with MAC fa:16:3e:6d:3c:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:57:23 np0005603622 nova_compute[221301]: 2026-01-31 08:57:23.519 221324 DEBUG oslo_concurrency.lockutils [None req-22732b7e-6ce5-44e0-a7e8-27b24986bfce cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:24.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:24 np0005603622 nova_compute[221301]: 2026-01-31 08:57:24.489 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:24.491 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:57:24 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:24.492 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:57:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:24.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:24 np0005603622 nova_compute[221301]: 2026-01-31 08:57:24.652 221324 DEBUG oslo_concurrency.lockutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:24 np0005603622 nova_compute[221301]: 2026-01-31 08:57:24.652 221324 DEBUG oslo_concurrency.lockutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:24 np0005603622 nova_compute[221301]: 2026-01-31 08:57:24.671 221324 DEBUG nova.objects.instance [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:24 np0005603622 nova_compute[221301]: 2026-01-31 08:57:24.720 221324 DEBUG oslo_concurrency.lockutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.064 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.208 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.262 221324 DEBUG oslo_concurrency.lockutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.263 221324 DEBUG oslo_concurrency.lockutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.263 221324 INFO nova.compute.manager [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Attaching volume 7ad3b3fb-6194-4162-90c8-c3ac9c4017d9 to /dev/vdc#033[00m
Jan 31 03:57:25 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:25.495 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.594 221324 DEBUG os_brick.utils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.595 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.604 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.604 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[b68f89c7-b3e6-49ee-9933-2ada47f31017]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.606 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.612 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.612 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[7bf38a5e-13cc-4173-b9c9-00352caca3f2]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.614 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.619 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.005s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.619 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[13d95231-43ee-45c3-b770-8975781d6c74]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.621 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2a1418-1cbd-47c0-a1a7-fd821ef23b10]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.621 221324 DEBUG oslo_concurrency.processutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.648 221324 DEBUG oslo_concurrency.processutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.652 221324 DEBUG os_brick.initiator.connectors.lightos [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.653 221324 DEBUG os_brick.initiator.connectors.lightos [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.653 221324 DEBUG os_brick.initiator.connectors.lightos [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.654 221324 DEBUG os_brick.utils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] <== get_connector_properties: return (58ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 03:57:25 np0005603622 nova_compute[221301]: 2026-01-31 08:57:25.655 221324 DEBUG nova.virt.block_device [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating existing volume attachment record: 925cdaee-f41e-4e28-b342-9bac89a792d6 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 03:57:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:26.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1980596687' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.637 221324 DEBUG nova.objects.instance [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:26.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.666 221324 DEBUG nova.virt.libvirt.driver [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Attempting to attach volume 7ad3b3fb-6194-4162-90c8-c3ac9c4017d9 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.669 221324 DEBUG nova.virt.libvirt.guest [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 03:57:26 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-7ad3b3fb-6194-4162-90c8-c3ac9c4017d9">
Jan 31 03:57:26 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 03:57:26 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:  </auth>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:26 np0005603622 nova_compute[221301]:  <serial>7ad3b3fb-6194-4162-90c8-c3ac9c4017d9</serial>
Jan 31 03:57:26 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:57:26 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.787 221324 DEBUG nova.virt.libvirt.driver [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.788 221324 DEBUG nova.virt.libvirt.driver [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.788 221324 DEBUG nova.virt.libvirt.driver [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.788 221324 DEBUG nova.virt.libvirt.driver [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:26 np0005603622 nova_compute[221301]: 2026-01-31 08:57:26.788 221324 DEBUG nova.virt.libvirt.driver [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] No VIF found with MAC fa:16:3e:6d:3c:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:57:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/577050675' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:27 np0005603622 nova_compute[221301]: 2026-01-31 08:57:27.155 221324 DEBUG oslo_concurrency.lockutils [None req-4944b110-a276-4c59-bdd1-8c49c47c0256 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:28.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:28.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:29 np0005603622 NetworkManager[49080]: <info>  [1769849849.7580] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Jan 31 03:57:29 np0005603622 NetworkManager[49080]: <info>  [1769849849.7590] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Jan 31 03:57:29 np0005603622 nova_compute[221301]: 2026-01-31 08:57:29.757 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:29 np0005603622 nova_compute[221301]: 2026-01-31 08:57:29.828 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:29 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:29Z|00879|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:29 np0005603622 nova_compute[221301]: 2026-01-31 08:57:29.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:57:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:30.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:57:30 np0005603622 nova_compute[221301]: 2026-01-31 08:57:30.065 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:30 np0005603622 nova_compute[221301]: 2026-01-31 08:57:30.209 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:30 np0005603622 nova_compute[221301]: 2026-01-31 08:57:30.346 221324 DEBUG nova.compute.manager [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:30 np0005603622 nova_compute[221301]: 2026-01-31 08:57:30.347 221324 DEBUG nova.compute.manager [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing instance network info cache due to event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:30 np0005603622 nova_compute[221301]: 2026-01-31 08:57:30.348 221324 DEBUG oslo_concurrency.lockutils [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:30 np0005603622 nova_compute[221301]: 2026-01-31 08:57:30.348 221324 DEBUG oslo_concurrency.lockutils [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:30 np0005603622 nova_compute[221301]: 2026-01-31 08:57:30.349 221324 DEBUG nova.network.neutron [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:30.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:30.916 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:30.917 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:30.918 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:32.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:32.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:32 np0005603622 nova_compute[221301]: 2026-01-31 08:57:32.871 221324 DEBUG nova.compute.manager [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:32 np0005603622 nova_compute[221301]: 2026-01-31 08:57:32.872 221324 DEBUG nova.compute.manager [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing instance network info cache due to event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:32 np0005603622 nova_compute[221301]: 2026-01-31 08:57:32.872 221324 DEBUG oslo_concurrency.lockutils [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:33 np0005603622 nova_compute[221301]: 2026-01-31 08:57:33.318 221324 DEBUG nova.network.neutron [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated VIF entry in instance network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:33 np0005603622 nova_compute[221301]: 2026-01-31 08:57:33.319 221324 DEBUG nova.network.neutron [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:33 np0005603622 nova_compute[221301]: 2026-01-31 08:57:33.354 221324 DEBUG oslo_concurrency.lockutils [req-9b94a7be-104c-47f6-a3c9-5eb06707aa01 req-98c8de55-744d-4f9f-9f56-05f6e37b2a28 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:33 np0005603622 nova_compute[221301]: 2026-01-31 08:57:33.354 221324 DEBUG oslo_concurrency.lockutils [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:33 np0005603622 nova_compute[221301]: 2026-01-31 08:57:33.355 221324 DEBUG nova.network.neutron [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:33 np0005603622 nova_compute[221301]: 2026-01-31 08:57:33.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:34.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:34 np0005603622 nova_compute[221301]: 2026-01-31 08:57:34.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:57:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:34.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:35 np0005603622 nova_compute[221301]: 2026-01-31 08:57:35.059 221324 DEBUG nova.compute.manager [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:35 np0005603622 nova_compute[221301]: 2026-01-31 08:57:35.059 221324 DEBUG nova.compute.manager [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing instance network info cache due to event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:35 np0005603622 nova_compute[221301]: 2026-01-31 08:57:35.060 221324 DEBUG oslo_concurrency.lockutils [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:35 np0005603622 nova_compute[221301]: 2026-01-31 08:57:35.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:35 np0005603622 nova_compute[221301]: 2026-01-31 08:57:35.210 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:36.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:36.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:37 np0005603622 nova_compute[221301]: 2026-01-31 08:57:37.722 221324 DEBUG nova.network.neutron [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated VIF entry in instance network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:37 np0005603622 nova_compute[221301]: 2026-01-31 08:57:37.723 221324 DEBUG nova.network.neutron [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:37 np0005603622 nova_compute[221301]: 2026-01-31 08:57:37.755 221324 DEBUG oslo_concurrency.lockutils [req-e142f23e-3c46-49a8-b4ff-f5fcf32e8289 req-af7f58a0-4344-4973-8809-b11e6f7d960c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:37 np0005603622 nova_compute[221301]: 2026-01-31 08:57:37.756 221324 DEBUG oslo_concurrency.lockutils [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:37 np0005603622 nova_compute[221301]: 2026-01-31 08:57:37.756 221324 DEBUG nova.network.neutron [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:38.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:38.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.073 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.074 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.102 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.219 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.219 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.226 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.227 221324 INFO nova.compute.claims [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:57:39 np0005603622 nova_compute[221301]: 2026-01-31 08:57:39.611 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:57:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4225718197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.029 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.035 221324 DEBUG nova.compute.provider_tree [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:57:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.055 221324 DEBUG nova.scheduler.client.report [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.089 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.090 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.147 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.147 221324 DEBUG nova.network.neutron [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.166 221324 INFO nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.214 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.217 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.327 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.328 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.329 221324 INFO nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Creating image(s)#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.361 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.394 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.420 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.423 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.467 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.468 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.469 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.469 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.499 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:40 np0005603622 nova_compute[221301]: 2026-01-31 08:57:40.503 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:40.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.049 221324 DEBUG nova.network.neutron [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated VIF entry in instance network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.051 221324 DEBUG nova.network.neutron [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.068 221324 DEBUG oslo_concurrency.lockutils [req-422af385-a512-4f33-bcc6-1a941b9f4f6e req-962927e6-a393-4386-84a9-7bbbbc79d20c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.704 221324 DEBUG nova.compute.manager [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.704 221324 DEBUG nova.compute.manager [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing instance network info cache due to event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.705 221324 DEBUG oslo_concurrency.lockutils [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.706 221324 DEBUG oslo_concurrency.lockutils [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.706 221324 DEBUG nova.network.neutron [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:41 np0005603622 nova_compute[221301]: 2026-01-31 08:57:41.997 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:42.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.140 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] resizing rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.273 221324 DEBUG nova.objects.instance [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lazy-loading 'migration_context' on Instance uuid a36a041b-d513-4688-8abc-0ae4f25b2b9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.291 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.291 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Ensure instance console log exists: /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.292 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.292 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.292 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:42 np0005603622 nova_compute[221301]: 2026-01-31 08:57:42.322 221324 DEBUG nova.network.neutron [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Successfully created port: 22060216-c399-4114-99a0-fbae3664bd28 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:57:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:42.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:43 np0005603622 nova_compute[221301]: 2026-01-31 08:57:43.919 221324 DEBUG nova.network.neutron [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Successfully updated port: 22060216-c399-4114-99a0-fbae3664bd28 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:57:43 np0005603622 nova_compute[221301]: 2026-01-31 08:57:43.952 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "refresh_cache-a36a041b-d513-4688-8abc-0ae4f25b2b9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:43 np0005603622 nova_compute[221301]: 2026-01-31 08:57:43.952 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquired lock "refresh_cache-a36a041b-d513-4688-8abc-0ae4f25b2b9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:43 np0005603622 nova_compute[221301]: 2026-01-31 08:57:43.952 221324 DEBUG nova.network.neutron [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:57:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:44.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:44 np0005603622 nova_compute[221301]: 2026-01-31 08:57:44.072 221324 DEBUG nova.compute.manager [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received event network-changed-22060216-c399-4114-99a0-fbae3664bd28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:44 np0005603622 nova_compute[221301]: 2026-01-31 08:57:44.073 221324 DEBUG nova.compute.manager [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Refreshing instance network info cache due to event network-changed-22060216-c399-4114-99a0-fbae3664bd28. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:57:44 np0005603622 nova_compute[221301]: 2026-01-31 08:57:44.073 221324 DEBUG oslo_concurrency.lockutils [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-a36a041b-d513-4688-8abc-0ae4f25b2b9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:57:44 np0005603622 nova_compute[221301]: 2026-01-31 08:57:44.567 221324 DEBUG nova.network.neutron [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:57:44 np0005603622 nova_compute[221301]: 2026-01-31 08:57:44.635 221324 DEBUG nova.network.neutron [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated VIF entry in instance network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:44 np0005603622 nova_compute[221301]: 2026-01-31 08:57:44.636 221324 DEBUG nova.network.neutron [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:44.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:44 np0005603622 nova_compute[221301]: 2026-01-31 08:57:44.679 221324 DEBUG oslo_concurrency.lockutils [req-4abac160-f5cf-4844-9b69-7e2437d954ac req-40247475-b7f0-4bcb-94f4-04c9cd002360 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.069 221324 DEBUG oslo_concurrency.lockutils [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.069 221324 DEBUG oslo_concurrency.lockutils [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.120 221324 INFO nova.compute.manager [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Detaching volume a91dfbd1-af86-4b71-b911-f9af43fa9cd7#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.126 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.215 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.405 221324 INFO nova.virt.block_device [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Attempting to driver detach volume a91dfbd1-af86-4b71-b911-f9af43fa9cd7 from mountpoint /dev/vdb#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.415 221324 DEBUG nova.virt.libvirt.driver [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Attempting to detach device vdb from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.416 221324 DEBUG nova.virt.libvirt.guest [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-a91dfbd1-af86-4b71-b911-f9af43fa9cd7">
Jan 31 03:57:45 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <serial>a91dfbd1-af86-4b71-b911-f9af43fa9cd7</serial>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:57:45 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.425 221324 INFO nova.virt.libvirt.driver [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdb from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the persistent domain config.#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.425 221324 DEBUG nova.virt.libvirt.driver [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.426 221324 DEBUG nova.virt.libvirt.guest [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-a91dfbd1-af86-4b71-b911-f9af43fa9cd7">
Jan 31 03:57:45 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <serial>a91dfbd1-af86-4b71-b911-f9af43fa9cd7</serial>
Jan 31 03:57:45 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 03:57:45 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:57:45 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.489 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769849865.4887967, 13eea30b-7972-4da3-a3dd-8d068b401d22 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.491 221324 DEBUG nova.virt.libvirt.driver [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 13eea30b-7972-4da3-a3dd-8d068b401d22 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.494 221324 INFO nova.virt.libvirt.driver [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdb from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the live domain config.#033[00m
Jan 31 03:57:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.871 221324 DEBUG nova.objects.instance [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:45 np0005603622 nova_compute[221301]: 2026-01-31 08:57:45.949 221324 DEBUG oslo_concurrency.lockutils [None req-1b474bab-8a31-4099-8ea2-ff041875a783 cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:46.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.174 221324 DEBUG nova.network.neutron [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Updating instance_info_cache with network_info: [{"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.201 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Releasing lock "refresh_cache-a36a041b-d513-4688-8abc-0ae4f25b2b9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.202 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Instance network_info: |[{"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.203 221324 DEBUG oslo_concurrency.lockutils [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-a36a041b-d513-4688-8abc-0ae4f25b2b9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.203 221324 DEBUG nova.network.neutron [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Refreshing network info cache for port 22060216-c399-4114-99a0-fbae3664bd28 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.207 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Start _get_guest_xml network_info=[{"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.212 221324 WARNING nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.218 221324 DEBUG nova.virt.libvirt.host [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.219 221324 DEBUG nova.virt.libvirt.host [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.222 221324 DEBUG nova.virt.libvirt.host [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.222 221324 DEBUG nova.virt.libvirt.host [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.223 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.224 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.224 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.224 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.224 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.225 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.225 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.225 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.225 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.226 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.226 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.226 221324 DEBUG nova.virt.hardware [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.229 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1097264978' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.674 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:46.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.698 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:46 np0005603622 nova_compute[221301]: 2026-01-31 08:57:46.703 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:47 np0005603622 podman[296625]: 2026-01-31 08:57:47.124073767 +0000 UTC m=+0.042624222 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 03:57:47 np0005603622 podman[296624]: 2026-01-31 08:57:47.146054196 +0000 UTC m=+0.068543536 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, config_id=ovn_controller)
Jan 31 03:57:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:57:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/718877309' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.237 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.239 221324 DEBUG nova.virt.libvirt.vif [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:57:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1398594606',display_name='tempest-TestServerMultinode-server-1398594606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1398594606',id=185,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c37e7d6d634448bfb3172894ad2af105',ramdisk_id='',reservation_id='r-tucqs609',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-893388561',owner_user_name='tempest-TestServerMultinode-8
93388561-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:57:40Z,user_data=None,user_id='4e364ad937544559bea978006e9ff229',uuid=a36a041b-d513-4688-8abc-0ae4f25b2b9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.240 221324 DEBUG nova.network.os_vif_util [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converting VIF {"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.241 221324 DEBUG nova.network.os_vif_util [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:75:f4,bridge_name='br-int',has_traffic_filtering=True,id=22060216-c399-4114-99a0-fbae3664bd28,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22060216-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.242 221324 DEBUG nova.objects.instance [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lazy-loading 'pci_devices' on Instance uuid a36a041b-d513-4688-8abc-0ae4f25b2b9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.283 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <uuid>a36a041b-d513-4688-8abc-0ae4f25b2b9e</uuid>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <name>instance-000000b9</name>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestServerMultinode-server-1398594606</nova:name>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:57:46</nova:creationTime>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:user uuid="4e364ad937544559bea978006e9ff229">tempest-TestServerMultinode-893388561-project-admin</nova:user>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:project uuid="c37e7d6d634448bfb3172894ad2af105">tempest-TestServerMultinode-893388561</nova:project>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <nova:port uuid="22060216-c399-4114-99a0-fbae3664bd28">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <entry name="serial">a36a041b-d513-4688-8abc-0ae4f25b2b9e</entry>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <entry name="uuid">a36a041b-d513-4688-8abc-0ae4f25b2b9e</entry>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk.config">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:27:75:f4"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <target dev="tap22060216-c3"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/console.log" append="off"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:57:47 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:57:47 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:57:47 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:57:47 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.284 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Preparing to wait for external event network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.285 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.285 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.285 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.286 221324 DEBUG nova.virt.libvirt.vif [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:57:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1398594606',display_name='tempest-TestServerMultinode-server-1398594606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1398594606',id=185,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c37e7d6d634448bfb3172894ad2af105',ramdisk_id='',reservation_id='r-tucqs609',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-893388561',owner_user_name='tempest-TestServerMultinode-893388561-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:57:40Z,user_data=None,user_id='4e364ad937544559bea978006e9ff229',uuid=a36a041b-d513-4688-8abc-0ae4f25b2b9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.287 221324 DEBUG nova.network.os_vif_util [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converting VIF {"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.287 221324 DEBUG nova.network.os_vif_util [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:75:f4,bridge_name='br-int',has_traffic_filtering=True,id=22060216-c399-4114-99a0-fbae3664bd28,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22060216-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.288 221324 DEBUG os_vif [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:75:f4,bridge_name='br-int',has_traffic_filtering=True,id=22060216-c399-4114-99a0-fbae3664bd28,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22060216-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.289 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.289 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.290 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.294 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22060216-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.295 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap22060216-c3, col_values=(('external_ids', {'iface-id': '22060216-c399-4114-99a0-fbae3664bd28', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:75:f4', 'vm-uuid': 'a36a041b-d513-4688-8abc-0ae4f25b2b9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.296 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:47 np0005603622 NetworkManager[49080]: <info>  [1769849867.2973] manager: (tap22060216-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/390)
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.303 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.304 221324 INFO os_vif [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:75:f4,bridge_name='br-int',has_traffic_filtering=True,id=22060216-c399-4114-99a0-fbae3664bd28,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22060216-c3')#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.383 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.384 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.384 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] No VIF found with MAC fa:16:3e:27:75:f4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.384 221324 INFO nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Using config drive#033[00m
Jan 31 03:57:47 np0005603622 nova_compute[221301]: 2026-01-31 08:57:47.422 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:48.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.179 221324 INFO nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Creating config drive at /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/disk.config#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.183 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9qyel440 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.325 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9qyel440" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.360 221324 DEBUG nova.storage.rbd_utils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] rbd image a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.364 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/disk.config a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.553 221324 DEBUG oslo_concurrency.processutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/disk.config a36a041b-d513-4688-8abc-0ae4f25b2b9e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.554 221324 INFO nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Deleting local config drive /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e/disk.config because it was imported into RBD.#033[00m
Jan 31 03:57:48 np0005603622 kernel: tap22060216-c3: entered promiscuous mode
Jan 31 03:57:48 np0005603622 NetworkManager[49080]: <info>  [1769849868.5976] manager: (tap22060216-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/391)
Jan 31 03:57:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:48Z|00880|binding|INFO|Claiming lport 22060216-c399-4114-99a0-fbae3664bd28 for this chassis.
Jan 31 03:57:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:48Z|00881|binding|INFO|22060216-c399-4114-99a0-fbae3664bd28: Claiming fa:16:3e:27:75:f4 10.100.0.14
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.599 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:48Z|00882|binding|INFO|Setting lport 22060216-c399-4114-99a0-fbae3664bd28 ovn-installed in OVS
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.610 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:48Z|00883|binding|INFO|Setting lport 22060216-c399-4114-99a0-fbae3664bd28 up in Southbound
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.613 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:75:f4 10.100.0.14'], port_security=['fa:16:3e:27:75:f4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a36a041b-d513-4688-8abc-0ae4f25b2b9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c37e7d6d634448bfb3172894ad2af105', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd98fdedc-7ec4-4678-86fd-333fbe96f77f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f5a6fc0-3df3-4c2f-84cd-adc2af316a8e, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=22060216-c399-4114-99a0-fbae3664bd28) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.615 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 22060216-c399-4114-99a0-fbae3664bd28 in datapath b9195012-fef1-4e17-acdd-2b9ffc979da0 bound to our chassis#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.618 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b9195012-fef1-4e17-acdd-2b9ffc979da0#033[00m
Jan 31 03:57:48 np0005603622 systemd-machined[190406]: New machine qemu-91-instance-000000b9.
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.630 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7f22428b-0c6b-4a0b-8bb3-a18f7b1c2dcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.631 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb9195012-f1 in ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:57:48 np0005603622 systemd-udevd[296742]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.634 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb9195012-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.634 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[35c4e3e4-d6f3-4883-a819-8a5b6c6ebf75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.635 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[814fd772-557c-46a3-b186-6050fa976357]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 NetworkManager[49080]: <info>  [1769849868.6472] device (tap22060216-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.645 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[8b70c988-4e20-452a-aad1-79705251cead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 NetworkManager[49080]: <info>  [1769849868.6486] device (tap22060216-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:57:48 np0005603622 systemd[1]: Started Virtual Machine qemu-91-instance-000000b9.
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.659 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[db68d4c8-1988-4b7d-90a0-026f7c1a5736]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:48.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.683 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[907de77a-4d8e-4458-a5f2-53e57d51f2cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 NetworkManager[49080]: <info>  [1769849868.6912] manager: (tapb9195012-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/392)
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.690 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1513a14b-1a59-4f1c-a49f-e14fae02d98e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 systemd-udevd[296745]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.718 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[84194293-c5d5-4cc6-841f-67a1e5c1c2d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.721 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2bbcaced-75a7-4a36-9bdd-85aa72955fb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 NetworkManager[49080]: <info>  [1769849868.7376] device (tapb9195012-f0): carrier: link connected
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.742 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b6729f49-ec1b-4600-bb3e-b0e88b8f231a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.757 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[79a7039f-d8b1-4fed-b543-6c80763c6a8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9195012-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f5:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 912007, 'reachable_time': 35201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296774, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.773 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6973a7ce-c4d6-48d4-8955-e93bded89763]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:f5e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 912007, 'tstamp': 912007}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296775, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.788 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8795217f-6c35-422b-b7f8-e391aab026c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9195012-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f5:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 912007, 'reachable_time': 35201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 296776, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.811 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f25a1dec-f62a-4e89-8d14-94830f94aaeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.852 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9a4392-8ed2-4099-8892-36ac7811b6c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.853 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9195012-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.854 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.854 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9195012-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.856 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:48 np0005603622 NetworkManager[49080]: <info>  [1769849868.8574] manager: (tapb9195012-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/393)
Jan 31 03:57:48 np0005603622 kernel: tapb9195012-f0: entered promiscuous mode
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.859 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.861 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb9195012-f0, col_values=(('external_ids', {'iface-id': '1553dad0-d27d-4162-94ad-0b8a3a359f3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:57:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:48Z|00884|binding|INFO|Releasing lport 1553dad0-d27d-4162-94ad-0b8a3a359f3a from this chassis (sb_readonly=0)
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.862 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.864 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b9195012-fef1-4e17-acdd-2b9ffc979da0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b9195012-fef1-4e17-acdd-2b9ffc979da0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.864 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b4086abe-85fd-458c-8298-12ab802bbe37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.865 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-b9195012-fef1-4e17-acdd-2b9ffc979da0
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/b9195012-fef1-4e17-acdd-2b9ffc979da0.pid.haproxy
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID b9195012-fef1-4e17-acdd-2b9ffc979da0
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:57:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:57:48.866 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'env', 'PROCESS_TAG=haproxy-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b9195012-fef1-4e17-acdd-2b9ffc979da0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:57:48 np0005603622 nova_compute[221301]: 2026-01-31 08:57:48.868 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.035 221324 DEBUG oslo_concurrency.lockutils [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.035 221324 DEBUG oslo_concurrency.lockutils [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.058 221324 DEBUG nova.compute.manager [req-da20bad2-f4de-457c-b0d0-be7c0c216ee4 req-96f9937f-1bd4-4ae3-92c6-99df825d3e0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received event network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.058 221324 DEBUG oslo_concurrency.lockutils [req-da20bad2-f4de-457c-b0d0-be7c0c216ee4 req-96f9937f-1bd4-4ae3-92c6-99df825d3e0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.059 221324 DEBUG oslo_concurrency.lockutils [req-da20bad2-f4de-457c-b0d0-be7c0c216ee4 req-96f9937f-1bd4-4ae3-92c6-99df825d3e0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.059 221324 DEBUG oslo_concurrency.lockutils [req-da20bad2-f4de-457c-b0d0-be7c0c216ee4 req-96f9937f-1bd4-4ae3-92c6-99df825d3e0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.059 221324 DEBUG nova.compute.manager [req-da20bad2-f4de-457c-b0d0-be7c0c216ee4 req-96f9937f-1bd4-4ae3-92c6-99df825d3e0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Processing event network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.062 221324 INFO nova.compute.manager [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Detaching volume 7ad3b3fb-6194-4162-90c8-c3ac9c4017d9#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.183 221324 DEBUG nova.network.neutron [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Updated VIF entry in instance network info cache for port 22060216-c399-4114-99a0-fbae3664bd28. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.184 221324 DEBUG nova.network.neutron [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Updating instance_info_cache with network_info: [{"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:57:49 np0005603622 podman[296808]: 2026-01-31 08:57:49.211869918 +0000 UTC m=+0.061192379 container create 2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.214 221324 DEBUG oslo_concurrency.lockutils [req-e6adda4c-ff2e-4246-b4df-73c3a1f9e1a4 req-1fc72643-a3d1-4b7b-83e9-39c229381784 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-a36a041b-d513-4688-8abc-0ae4f25b2b9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:57:49 np0005603622 systemd[1]: Started libpod-conmon-2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d.scope.
Jan 31 03:57:49 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:57:49 np0005603622 podman[296808]: 2026-01-31 08:57:49.169319339 +0000 UTC m=+0.018641840 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:57:49 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ba999d7f02608c7220c65e820cda4e87db5795bc6922f6f19405f8df9fd8fb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:57:49 np0005603622 podman[296808]: 2026-01-31 08:57:49.281091772 +0000 UTC m=+0.130414253 container init 2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 03:57:49 np0005603622 podman[296808]: 2026-01-31 08:57:49.286731363 +0000 UTC m=+0.136053824 container start 2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.310 221324 INFO nova.virt.block_device [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Attempting to driver detach volume 7ad3b3fb-6194-4162-90c8-c3ac9c4017d9 from mountpoint /dev/vdc
Jan 31 03:57:49 np0005603622 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[296823]: [NOTICE]   (296845) : New worker (296862) forked
Jan 31 03:57:49 np0005603622 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[296823]: [NOTICE]   (296845) : Loading success.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.323 221324 DEBUG nova.virt.libvirt.driver [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Attempting to detach device vdc from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.324 221324 DEBUG nova.virt.libvirt.guest [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-7ad3b3fb-6194-4162-90c8-c3ac9c4017d9">
Jan 31 03:57:49 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <serial>7ad3b3fb-6194-4162-90c8-c3ac9c4017d9</serial>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:57:49 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.331 221324 INFO nova.virt.libvirt.driver [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdc from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the persistent domain config.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.331 221324 DEBUG nova.virt.libvirt.driver [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.331 221324 DEBUG nova.virt.libvirt.guest [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-7ad3b3fb-6194-4162-90c8-c3ac9c4017d9">
Jan 31 03:57:49 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  </source>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <target dev="vdc" bus="virtio"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <serial>7ad3b3fb-6194-4162-90c8-c3ac9c4017d9</serial>
Jan 31 03:57:49 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 31 03:57:49 np0005603622 nova_compute[221301]: </disk>
Jan 31 03:57:49 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.426 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849869.4259973, a36a041b-d513-4688-8abc-0ae4f25b2b9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.427 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] VM Started (Lifecycle Event)
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.429 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.432 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.437 221324 INFO nova.virt.libvirt.driver [-] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Instance spawned successfully.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.438 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.460 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.464 221324 DEBUG nova.virt.libvirt.driver [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 13eea30b-7972-4da3-a3dd-8d068b401d22 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.466 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.477 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.478 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.479 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.480 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.481 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.482 221324 DEBUG nova.virt.libvirt.driver [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.491 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.491 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849869.4261482, a36a041b-d513-4688-8abc-0ae4f25b2b9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.492 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] VM Paused (Lifecycle Event)
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.532 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.536 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849869.4318902, a36a041b-d513-4688-8abc-0ae4f25b2b9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.536 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] VM Resumed (Lifecycle Event)
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.566 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.570 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.574 221324 INFO nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Took 9.25 seconds to spawn the instance on the hypervisor.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.575 221324 DEBUG nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.621 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.622 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769849869.461615, 13eea30b-7972-4da3-a3dd-8d068b401d22 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.625 221324 INFO nova.virt.libvirt.driver [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully detached device vdc from instance 13eea30b-7972-4da3-a3dd-8d068b401d22 from the live domain config.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.660 221324 INFO nova.compute.manager [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Took 10.48 seconds to build instance.
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.705 221324 DEBUG oslo_concurrency.lockutils [None req-c0f81821-3be9-4d34-b337-e6f036dd1367 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:49 np0005603622 nova_compute[221301]: 2026-01-31 08:57:49.979 221324 DEBUG nova.objects.instance [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'flavor' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 03:57:50 np0005603622 nova_compute[221301]: 2026-01-31 08:57:50.058 221324 DEBUG oslo_concurrency.lockutils [None req-2e2ec3f0-2b8d-42f5-8e64-40035d69182c cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:50.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:50 np0005603622 nova_compute[221301]: 2026-01-31 08:57:50.218 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:50.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:51 np0005603622 nova_compute[221301]: 2026-01-31 08:57:51.234 221324 DEBUG nova.compute.manager [req-aaa90d1a-b489-46ce-9b0a-be1671d46d00 req-ca1b850a-c714-432c-849f-600041582477 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received event network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:57:51 np0005603622 nova_compute[221301]: 2026-01-31 08:57:51.234 221324 DEBUG oslo_concurrency.lockutils [req-aaa90d1a-b489-46ce-9b0a-be1671d46d00 req-ca1b850a-c714-432c-849f-600041582477 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:57:51 np0005603622 nova_compute[221301]: 2026-01-31 08:57:51.235 221324 DEBUG oslo_concurrency.lockutils [req-aaa90d1a-b489-46ce-9b0a-be1671d46d00 req-ca1b850a-c714-432c-849f-600041582477 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:57:51 np0005603622 nova_compute[221301]: 2026-01-31 08:57:51.235 221324 DEBUG oslo_concurrency.lockutils [req-aaa90d1a-b489-46ce-9b0a-be1671d46d00 req-ca1b850a-c714-432c-849f-600041582477 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:57:51 np0005603622 nova_compute[221301]: 2026-01-31 08:57:51.236 221324 DEBUG nova.compute.manager [req-aaa90d1a-b489-46ce-9b0a-be1671d46d00 req-ca1b850a-c714-432c-849f-600041582477 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] No waiting events found dispatching network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:57:51 np0005603622 nova_compute[221301]: 2026-01-31 08:57:51.236 221324 WARNING nova.compute.manager [req-aaa90d1a-b489-46ce-9b0a-be1671d46d00 req-ca1b850a-c714-432c-849f-600041582477 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received unexpected event network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 for instance with vm_state active and task_state None.
Jan 31 03:57:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:52.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:52 np0005603622 nova_compute[221301]: 2026-01-31 08:57:52.292 221324 DEBUG nova.compute.manager [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:57:52 np0005603622 nova_compute[221301]: 2026-01-31 08:57:52.293 221324 DEBUG nova.compute.manager [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing instance network info cache due to event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:57:52 np0005603622 nova_compute[221301]: 2026-01-31 08:57:52.293 221324 DEBUG oslo_concurrency.lockutils [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:57:52 np0005603622 nova_compute[221301]: 2026-01-31 08:57:52.293 221324 DEBUG oslo_concurrency.lockutils [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:57:52 np0005603622 nova_compute[221301]: 2026-01-31 08:57:52.293 221324 DEBUG nova.network.neutron [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:57:52 np0005603622 nova_compute[221301]: 2026-01-31 08:57:52.297 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:52.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:57:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:54.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:57:54 np0005603622 nova_compute[221301]: 2026-01-31 08:57:54.970 221324 DEBUG nova.network.neutron [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated VIF entry in instance network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 31 03:57:54 np0005603622 nova_compute[221301]: 2026-01-31 08:57:54.970 221324 DEBUG nova.network.neutron [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 31 03:57:55 np0005603622 nova_compute[221301]: 2026-01-31 08:57:55.004 221324 DEBUG oslo_concurrency.lockutils [req-694c4aae-bc11-461c-94f2-bdb52f6f6235 req-634de954-86d7-4eb2-8711-a0bf66059ea5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 31 03:57:55 np0005603622 nova_compute[221301]: 2026-01-31 08:57:55.219 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:57:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:56 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:56Z|00885|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:56Z|00886|binding|INFO|Releasing lport 1553dad0-d27d-4162-94ad-0b8a3a359f3a from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603622 nova_compute[221301]: 2026-01-31 08:57:56.210 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:56 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:56Z|00887|binding|INFO|Releasing lport 5a0136e3-84ab-4495-80ff-8006a0a74934 from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603622 ovn_controller[130109]: 2026-01-31T08:57:56Z|00888|binding|INFO|Releasing lport 1553dad0-d27d-4162-94ad-0b8a3a359f3a from this chassis (sb_readonly=0)
Jan 31 03:57:56 np0005603622 nova_compute[221301]: 2026-01-31 08:57:56.295 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:56.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:57 np0005603622 nova_compute[221301]: 2026-01-31 08:57:57.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:57:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:57:58.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:57:58 np0005603622 nova_compute[221301]: 2026-01-31 08:57:58.203 221324 DEBUG nova.compute.manager [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:57:58 np0005603622 nova_compute[221301]: 2026-01-31 08:57:58.203 221324 DEBUG nova.compute.manager [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing instance network info cache due to event network-changed-71b92e93-ebb3-4664-a6af-06e2b936453f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 03:57:58 np0005603622 nova_compute[221301]: 2026-01-31 08:57:58.204 221324 DEBUG oslo_concurrency.lockutils [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 03:57:58 np0005603622 nova_compute[221301]: 2026-01-31 08:57:58.204 221324 DEBUG oslo_concurrency.lockutils [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 03:57:58 np0005603622 nova_compute[221301]: 2026-01-31 08:57:58.204 221324 DEBUG nova.network.neutron [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Refreshing network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 03:57:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:57:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:57:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:57:58.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:00.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:00 np0005603622 nova_compute[221301]: 2026-01-31 08:58:00.220 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:00.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:01 np0005603622 nova_compute[221301]: 2026-01-31 08:58:01.441 221324 DEBUG nova.network.neutron [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updated VIF entry in instance network info cache for port 71b92e93-ebb3-4664-a6af-06e2b936453f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:58:01 np0005603622 nova_compute[221301]: 2026-01-31 08:58:01.442 221324 DEBUG nova.network.neutron [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [{"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:01 np0005603622 nova_compute[221301]: 2026-01-31 08:58:01.470 221324 DEBUG oslo_concurrency.lockutils [req-13b99626-06d6-47d9-8b59-01cb41b3109f req-c0c83d21-27f0-45f0-87ce-6db5a4f1d725 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-13eea30b-7972-4da3-a3dd-8d068b401d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:58:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:02.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:02 np0005603622 nova_compute[221301]: 2026-01-31 08:58:02.302 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:02.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.149 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.150 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.150 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.151 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.151 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.153 221324 INFO nova.compute.manager [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Terminating instance#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.154 221324 DEBUG nova.compute.manager [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:58:03 np0005603622 kernel: tap22060216-c3 (unregistering): left promiscuous mode
Jan 31 03:58:03 np0005603622 NetworkManager[49080]: <info>  [1769849883.2150] device (tap22060216-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:58:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:03Z|00889|binding|INFO|Releasing lport 22060216-c399-4114-99a0-fbae3664bd28 from this chassis (sb_readonly=0)
Jan 31 03:58:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:03Z|00890|binding|INFO|Setting lport 22060216-c399-4114-99a0-fbae3664bd28 down in Southbound
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.228 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:03Z|00891|binding|INFO|Removing iface tap22060216-c3 ovn-installed in OVS
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.230 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.235 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:75:f4 10.100.0.14'], port_security=['fa:16:3e:27:75:f4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a36a041b-d513-4688-8abc-0ae4f25b2b9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c37e7d6d634448bfb3172894ad2af105', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd98fdedc-7ec4-4678-86fd-333fbe96f77f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f5a6fc0-3df3-4c2f-84cd-adc2af316a8e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=22060216-c399-4114-99a0-fbae3664bd28) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.237 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 22060216-c399-4114-99a0-fbae3664bd28 in datapath b9195012-fef1-4e17-acdd-2b9ffc979da0 unbound from our chassis#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.237 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.240 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9195012-fef1-4e17-acdd-2b9ffc979da0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.241 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6c82a767-d5f0-437c-8a33-b9d29b7a7fd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.242 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 namespace which is not needed anymore#033[00m
Jan 31 03:58:03 np0005603622 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000b9.scope: Deactivated successfully.
Jan 31 03:58:03 np0005603622 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000b9.scope: Consumed 13.388s CPU time.
Jan 31 03:58:03 np0005603622 systemd-machined[190406]: Machine qemu-91-instance-000000b9 terminated.
Jan 31 03:58:03 np0005603622 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[296823]: [NOTICE]   (296845) : haproxy version is 2.8.14-c23fe91
Jan 31 03:58:03 np0005603622 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[296823]: [NOTICE]   (296845) : path to executable is /usr/sbin/haproxy
Jan 31 03:58:03 np0005603622 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[296823]: [WARNING]  (296845) : Exiting Master process...
Jan 31 03:58:03 np0005603622 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[296823]: [ALERT]    (296845) : Current worker (296862) exited with code 143 (Terminated)
Jan 31 03:58:03 np0005603622 neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0[296823]: [WARNING]  (296845) : All workers exited. Exiting... (0)
Jan 31 03:58:03 np0005603622 systemd[1]: libpod-2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d.scope: Deactivated successfully.
Jan 31 03:58:03 np0005603622 podman[296907]: 2026-01-31 08:58:03.380235394 +0000 UTC m=+0.069095110 container died 2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.389 221324 INFO nova.virt.libvirt.driver [-] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Instance destroyed successfully.#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.389 221324 DEBUG nova.objects.instance [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lazy-loading 'resources' on Instance uuid a36a041b-d513-4688-8abc-0ae4f25b2b9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:58:03 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d-userdata-shm.mount: Deactivated successfully.
Jan 31 03:58:03 np0005603622 systemd[1]: var-lib-containers-storage-overlay-e8ba999d7f02608c7220c65e820cda4e87db5795bc6922f6f19405f8df9fd8fb-merged.mount: Deactivated successfully.
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.424 221324 DEBUG nova.virt.libvirt.vif [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:57:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1398594606',display_name='tempest-TestServerMultinode-server-1398594606',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1398594606',id=185,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:57:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c37e7d6d634448bfb3172894ad2af105',ramdisk_id='',reservation_id='r-tucqs609',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-893388561',owner_user_name='tempest-TestServerMultinode-893388561-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:57:49Z,user_data=None,user_id='4e364ad937544559bea978006e9ff229',uuid=a36a041b-d513-4688-8abc-0ae4f25b2b9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.425 221324 DEBUG nova.network.os_vif_util [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converting VIF {"id": "22060216-c399-4114-99a0-fbae3664bd28", "address": "fa:16:3e:27:75:f4", "network": {"id": "b9195012-fef1-4e17-acdd-2b9ffc979da0", "bridge": "br-int", "label": "tempest-TestServerMultinode-1076760224-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ce62b246a60455e8ec83f770113c52c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22060216-c3", "ovs_interfaceid": "22060216-c399-4114-99a0-fbae3664bd28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.425 221324 DEBUG nova.network.os_vif_util [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:75:f4,bridge_name='br-int',has_traffic_filtering=True,id=22060216-c399-4114-99a0-fbae3664bd28,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22060216-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.426 221324 DEBUG os_vif [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:75:f4,bridge_name='br-int',has_traffic_filtering=True,id=22060216-c399-4114-99a0-fbae3664bd28,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22060216-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.428 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.428 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22060216-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.435 221324 INFO os_vif [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:75:f4,bridge_name='br-int',has_traffic_filtering=True,id=22060216-c399-4114-99a0-fbae3664bd28,network=Network(b9195012-fef1-4e17-acdd-2b9ffc979da0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22060216-c3')#033[00m
Jan 31 03:58:03 np0005603622 podman[296907]: 2026-01-31 08:58:03.447545246 +0000 UTC m=+0.136404972 container cleanup 2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 03:58:03 np0005603622 systemd[1]: libpod-conmon-2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d.scope: Deactivated successfully.
Jan 31 03:58:03 np0005603622 podman[296960]: 2026-01-31 08:58:03.543616128 +0000 UTC m=+0.077080904 container remove 2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.547 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2d509c-0fde-448f-aadb-e98f9a1289a0]: (4, ('Sat Jan 31 08:58:03 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 (2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d)\n2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d\nSat Jan 31 08:58:03 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 (2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d)\n2e8f9ffc9379a74917766f0ed3876f364c58e5c2552dd0d714eceaf2d544470d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.548 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb026ac-0645-4e8f-be81-78c7b380fb66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.549 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9195012-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.551 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603622 kernel: tapb9195012-f0: left promiscuous mode
Jan 31 03:58:03 np0005603622 nova_compute[221301]: 2026-01-31 08:58:03.557 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.560 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7f816a73-5aae-4789-a2dd-8036188b55b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.578 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b6cec538-9c4b-4c13-9412-54780ee41650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.579 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8adb90d8-35f7-4a55-8581-7f09ec2cd5bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.590 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e929ec-03ba-4365-aacf-572f5ac3b13f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 912001, 'reachable_time': 27110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296977, 'error': None, 'target': 'ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:03 np0005603622 systemd[1]: run-netns-ovnmeta\x2db9195012\x2dfef1\x2d4e17\x2dacdd\x2d2b9ffc979da0.mount: Deactivated successfully.
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.593 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b9195012-fef1-4e17-acdd-2b9ffc979da0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:58:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:03.593 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[1a913a6c-99cb-4028-bfc6-e543e20e9719]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.013 221324 INFO nova.virt.libvirt.driver [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Deleting instance files /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e_del#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.014 221324 INFO nova.virt.libvirt.driver [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Deletion of /var/lib/nova/instances/a36a041b-d513-4688-8abc-0ae4f25b2b9e_del complete#033[00m
Jan 31 03:58:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:04.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.107 221324 INFO nova.compute.manager [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Took 0.95 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.107 221324 DEBUG oslo.service.loopingcall [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.108 221324 DEBUG nova.compute.manager [-] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.108 221324 DEBUG nova.network.neutron [-] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.139 221324 DEBUG nova.compute.manager [req-5160abab-b01f-4dd0-ba39-088df4fb6c31 req-c441080c-9f1a-45e8-91b3-4ecf69ca763c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received event network-vif-unplugged-22060216-c399-4114-99a0-fbae3664bd28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.139 221324 DEBUG oslo_concurrency.lockutils [req-5160abab-b01f-4dd0-ba39-088df4fb6c31 req-c441080c-9f1a-45e8-91b3-4ecf69ca763c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.140 221324 DEBUG oslo_concurrency.lockutils [req-5160abab-b01f-4dd0-ba39-088df4fb6c31 req-c441080c-9f1a-45e8-91b3-4ecf69ca763c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.140 221324 DEBUG oslo_concurrency.lockutils [req-5160abab-b01f-4dd0-ba39-088df4fb6c31 req-c441080c-9f1a-45e8-91b3-4ecf69ca763c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.141 221324 DEBUG nova.compute.manager [req-5160abab-b01f-4dd0-ba39-088df4fb6c31 req-c441080c-9f1a-45e8-91b3-4ecf69ca763c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] No waiting events found dispatching network-vif-unplugged-22060216-c399-4114-99a0-fbae3664bd28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:04 np0005603622 nova_compute[221301]: 2026-01-31 08:58:04.141 221324 DEBUG nova.compute.manager [req-5160abab-b01f-4dd0-ba39-088df4fb6c31 req-c441080c-9f1a-45e8-91b3-4ecf69ca763c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received event network-vif-unplugged-22060216-c399-4114-99a0-fbae3664bd28 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:58:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:58:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:04.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:58:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e375 e375: 3 total, 3 up, 3 in
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.139 221324 DEBUG nova.network.neutron [-] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.167 221324 INFO nova.compute.manager [-] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Took 1.06 seconds to deallocate network for instance.#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.224 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.290 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.291 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.420 221324 DEBUG oslo_concurrency.processutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e375 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/96972787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.876 221324 DEBUG oslo_concurrency.processutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.884 221324 DEBUG nova.compute.provider_tree [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.906 221324 DEBUG nova.scheduler.client.report [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.945 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:05 np0005603622 nova_compute[221301]: 2026-01-31 08:58:05.990 221324 INFO nova.scheduler.client.report [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Deleted allocations for instance a36a041b-d513-4688-8abc-0ae4f25b2b9e#033[00m
Jan 31 03:58:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:06.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.097 221324 DEBUG oslo_concurrency.lockutils [None req-17f9eb98-0a51-4bee-a053-fbd76a05555a 4e364ad937544559bea978006e9ff229 c37e7d6d634448bfb3172894ad2af105 - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.316 221324 DEBUG nova.compute.manager [req-4ac53ec7-d8a4-4e11-ab67-52f6b54dbf35 req-96094b5f-7c79-41c7-8f35-02be88e70807 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received event network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.317 221324 DEBUG oslo_concurrency.lockutils [req-4ac53ec7-d8a4-4e11-ab67-52f6b54dbf35 req-96094b5f-7c79-41c7-8f35-02be88e70807 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.317 221324 DEBUG oslo_concurrency.lockutils [req-4ac53ec7-d8a4-4e11-ab67-52f6b54dbf35 req-96094b5f-7c79-41c7-8f35-02be88e70807 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.317 221324 DEBUG oslo_concurrency.lockutils [req-4ac53ec7-d8a4-4e11-ab67-52f6b54dbf35 req-96094b5f-7c79-41c7-8f35-02be88e70807 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "a36a041b-d513-4688-8abc-0ae4f25b2b9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.317 221324 DEBUG nova.compute.manager [req-4ac53ec7-d8a4-4e11-ab67-52f6b54dbf35 req-96094b5f-7c79-41c7-8f35-02be88e70807 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] No waiting events found dispatching network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.317 221324 WARNING nova.compute.manager [req-4ac53ec7-d8a4-4e11-ab67-52f6b54dbf35 req-96094b5f-7c79-41c7-8f35-02be88e70807 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received unexpected event network-vif-plugged-22060216-c399-4114-99a0-fbae3664bd28 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 03:58:06 np0005603622 nova_compute[221301]: 2026-01-31 08:58:06.318 221324 DEBUG nova.compute.manager [req-4ac53ec7-d8a4-4e11-ab67-52f6b54dbf35 req-96094b5f-7c79-41c7-8f35-02be88e70807 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Received event network-vif-deleted-22060216-c399-4114-99a0-fbae3664bd28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:06.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e376 e376: 3 total, 3 up, 3 in
Jan 31 03:58:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:08.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:08 np0005603622 nova_compute[221301]: 2026-01-31 08:58:08.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:08 np0005603622 nova_compute[221301]: 2026-01-31 08:58:08.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:58:08 np0005603622 nova_compute[221301]: 2026-01-31 08:58:08.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:08.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:58:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1707889347' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:58:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:58:09 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1707889347' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.428 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.429 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.429 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.429 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.429 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.430 221324 INFO nova.compute.manager [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Terminating instance#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.431 221324 DEBUG nova.compute.manager [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:58:09 np0005603622 kernel: tap71b92e93-eb (unregistering): left promiscuous mode
Jan 31 03:58:09 np0005603622 NetworkManager[49080]: <info>  [1769849889.5272] device (tap71b92e93-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.530 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:09Z|00892|binding|INFO|Releasing lport 71b92e93-ebb3-4664-a6af-06e2b936453f from this chassis (sb_readonly=0)
Jan 31 03:58:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:09Z|00893|binding|INFO|Setting lport 71b92e93-ebb3-4664-a6af-06e2b936453f down in Southbound
Jan 31 03:58:09 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:09Z|00894|binding|INFO|Removing iface tap71b92e93-eb ovn-installed in OVS
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.533 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:09.543 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:3c:68 10.100.0.9'], port_security=['fa:16:3e:6d:3c:68 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '13eea30b-7972-4da3-a3dd-8d068b401d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '06b5fc9cfd4c49abb2d8b9f2f8a82c1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9d7b4c6b-30ca-4a01-b275-d4aa9d87b845', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe6e8b31-5a27-4e0f-b157-3b33899fa37b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=71b92e93-ebb3-4664-a6af-06e2b936453f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:09.545 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 71b92e93-ebb3-4664-a6af-06e2b936453f in datapath 405bd95c-1bad-49fb-83bf-a97a0c66786e unbound from our chassis#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.543 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:09.546 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405bd95c-1bad-49fb-83bf-a97a0c66786e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 03:58:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:09.547 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[420fcc07-1e4e-4295-adfc-991abd9e1378]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:09 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:09.548 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e namespace which is not needed anymore#033[00m
Jan 31 03:58:09 np0005603622 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Jan 31 03:58:09 np0005603622 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b4.scope: Consumed 19.463s CPU time.
Jan 31 03:58:09 np0005603622 systemd-machined[190406]: Machine qemu-88-instance-000000b4 terminated.
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.698 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.704 221324 INFO nova.virt.libvirt.driver [-] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Instance destroyed successfully.#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.705 221324 DEBUG nova.objects.instance [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lazy-loading 'resources' on Instance uuid 13eea30b-7972-4da3-a3dd-8d068b401d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:58:09 np0005603622 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[294774]: [NOTICE]   (294778) : haproxy version is 2.8.14-c23fe91
Jan 31 03:58:09 np0005603622 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[294774]: [NOTICE]   (294778) : path to executable is /usr/sbin/haproxy
Jan 31 03:58:09 np0005603622 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[294774]: [WARNING]  (294778) : Exiting Master process...
Jan 31 03:58:09 np0005603622 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[294774]: [ALERT]    (294778) : Current worker (294780) exited with code 143 (Terminated)
Jan 31 03:58:09 np0005603622 neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e[294774]: [WARNING]  (294778) : All workers exited. Exiting... (0)
Jan 31 03:58:09 np0005603622 systemd[1]: libpod-980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1.scope: Deactivated successfully.
Jan 31 03:58:09 np0005603622 podman[297025]: 2026-01-31 08:58:09.720968231 +0000 UTC m=+0.099208617 container died 980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.802 221324 DEBUG nova.virt.libvirt.vif [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:55:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-2129503496',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-2129503496',id=180,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEXzK6zUN8P2oqgqYwcegkodZ7bCeyyyhmYXIteBKXOhNEu+drS3qyKalg8BzkpjD3Rc/+FviAhlBApTbimNmOyPmM7IztIR2VGri6qDWFeRA0jXOdg2vS/Kgt0ALKH9cg==',key_name='tempest-TestInstancesWithCinderVolumes-176277168',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:55:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='06b5fc9cfd4c49abb2d8b9f2f8a82c1f',ramdisk_id='',reservation_id='r-9xt0sci9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet
',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-2132464628',owner_user_name='tempest-TestInstancesWithCinderVolumes-2132464628-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:55:54Z,user_data=None,user_id='cfaebb011a374541b083e772a6c83f25',uuid=13eea30b-7972-4da3-a3dd-8d068b401d22,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.803 221324 DEBUG nova.network.os_vif_util [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converting VIF {"id": "71b92e93-ebb3-4664-a6af-06e2b936453f", "address": "fa:16:3e:6d:3c:68", "network": {"id": "405bd95c-1bad-49fb-83bf-a97a0c66786e", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-161168058-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06b5fc9cfd4c49abb2d8b9f2f8a82c1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b92e93-eb", "ovs_interfaceid": "71b92e93-ebb3-4664-a6af-06e2b936453f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.803 221324 DEBUG nova.network.os_vif_util [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:3c:68,bridge_name='br-int',has_traffic_filtering=True,id=71b92e93-ebb3-4664-a6af-06e2b936453f,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b92e93-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.804 221324 DEBUG os_vif [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:3c:68,bridge_name='br-int',has_traffic_filtering=True,id=71b92e93-ebb3-4664-a6af-06e2b936453f,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b92e93-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.805 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.806 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71b92e93-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.808 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.811 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:58:09 np0005603622 nova_compute[221301]: 2026-01-31 08:58:09.812 221324 INFO os_vif [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:3c:68,bridge_name='br-int',has_traffic_filtering=True,id=71b92e93-ebb3-4664-a6af-06e2b936453f,network=Network(405bd95c-1bad-49fb-83bf-a97a0c66786e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b92e93-eb')#033[00m
Jan 31 03:58:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay-919da9e74d20990373f80da3e5e4b8056da418ab3a1f32f19e6999fca16181f9-merged.mount: Deactivated successfully.
Jan 31 03:58:10 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1-userdata-shm.mount: Deactivated successfully.
Jan 31 03:58:10 np0005603622 podman[297025]: 2026-01-31 08:58:10.049693801 +0000 UTC m=+0.427934167 container cleanup 980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 03:58:10 np0005603622 systemd[1]: libpod-conmon-980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1.scope: Deactivated successfully.
Jan 31 03:58:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:10.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:10 np0005603622 podman[297082]: 2026-01-31 08:58:10.162401769 +0000 UTC m=+0.093417262 container remove 980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.167 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aec0e081-e6e0-4b1c-b038-f7149fa0148b]: (4, ('Sat Jan 31 08:58:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e (980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1)\n980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1\nSat Jan 31 08:58:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e (980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1)\n980c291c2e3b48a3fc9a781ca03a6e64819326b8e4663be616ee49382fb385b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.169 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2185c3eb-4ebd-4d0c-9a75-e0665bc66870]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.170 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405bd95c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.172 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:10 np0005603622 kernel: tap405bd95c-10: left promiscuous mode
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.179 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.181 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[76b4df26-3f30-4773-b55b-fea79e789fcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.203 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e01df7-4d37-4743-aca9-11eb347a45d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.205 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbc50f3-9b40-4611-b827-91c1add4804e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.220 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0a14045f-0b43-4d51-a080-01164f6993b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 900501, 'reachable_time': 20004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297098, 'error': None, 'target': 'ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.222 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405bd95c-1bad-49fb-83bf-a97a0c66786e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 03:58:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:10.222 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec1e214-996b-4278-a1dd-3543dff7c902]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:10 np0005603622 systemd[1]: run-netns-ovnmeta\x2d405bd95c\x2d1bad\x2d49fb\x2d83bf\x2da97a0c66786e.mount: Deactivated successfully.
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.226 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.238 221324 INFO nova.virt.libvirt.driver [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Deleting instance files /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22_del#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.239 221324 INFO nova.virt.libvirt.driver [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Deletion of /var/lib/nova/instances/13eea30b-7972-4da3-a3dd-8d068b401d22_del complete#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.351 221324 INFO nova.compute.manager [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Took 0.92 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.353 221324 DEBUG oslo.service.loopingcall [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.353 221324 DEBUG nova.compute.manager [-] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.354 221324 DEBUG nova.network.neutron [-] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:58:10 np0005603622 nova_compute[221301]: 2026-01-31 08:58:10.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:10.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.478 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.478 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.942 221324 DEBUG nova.compute.manager [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-vif-unplugged-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.942 221324 DEBUG oslo_concurrency.lockutils [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.942 221324 DEBUG oslo_concurrency.lockutils [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.942 221324 DEBUG oslo_concurrency.lockutils [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.943 221324 DEBUG nova.compute.manager [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] No waiting events found dispatching network-vif-unplugged-71b92e93-ebb3-4664-a6af-06e2b936453f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.943 221324 DEBUG nova.compute.manager [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-vif-unplugged-71b92e93-ebb3-4664-a6af-06e2b936453f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.943 221324 DEBUG nova.compute.manager [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.943 221324 DEBUG oslo_concurrency.lockutils [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.944 221324 DEBUG oslo_concurrency.lockutils [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.944 221324 DEBUG oslo_concurrency.lockutils [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.944 221324 DEBUG nova.compute.manager [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] No waiting events found dispatching network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:11 np0005603622 nova_compute[221301]: 2026-01-31 08:58:11.944 221324 WARNING nova.compute.manager [req-879d5134-eb1d-46aa-8c8a-3a9bab55980f req-9c0745fb-8746-48c1-b9cb-b701fb66b910 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received unexpected event network-vif-plugged-71b92e93-ebb3-4664-a6af-06e2b936453f for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:58:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:12.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:12 np0005603622 nova_compute[221301]: 2026-01-31 08:58:12.422 221324 DEBUG nova.network.neutron [-] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:12 np0005603622 nova_compute[221301]: 2026-01-31 08:58:12.474 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:12.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:12 np0005603622 nova_compute[221301]: 2026-01-31 08:58:12.737 221324 INFO nova.compute.manager [-] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Took 2.38 seconds to deallocate network for instance.#033[00m
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.147 221324 INFO nova.compute.manager [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Took 0.41 seconds to detach 1 volumes for instance.#033[00m
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.368 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.369 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.439 221324 DEBUG oslo_concurrency.processutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:13 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3196689908' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.899 221324 DEBUG oslo_concurrency.processutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.905 221324 DEBUG nova.compute.provider_tree [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.935 221324 DEBUG nova.scheduler.client.report [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:13 np0005603622 nova_compute[221301]: 2026-01-31 08:58:13.979 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.022 221324 INFO nova.scheduler.client.report [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Deleted allocations for instance 13eea30b-7972-4da3-a3dd-8d068b401d22#033[00m
Jan 31 03:58:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:14.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.124 221324 DEBUG nova.compute.manager [req-b1460b80-af82-48f1-b4a9-12b3b8c133e2 req-a2de678e-a96e-4e0d-aa4d-678ab0c480cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Received event network-vif-deleted-71b92e93-ebb3-4664-a6af-06e2b936453f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.153 221324 DEBUG oslo_concurrency.lockutils [None req-8aa7e101-12be-4c61-99e3-c08ed3b24e6b cfaebb011a374541b083e772a6c83f25 06b5fc9cfd4c49abb2d8b9f2f8a82c1f - - default default] Lock "13eea30b-7972-4da3-a3dd-8d068b401d22" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.423 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.424 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.424 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.424 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.425 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:14.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.810 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1971631406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:14 np0005603622 nova_compute[221301]: 2026-01-31 08:58:14.925 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.120 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.121 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4246MB free_disk=20.947315216064453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.121 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.122 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.243 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.243 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.265 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:15 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1504422982' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.666 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.672 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.691 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.717 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:58:15 np0005603622 nova_compute[221301]: 2026-01-31 08:58:15.718 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:16.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:16 np0005603622 nova_compute[221301]: 2026-01-31 08:58:16.719 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:16.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:17 np0005603622 nova_compute[221301]: 2026-01-31 08:58:17.229 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e377 e377: 3 total, 3 up, 3 in
Jan 31 03:58:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:18.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:18 np0005603622 podman[297167]: 2026-01-31 08:58:18.139321636 +0000 UTC m=+0.061823676 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 03:58:18 np0005603622 podman[297166]: 2026-01-31 08:58:18.155753876 +0000 UTC m=+0.085443478 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 03:58:18 np0005603622 nova_compute[221301]: 2026-01-31 08:58:18.387 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849883.3864048, a36a041b-d513-4688-8abc-0ae4f25b2b9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:18 np0005603622 nova_compute[221301]: 2026-01-31 08:58:18.387 221324 INFO nova.compute.manager [-] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:58:18 np0005603622 nova_compute[221301]: 2026-01-31 08:58:18.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:18 np0005603622 nova_compute[221301]: 2026-01-31 08:58:18.415 221324 DEBUG nova.compute.manager [None req-6a2d7735-97a5-4446-80e2-41efc3528b0a - - - - - -] [instance: a36a041b-d513-4688-8abc-0ae4f25b2b9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:18.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:19 np0005603622 nova_compute[221301]: 2026-01-31 08:58:19.812 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:20.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:20 np0005603622 nova_compute[221301]: 2026-01-31 08:58:20.229 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:20 np0005603622 nova_compute[221301]: 2026-01-31 08:58:20.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:58:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:58:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:58:20 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:58:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:58:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:20.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:58:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:22.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:22.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:24.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:24 np0005603622 nova_compute[221301]: 2026-01-31 08:58:24.704 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849889.702542, 13eea30b-7972-4da3-a3dd-8d068b401d22 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:24 np0005603622 nova_compute[221301]: 2026-01-31 08:58:24.704 221324 INFO nova.compute.manager [-] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:58:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:24.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:24 np0005603622 nova_compute[221301]: 2026-01-31 08:58:24.783 221324 DEBUG nova.compute.manager [None req-10b178ae-fa43-45e9-ade4-b4a433b247a5 - - - - - -] [instance: 13eea30b-7972-4da3-a3dd-8d068b401d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:24 np0005603622 nova_compute[221301]: 2026-01-31 08:58:24.816 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:25 np0005603622 nova_compute[221301]: 2026-01-31 08:58:25.232 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:26.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:26 np0005603622 nova_compute[221301]: 2026-01-31 08:58:26.249 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:26.250 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:26.251 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:58:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:26.252 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:58:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:58:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:26.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:28.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:28.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:29 np0005603622 nova_compute[221301]: 2026-01-31 08:58:29.818 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 03:58:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3856530744' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 03:58:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 03:58:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3856530744' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 03:58:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:30.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:30 np0005603622 nova_compute[221301]: 2026-01-31 08:58:30.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:30.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:30.917 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:30.918 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:30.918 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:32.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:32.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:34.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:34 np0005603622 nova_compute[221301]: 2026-01-31 08:58:34.822 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:35 np0005603622 nova_compute[221301]: 2026-01-31 08:58:35.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:36.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:36.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:38.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:38 np0005603622 nova_compute[221301]: 2026-01-31 08:58:38.584 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:38 np0005603622 nova_compute[221301]: 2026-01-31 08:58:38.584 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:38 np0005603622 nova_compute[221301]: 2026-01-31 08:58:38.607 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:58:38 np0005603622 nova_compute[221301]: 2026-01-31 08:58:38.751 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:38 np0005603622 nova_compute[221301]: 2026-01-31 08:58:38.752 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:38 np0005603622 nova_compute[221301]: 2026-01-31 08:58:38.763 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:58:38 np0005603622 nova_compute[221301]: 2026-01-31 08:58:38.764 221324 INFO nova.compute.claims [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:58:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:58:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:38.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:58:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e378 e378: 3 total, 3 up, 3 in
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.004 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:58:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3310859741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.428 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.434 221324 DEBUG nova.compute.provider_tree [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.454 221324 DEBUG nova.scheduler.client.report [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.495 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.496 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.578 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.579 221324 DEBUG nova.network.neutron [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.614 221324 INFO nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.645 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.825 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.832 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.833 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.834 221324 INFO nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Creating image(s)#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.866 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.891 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.918 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.922 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.976 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.977 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.978 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:39 np0005603622 nova_compute[221301]: 2026-01-31 08:58:39.978 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.001 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.005 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.025 221324 DEBUG nova.policy [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '802db4935c5046888527023acbf8d3f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '790b9059378345aca91bca183d37da86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:58:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:40.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.236 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.298 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.375 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] resizing rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.490 221324 DEBUG nova.objects.instance [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'migration_context' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.515 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.516 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Ensure instance console log exists: /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.516 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.516 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:40 np0005603622 nova_compute[221301]: 2026-01-31 08:58:40.517 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:40.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:41 np0005603622 nova_compute[221301]: 2026-01-31 08:58:41.453 221324 DEBUG nova.network.neutron [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Successfully created port: 318c754a-d9a0-4e76-9f79-be26ad06e2f7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:58:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:42.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:43 np0005603622 nova_compute[221301]: 2026-01-31 08:58:43.420 221324 DEBUG nova.network.neutron [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Successfully updated port: 318c754a-d9a0-4e76-9f79-be26ad06e2f7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:58:43 np0005603622 nova_compute[221301]: 2026-01-31 08:58:43.441 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:58:43 np0005603622 nova_compute[221301]: 2026-01-31 08:58:43.442 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquired lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:58:43 np0005603622 nova_compute[221301]: 2026-01-31 08:58:43.442 221324 DEBUG nova.network.neutron [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:58:43 np0005603622 nova_compute[221301]: 2026-01-31 08:58:43.659 221324 DEBUG nova.compute.manager [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-changed-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:43 np0005603622 nova_compute[221301]: 2026-01-31 08:58:43.659 221324 DEBUG nova.compute.manager [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Refreshing instance network info cache due to event network-changed-318c754a-d9a0-4e76-9f79-be26ad06e2f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:58:43 np0005603622 nova_compute[221301]: 2026-01-31 08:58:43.660 221324 DEBUG oslo_concurrency.lockutils [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:58:44 np0005603622 nova_compute[221301]: 2026-01-31 08:58:44.132 221324 DEBUG nova.network.neutron [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:58:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:44.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:44.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:44 np0005603622 nova_compute[221301]: 2026-01-31 08:58:44.849 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.238 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.590 221324 DEBUG nova.network.neutron [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updating instance_info_cache with network_info: [{"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.633 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Releasing lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.633 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance network_info: |[{"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.633 221324 DEBUG oslo_concurrency.lockutils [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.634 221324 DEBUG nova.network.neutron [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Refreshing network info cache for port 318c754a-d9a0-4e76-9f79-be26ad06e2f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.636 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Start _get_guest_xml network_info=[{"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.640 221324 WARNING nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.645 221324 DEBUG nova.virt.libvirt.host [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.645 221324 DEBUG nova.virt.libvirt.host [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.650 221324 DEBUG nova.virt.libvirt.host [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.650 221324 DEBUG nova.virt.libvirt.host [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.652 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.652 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.652 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.652 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.653 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.653 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.653 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.653 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.653 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.653 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.653 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.654 221324 DEBUG nova.virt.hardware [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:58:45 np0005603622 nova_compute[221301]: 2026-01-31 08:58:45.656 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:58:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/584474774' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.079 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.113 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.119 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:46.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:58:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2499044970' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.561 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.563 221324 DEBUG nova.virt.libvirt.vif [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1156523753',display_name='tempest-TestServerAdvancedOps-server-1156523753',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1156523753',id=188,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='790b9059378345aca91bca183d37da86',ramdisk_id='',reservation_id='r-o65vaun9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1774138384',owner_user_name='tempest-TestServerAdvanced
Ops-1774138384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:58:39Z,user_data=None,user_id='802db4935c5046888527023acbf8d3f5',uuid=7e6bef46-1686-44a0-a3c7-3a5855268d1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.564 221324 DEBUG nova.network.os_vif_util [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converting VIF {"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.565 221324 DEBUG nova.network.os_vif_util [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.566 221324 DEBUG nova.objects.instance [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.602 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <uuid>7e6bef46-1686-44a0-a3c7-3a5855268d1b</uuid>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <name>instance-000000bc</name>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestServerAdvancedOps-server-1156523753</nova:name>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:58:45</nova:creationTime>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:user uuid="802db4935c5046888527023acbf8d3f5">tempest-TestServerAdvancedOps-1774138384-project-member</nova:user>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:project uuid="790b9059378345aca91bca183d37da86">tempest-TestServerAdvancedOps-1774138384</nova:project>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <nova:port uuid="318c754a-d9a0-4e76-9f79-be26ad06e2f7">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <entry name="serial">7e6bef46-1686-44a0-a3c7-3a5855268d1b</entry>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <entry name="uuid">7e6bef46-1686-44a0-a3c7-3a5855268d1b</entry>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk.config">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:32:ae:3e"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <target dev="tap318c754a-d9"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/console.log" append="off"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:58:46 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:58:46 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:58:46 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:58:46 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.603 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Preparing to wait for external event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.603 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.603 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.604 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.604 221324 DEBUG nova.virt.libvirt.vif [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1156523753',display_name='tempest-TestServerAdvancedOps-server-1156523753',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1156523753',id=188,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='790b9059378345aca91bca183d37da86',ramdisk_id='',reservation_id='r-o65vaun9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1774138384',owner_user_name='tempest-TestServerAdvancedOps-1774138384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:58:39Z,user_data=None,user_id='802db4935c5046888527023acbf8d3f5',uuid=7e6bef46-1686-44a0-a3c7-3a5855268d1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.604 221324 DEBUG nova.network.os_vif_util [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converting VIF {"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.605 221324 DEBUG nova.network.os_vif_util [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.605 221324 DEBUG os_vif [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.606 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.606 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.607 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.611 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.611 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap318c754a-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.611 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap318c754a-d9, col_values=(('external_ids', {'iface-id': '318c754a-d9a0-4e76-9f79-be26ad06e2f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:ae:3e', 'vm-uuid': '7e6bef46-1686-44a0-a3c7-3a5855268d1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:46 np0005603622 NetworkManager[49080]: <info>  [1769849926.6140] manager: (tap318c754a-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/394)
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.615 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.618 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.620 221324 INFO os_vif [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9')#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.693 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.694 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.694 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] No VIF found with MAC fa:16:3e:32:ae:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.695 221324 INFO nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Using config drive#033[00m
Jan 31 03:58:46 np0005603622 nova_compute[221301]: 2026-01-31 08:58:46.718 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:58:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:46.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 e379: 3 total, 3 up, 3 in
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.514 221324 INFO nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Creating config drive at /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/disk.config#033[00m
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.520 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjic4ycx5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.656 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjic4ycx5" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.685 221324 DEBUG nova.storage.rbd_utils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] rbd image 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.690 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/disk.config 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.717 221324 DEBUG nova.network.neutron [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updated VIF entry in instance network info cache for port 318c754a-d9a0-4e76-9f79-be26ad06e2f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.718 221324 DEBUG nova.network.neutron [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updating instance_info_cache with network_info: [{"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:58:47 np0005603622 nova_compute[221301]: 2026-01-31 08:58:47.745 221324 DEBUG oslo_concurrency.lockutils [req-8ba4f8f1-2023-4910-afe9-3fbfa8bdd61e req-2073c926-ef89-4ee1-9d89-48200f56b398 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:58:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:58:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:48.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:58:48 np0005603622 nova_compute[221301]: 2026-01-31 08:58:48.154 221324 DEBUG oslo_concurrency.processutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/disk.config 7e6bef46-1686-44a0-a3c7-3a5855268d1b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:58:48 np0005603622 nova_compute[221301]: 2026-01-31 08:58:48.154 221324 INFO nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Deleting local config drive /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b/disk.config because it was imported into RBD.#033[00m
Jan 31 03:58:48 np0005603622 kernel: tap318c754a-d9: entered promiscuous mode
Jan 31 03:58:48 np0005603622 NetworkManager[49080]: <info>  [1769849928.1926] manager: (tap318c754a-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/395)
Jan 31 03:58:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:48Z|00895|binding|INFO|Claiming lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 for this chassis.
Jan 31 03:58:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:48Z|00896|binding|INFO|318c754a-d9a0-4e76-9f79-be26ad06e2f7: Claiming fa:16:3e:32:ae:3e 10.100.0.13
Jan 31 03:58:48 np0005603622 nova_compute[221301]: 2026-01-31 08:58:48.192 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:48 np0005603622 nova_compute[221301]: 2026-01-31 08:58:48.210 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:48.209 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:ae:3e 10.100.0.13'], port_security=['fa:16:3e:32:ae:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7e6bef46-1686-44a0-a3c7-3a5855268d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1477e535-c8a3-448f-9ac2-f508e9938a2b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '790b9059378345aca91bca183d37da86', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4e133553-d1da-478a-9016-9fcf44a0856b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ddedc6-3689-44db-9f7c-74899d692b4c, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=318c754a-d9a0-4e76-9f79-be26ad06e2f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:48.210 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 318c754a-d9a0-4e76-9f79-be26ad06e2f7 in datapath 1477e535-c8a3-448f-9ac2-f508e9938a2b bound to our chassis#033[00m
Jan 31 03:58:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:48.211 139793 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1477e535-c8a3-448f-9ac2-f508e9938a2b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:58:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:48.211 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5ecd40-6758-408c-bbac-73a43f3a4a76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:48 np0005603622 nova_compute[221301]: 2026-01-31 08:58:48.215 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:48Z|00897|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 ovn-installed in OVS
Jan 31 03:58:48 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:48Z|00898|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 up in Southbound
Jan 31 03:58:48 np0005603622 nova_compute[221301]: 2026-01-31 08:58:48.217 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:48 np0005603622 systemd-machined[190406]: New machine qemu-92-instance-000000bc.
Jan 31 03:58:48 np0005603622 systemd[1]: Started Virtual Machine qemu-92-instance-000000bc.
Jan 31 03:58:48 np0005603622 systemd-udevd[297744]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:58:48 np0005603622 NetworkManager[49080]: <info>  [1769849928.2523] device (tap318c754a-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:58:48 np0005603622 NetworkManager[49080]: <info>  [1769849928.2531] device (tap318c754a-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:58:48 np0005603622 podman[297708]: 2026-01-31 08:58:48.277993628 +0000 UTC m=+0.086995790 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 03:58:48 np0005603622 podman[297701]: 2026-01-31 08:58:48.277993558 +0000 UTC m=+0.086916418 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller)
Jan 31 03:58:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:48.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.725 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849929.7253811, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.726 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.755 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.760 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849929.7263653, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.760 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.782 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.786 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:58:49 np0005603622 nova_compute[221301]: 2026-01-31 08:58:49.811 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:58:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:50.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:50 np0005603622 nova_compute[221301]: 2026-01-31 08:58:50.240 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:50.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:51 np0005603622 nova_compute[221301]: 2026-01-31 08:58:51.614 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:52.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:52.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.664 221324 DEBUG nova.compute.manager [req-93821d17-2450-4d6f-8129-79d5072460ac req-78700133-e6f4-4b1b-864b-1c8826cd54b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.665 221324 DEBUG oslo_concurrency.lockutils [req-93821d17-2450-4d6f-8129-79d5072460ac req-78700133-e6f4-4b1b-864b-1c8826cd54b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.666 221324 DEBUG oslo_concurrency.lockutils [req-93821d17-2450-4d6f-8129-79d5072460ac req-78700133-e6f4-4b1b-864b-1c8826cd54b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.667 221324 DEBUG oslo_concurrency.lockutils [req-93821d17-2450-4d6f-8129-79d5072460ac req-78700133-e6f4-4b1b-864b-1c8826cd54b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.667 221324 DEBUG nova.compute.manager [req-93821d17-2450-4d6f-8129-79d5072460ac req-78700133-e6f4-4b1b-864b-1c8826cd54b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Processing event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.668 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.672 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849933.6716762, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.672 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.675 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.681 221324 INFO nova.virt.libvirt.driver [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance spawned successfully.#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.682 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.701 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.705 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.730 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.731 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.731 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.732 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.732 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.733 221324 DEBUG nova.virt.libvirt.driver [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.759 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.886 221324 INFO nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Took 14.05 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.886 221324 DEBUG nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:53 np0005603622 nova_compute[221301]: 2026-01-31 08:58:53.974 221324 INFO nova.compute.manager [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Took 15.26 seconds to build instance.#033[00m
Jan 31 03:58:54 np0005603622 nova_compute[221301]: 2026-01-31 08:58:54.006 221324 DEBUG oslo_concurrency.lockutils [None req-6c1e951e-1ed5-410c-9f49-1b5157b1e298 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:58:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:54.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:58:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:58:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:54.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:58:55 np0005603622 nova_compute[221301]: 2026-01-31 08:58:55.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:58:55 np0005603622 nova_compute[221301]: 2026-01-31 08:58:55.945 221324 DEBUG nova.compute.manager [req-c669e852-1173-4eb9-a747-2cff0d786311 req-1d0990e1-d9e8-4ad5-bfaf-31f924cab6c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:58:55 np0005603622 nova_compute[221301]: 2026-01-31 08:58:55.946 221324 DEBUG oslo_concurrency.lockutils [req-c669e852-1173-4eb9-a747-2cff0d786311 req-1d0990e1-d9e8-4ad5-bfaf-31f924cab6c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:58:55 np0005603622 nova_compute[221301]: 2026-01-31 08:58:55.946 221324 DEBUG oslo_concurrency.lockutils [req-c669e852-1173-4eb9-a747-2cff0d786311 req-1d0990e1-d9e8-4ad5-bfaf-31f924cab6c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:58:55 np0005603622 nova_compute[221301]: 2026-01-31 08:58:55.947 221324 DEBUG oslo_concurrency.lockutils [req-c669e852-1173-4eb9-a747-2cff0d786311 req-1d0990e1-d9e8-4ad5-bfaf-31f924cab6c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:58:55 np0005603622 nova_compute[221301]: 2026-01-31 08:58:55.948 221324 DEBUG nova.compute.manager [req-c669e852-1173-4eb9-a747-2cff0d786311 req-1d0990e1-d9e8-4ad5-bfaf-31f924cab6c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:58:55 np0005603622 nova_compute[221301]: 2026-01-31 08:58:55.948 221324 WARNING nova.compute.manager [req-c669e852-1173-4eb9-a747-2cff0d786311 req-1d0990e1-d9e8-4ad5-bfaf-31f924cab6c3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state active and task_state None.#033[00m
Jan 31 03:58:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:56.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:56 np0005603622 nova_compute[221301]: 2026-01-31 08:58:56.617 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:56.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:58:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:58:58.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:58:58 np0005603622 nova_compute[221301]: 2026-01-31 08:58:58.346 221324 DEBUG nova.objects.instance [None req-a99afeca-0e4d-4eb5-b8d4-b8c261122465 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:58:58 np0005603622 nova_compute[221301]: 2026-01-31 08:58:58.377 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849938.3773365, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:58:58 np0005603622 nova_compute[221301]: 2026-01-31 08:58:58.378 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:58:58 np0005603622 nova_compute[221301]: 2026-01-31 08:58:58.402 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:58:58 np0005603622 nova_compute[221301]: 2026-01-31 08:58:58.408 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:58:58 np0005603622 nova_compute[221301]: 2026-01-31 08:58:58.450 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 31 03:58:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:58:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:58:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:58:58.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:58:59 np0005603622 kernel: tap318c754a-d9 (unregistering): left promiscuous mode
Jan 31 03:58:59 np0005603622 NetworkManager[49080]: <info>  [1769849939.4388] device (tap318c754a-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:58:59 np0005603622 nova_compute[221301]: 2026-01-31 08:58:59.438 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:59Z|00899|binding|INFO|Releasing lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 from this chassis (sb_readonly=0)
Jan 31 03:58:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:59Z|00900|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 down in Southbound
Jan 31 03:58:59 np0005603622 nova_compute[221301]: 2026-01-31 08:58:59.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:59 np0005603622 ovn_controller[130109]: 2026-01-31T08:58:59Z|00901|binding|INFO|Removing iface tap318c754a-d9 ovn-installed in OVS
Jan 31 03:58:59 np0005603622 nova_compute[221301]: 2026-01-31 08:58:59.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:59 np0005603622 nova_compute[221301]: 2026-01-31 08:58:59.449 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:58:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:59.453 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:ae:3e 10.100.0.13'], port_security=['fa:16:3e:32:ae:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7e6bef46-1686-44a0-a3c7-3a5855268d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1477e535-c8a3-448f-9ac2-f508e9938a2b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '790b9059378345aca91bca183d37da86', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4e133553-d1da-478a-9016-9fcf44a0856b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ddedc6-3689-44db-9f7c-74899d692b4c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=318c754a-d9a0-4e76-9f79-be26ad06e2f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:58:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:59.455 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 318c754a-d9a0-4e76-9f79-be26ad06e2f7 in datapath 1477e535-c8a3-448f-9ac2-f508e9938a2b unbound from our chassis#033[00m
Jan 31 03:58:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:59.456 139793 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1477e535-c8a3-448f-9ac2-f508e9938a2b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:58:59 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:58:59.457 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c81f34-d69a-4d29-a671-c29b38903312]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:58:59 np0005603622 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Jan 31 03:58:59 np0005603622 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000bc.scope: Consumed 6.145s CPU time.
Jan 31 03:58:59 np0005603622 systemd-machined[190406]: Machine qemu-92-instance-000000bc terminated.
Jan 31 03:58:59 np0005603622 nova_compute[221301]: 2026-01-31 08:58:59.572 221324 DEBUG nova.compute.manager [None req-a99afeca-0e4d-4eb5-b8d4-b8c261122465 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:00.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:00 np0005603622 nova_compute[221301]: 2026-01-31 08:59:00.243 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:00.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:01 np0005603622 nova_compute[221301]: 2026-01-31 08:59:01.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:01 np0005603622 nova_compute[221301]: 2026-01-31 08:59:01.898 221324 INFO nova.compute.manager [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Resuming#033[00m
Jan 31 03:59:01 np0005603622 nova_compute[221301]: 2026-01-31 08:59:01.898 221324 DEBUG nova.objects.instance [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'flavor' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:01 np0005603622 nova_compute[221301]: 2026-01-31 08:59:01.966 221324 DEBUG oslo_concurrency.lockutils [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:59:01 np0005603622 nova_compute[221301]: 2026-01-31 08:59:01.966 221324 DEBUG oslo_concurrency.lockutils [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquired lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:59:01 np0005603622 nova_compute[221301]: 2026-01-31 08:59:01.966 221324 DEBUG nova.network.neutron [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:59:02 np0005603622 nova_compute[221301]: 2026-01-31 08:59:02.113 221324 DEBUG nova.compute.manager [req-55eb4f65-1ae3-4925-9dd6-d5e068ead95d req-526fddf7-dd48-4f14-ac7f-16145a944df4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:02 np0005603622 nova_compute[221301]: 2026-01-31 08:59:02.114 221324 DEBUG oslo_concurrency.lockutils [req-55eb4f65-1ae3-4925-9dd6-d5e068ead95d req-526fddf7-dd48-4f14-ac7f-16145a944df4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:02 np0005603622 nova_compute[221301]: 2026-01-31 08:59:02.114 221324 DEBUG oslo_concurrency.lockutils [req-55eb4f65-1ae3-4925-9dd6-d5e068ead95d req-526fddf7-dd48-4f14-ac7f-16145a944df4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:02 np0005603622 nova_compute[221301]: 2026-01-31 08:59:02.115 221324 DEBUG oslo_concurrency.lockutils [req-55eb4f65-1ae3-4925-9dd6-d5e068ead95d req-526fddf7-dd48-4f14-ac7f-16145a944df4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:02 np0005603622 nova_compute[221301]: 2026-01-31 08:59:02.115 221324 DEBUG nova.compute.manager [req-55eb4f65-1ae3-4925-9dd6-d5e068ead95d req-526fddf7-dd48-4f14-ac7f-16145a944df4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:02 np0005603622 nova_compute[221301]: 2026-01-31 08:59:02.116 221324 WARNING nova.compute.manager [req-55eb4f65-1ae3-4925-9dd6-d5e068ead95d req-526fddf7-dd48-4f14-ac7f-16145a944df4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 31 03:59:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:02.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:02.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:04.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:04 np0005603622 nova_compute[221301]: 2026-01-31 08:59:04.426 221324 DEBUG nova.compute.manager [req-1dca2f7d-7f6d-467a-a543-c75f547d41ce req-cb9f439a-1aef-4c47-9af0-bf5f65f8fe5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:04 np0005603622 nova_compute[221301]: 2026-01-31 08:59:04.426 221324 DEBUG oslo_concurrency.lockutils [req-1dca2f7d-7f6d-467a-a543-c75f547d41ce req-cb9f439a-1aef-4c47-9af0-bf5f65f8fe5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:04 np0005603622 nova_compute[221301]: 2026-01-31 08:59:04.426 221324 DEBUG oslo_concurrency.lockutils [req-1dca2f7d-7f6d-467a-a543-c75f547d41ce req-cb9f439a-1aef-4c47-9af0-bf5f65f8fe5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:04 np0005603622 nova_compute[221301]: 2026-01-31 08:59:04.427 221324 DEBUG oslo_concurrency.lockutils [req-1dca2f7d-7f6d-467a-a543-c75f547d41ce req-cb9f439a-1aef-4c47-9af0-bf5f65f8fe5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:04 np0005603622 nova_compute[221301]: 2026-01-31 08:59:04.427 221324 DEBUG nova.compute.manager [req-1dca2f7d-7f6d-467a-a543-c75f547d41ce req-cb9f439a-1aef-4c47-9af0-bf5f65f8fe5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:04 np0005603622 nova_compute[221301]: 2026-01-31 08:59:04.427 221324 WARNING nova.compute.manager [req-1dca2f7d-7f6d-467a-a543-c75f547d41ce req-cb9f439a-1aef-4c47-9af0-bf5f65f8fe5a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 31 03:59:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:04.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.111 221324 DEBUG nova.network.neutron [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updating instance_info_cache with network_info: [{"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.145 221324 DEBUG oslo_concurrency.lockutils [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Releasing lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.150 221324 DEBUG nova.virt.libvirt.vif [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1156523753',display_name='tempest-TestServerAdvancedOps-server-1156523753',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1156523753',id=188,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='790b9059378345aca91bca183d37da86',ramdisk_id='',reservation_id='r-o65vaun9',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1774138384',owner_user_name='tempest-TestServerAdvancedOps-1774138384-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:58:59Z,user_data=None,user_id='802db4935c5046888527023acbf8d3f5',uuid=7e6bef46-1686-44a0-a3c7-3a5855268d1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.150 221324 DEBUG nova.network.os_vif_util [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converting VIF {"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.151 221324 DEBUG nova.network.os_vif_util [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.151 221324 DEBUG os_vif [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.152 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.152 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.152 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.155 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.156 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap318c754a-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.156 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap318c754a-d9, col_values=(('external_ids', {'iface-id': '318c754a-d9a0-4e76-9f79-be26ad06e2f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:ae:3e', 'vm-uuid': '7e6bef46-1686-44a0-a3c7-3a5855268d1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.156 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.157 221324 INFO os_vif [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9')#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.198 221324 DEBUG nova.objects.instance [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.244 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:05 np0005603622 kernel: tap318c754a-d9: entered promiscuous mode
Jan 31 03:59:05 np0005603622 NetworkManager[49080]: <info>  [1769849945.2788] manager: (tap318c754a-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/396)
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.280 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:05Z|00902|binding|INFO|Claiming lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 for this chassis.
Jan 31 03:59:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:05Z|00903|binding|INFO|318c754a-d9a0-4e76-9f79-be26ad06e2f7: Claiming fa:16:3e:32:ae:3e 10.100.0.13
Jan 31 03:59:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:05Z|00904|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 ovn-installed in OVS
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.287 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.291 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:05 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:05Z|00905|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 up in Southbound
Jan 31 03:59:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:05.297 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:ae:3e 10.100.0.13'], port_security=['fa:16:3e:32:ae:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7e6bef46-1686-44a0-a3c7-3a5855268d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1477e535-c8a3-448f-9ac2-f508e9938a2b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '790b9059378345aca91bca183d37da86', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4e133553-d1da-478a-9016-9fcf44a0856b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ddedc6-3689-44db-9f7c-74899d692b4c, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=318c754a-d9a0-4e76-9f79-be26ad06e2f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:05.298 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 318c754a-d9a0-4e76-9f79-be26ad06e2f7 in datapath 1477e535-c8a3-448f-9ac2-f508e9938a2b bound to our chassis#033[00m
Jan 31 03:59:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:05.299 139793 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1477e535-c8a3-448f-9ac2-f508e9938a2b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:59:05 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:05.300 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[12497428-f452-4079-89ab-3dfcfb7292c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:05 np0005603622 systemd-udevd[297840]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:59:05 np0005603622 systemd-machined[190406]: New machine qemu-93-instance-000000bc.
Jan 31 03:59:05 np0005603622 NetworkManager[49080]: <info>  [1769849945.3156] device (tap318c754a-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:59:05 np0005603622 NetworkManager[49080]: <info>  [1769849945.3160] device (tap318c754a-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:59:05 np0005603622 systemd[1]: Started Virtual Machine qemu-93-instance-000000bc.
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.730 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 7e6bef46-1686-44a0-a3c7-3a5855268d1b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.731 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849945.7304592, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.732 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.742 221324 DEBUG nova.compute.manager [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.743 221324 DEBUG nova.objects.instance [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.764 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.768 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:59:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.936 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.937 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849945.7341325, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.937 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.943 221324 INFO nova.virt.libvirt.driver [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance running successfully.#033[00m
Jan 31 03:59:05 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.947 221324 DEBUG nova.virt.libvirt.guest [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:59:05 np0005603622 nova_compute[221301]: 2026-01-31 08:59:05.948 221324 DEBUG nova.compute.manager [None req-1c4337e4-7d49-4a86-9d7a-a2dcc63abf24 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:06 np0005603622 nova_compute[221301]: 2026-01-31 08:59:06.000 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:06 np0005603622 nova_compute[221301]: 2026-01-31 08:59:06.005 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:59:06 np0005603622 nova_compute[221301]: 2026-01-31 08:59:06.139 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 31 03:59:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:06.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:06 np0005603622 nova_compute[221301]: 2026-01-31 08:59:06.673 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:06.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:08.011 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:08 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:08.012 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 03:59:08 np0005603622 nova_compute[221301]: 2026-01-31 08:59:08.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:08.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:08.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.483 221324 DEBUG nova.objects.instance [None req-c1ba9f33-cd16-4f9f-b8d5-c6f97a42f887 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.513 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849949.512904, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.513 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Paused (Lifecycle Event)#033[00m
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.544 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.547 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:59:09 np0005603622 nova_compute[221301]: 2026-01-31 08:59:09.574 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 31 03:59:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:10.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:10 np0005603622 kernel: tap318c754a-d9 (unregistering): left promiscuous mode
Jan 31 03:59:10 np0005603622 NetworkManager[49080]: <info>  [1769849950.4741] device (tap318c754a-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:59:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:10Z|00906|binding|INFO|Releasing lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 from this chassis (sb_readonly=0)
Jan 31 03:59:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:10Z|00907|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 down in Southbound
Jan 31 03:59:10 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:10Z|00908|binding|INFO|Removing iface tap318c754a-d9 ovn-installed in OVS
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.480 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.481 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.490 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:10.492 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:ae:3e 10.100.0.13'], port_security=['fa:16:3e:32:ae:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7e6bef46-1686-44a0-a3c7-3a5855268d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1477e535-c8a3-448f-9ac2-f508e9938a2b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '790b9059378345aca91bca183d37da86', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4e133553-d1da-478a-9016-9fcf44a0856b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ddedc6-3689-44db-9f7c-74899d692b4c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=318c754a-d9a0-4e76-9f79-be26ad06e2f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:10.493 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 318c754a-d9a0-4e76-9f79-be26ad06e2f7 in datapath 1477e535-c8a3-448f-9ac2-f508e9938a2b unbound from our chassis#033[00m
Jan 31 03:59:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:10.493 139793 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1477e535-c8a3-448f-9ac2-f508e9938a2b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:59:10 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:10.494 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f6549253-6d76-40b7-8c64-6e03f9092bb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:10 np0005603622 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Jan 31 03:59:10 np0005603622 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000bc.scope: Consumed 4.170s CPU time.
Jan 31 03:59:10 np0005603622 systemd-machined[190406]: Machine qemu-93-instance-000000bc terminated.
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.549 221324 DEBUG nova.compute.manager [req-33389c6d-dd7d-45d7-9b3f-f91132d8fb19 req-3ea893d9-f599-42b5-9960-1d836fba4822 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.549 221324 DEBUG oslo_concurrency.lockutils [req-33389c6d-dd7d-45d7-9b3f-f91132d8fb19 req-3ea893d9-f599-42b5-9960-1d836fba4822 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.550 221324 DEBUG oslo_concurrency.lockutils [req-33389c6d-dd7d-45d7-9b3f-f91132d8fb19 req-3ea893d9-f599-42b5-9960-1d836fba4822 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.550 221324 DEBUG oslo_concurrency.lockutils [req-33389c6d-dd7d-45d7-9b3f-f91132d8fb19 req-3ea893d9-f599-42b5-9960-1d836fba4822 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.550 221324 DEBUG nova.compute.manager [req-33389c6d-dd7d-45d7-9b3f-f91132d8fb19 req-3ea893d9-f599-42b5-9960-1d836fba4822 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.550 221324 WARNING nova.compute.manager [req-33389c6d-dd7d-45d7-9b3f-f91132d8fb19 req-3ea893d9-f599-42b5-9960-1d836fba4822 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state active and task_state suspending.#033[00m
Jan 31 03:59:10 np0005603622 nova_compute[221301]: 2026-01-31 08:59:10.675 221324 DEBUG nova.compute.manager [None req-c1ba9f33-cd16-4f9f-b8d5-c6f97a42f887 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:10.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:11 np0005603622 nova_compute[221301]: 2026-01-31 08:59:11.675 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:12.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.763 221324 DEBUG nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.764 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.764 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.765 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.765 221324 DEBUG nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.765 221324 WARNING nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.765 221324 DEBUG nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.765 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.766 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.766 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.766 221324 DEBUG nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.766 221324 WARNING nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.766 221324 DEBUG nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.767 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.767 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.767 221324 DEBUG oslo_concurrency.lockutils [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.767 221324 DEBUG nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.767 221324 WARNING nova.compute.manager [req-559f6dcf-6f5e-4d8c-bf3f-fb44b4cc8852 req-ea2062cf-dd00-4168-b556-f9726aafa13d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state suspended and task_state None.#033[00m
Jan 31 03:59:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:12.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.882 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.883 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.883 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 03:59:12 np0005603622 nova_compute[221301]: 2026-01-31 08:59:12.883 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:13 np0005603622 nova_compute[221301]: 2026-01-31 08:59:13.212 221324 INFO nova.compute.manager [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Resuming#033[00m
Jan 31 03:59:13 np0005603622 nova_compute[221301]: 2026-01-31 08:59:13.214 221324 DEBUG nova.objects.instance [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'flavor' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:13 np0005603622 nova_compute[221301]: 2026-01-31 08:59:13.266 221324 DEBUG oslo_concurrency.lockutils [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #160. Immutable memtables: 0.
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.290229) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 101] Flushing memtable with next log file: 160
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953290327, "job": 101, "event": "flush_started", "num_memtables": 1, "num_entries": 2420, "num_deletes": 253, "total_data_size": 5420333, "memory_usage": 5506224, "flush_reason": "Manual Compaction"}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 101] Level-0 flush table #161: started
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953348357, "cf_name": "default", "job": 101, "event": "table_file_creation", "file_number": 161, "file_size": 3539115, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 77133, "largest_seqno": 79548, "table_properties": {"data_size": 3529531, "index_size": 5949, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 21069, "raw_average_key_size": 20, "raw_value_size": 3509874, "raw_average_value_size": 3458, "num_data_blocks": 258, "num_entries": 1015, "num_filter_entries": 1015, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849761, "oldest_key_time": 1769849761, "file_creation_time": 1769849953, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 161, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 101] Flush lasted 58200 microseconds, and 6350 cpu microseconds.
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.348418) [db/flush_job.cc:967] [default] [JOB 101] Level-0 flush table #161: 3539115 bytes OK
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.348447) [db/memtable_list.cc:519] [default] Level-0 commit table #161 started
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.356666) [db/memtable_list.cc:722] [default] Level-0 commit table #161: memtable #1 done
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.356691) EVENT_LOG_v1 {"time_micros": 1769849953356685, "job": 101, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.356708) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 101] Try to delete WAL files size 5409680, prev total WAL file size 5409680, number of live WAL files 2.
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000157.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.357443) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036373737' seq:72057594037927935, type:22 .. '7061786F730037303239' seq:0, type:0; will stop at (end)
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 102] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 101 Base level 0, inputs: [161(3456KB)], [159(10179KB)]
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953357474, "job": 102, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [161], "files_L6": [159], "score": -1, "input_data_size": 13962963, "oldest_snapshot_seqno": -1}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 102] Generated table #162: 10003 keys, 12048223 bytes, temperature: kUnknown
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953589879, "cf_name": "default", "job": 102, "event": "table_file_creation", "file_number": 162, "file_size": 12048223, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11985133, "index_size": 36993, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25029, "raw_key_size": 263284, "raw_average_key_size": 26, "raw_value_size": 11811735, "raw_average_value_size": 1180, "num_data_blocks": 1404, "num_entries": 10003, "num_filter_entries": 10003, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769849953, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 162, "seqno_to_time_mapping": "N/A"}}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.590193) [db/compaction/compaction_job.cc:1663] [default] [JOB 102] Compacted 1@0 + 1@6 files to L6 => 12048223 bytes
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.608545) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 60.0 rd, 51.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 9.9 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(7.3) write-amplify(3.4) OK, records in: 10530, records dropped: 527 output_compression: NoCompression
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.608581) EVENT_LOG_v1 {"time_micros": 1769849953608567, "job": 102, "event": "compaction_finished", "compaction_time_micros": 232523, "compaction_time_cpu_micros": 21706, "output_level": 6, "num_output_files": 1, "total_output_size": 12048223, "num_input_records": 10530, "num_output_records": 10003, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000161.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953609273, "job": 102, "event": "table_file_deletion", "file_number": 161}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000159.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769849953610270, "job": 102, "event": "table_file_deletion", "file_number": 159}
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.357392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.610545) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.610552) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.610555) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.610558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:13 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-08:59:13.610560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 03:59:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:14.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:14.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:15.013 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.249 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.443 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updating instance_info_cache with network_info: [{"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.506 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.506 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.507 221324 DEBUG oslo_concurrency.lockutils [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquired lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.507 221324 DEBUG nova.network.neutron [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.509 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.509 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.509 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.510 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.646 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.646 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.647 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.647 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 03:59:15 np0005603622 nova_compute[221301]: 2026-01-31 08:59:15.647 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:16 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2127911267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.118 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:16.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.322 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.323 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.478 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.479 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4235MB free_disk=20.946487426757812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.480 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.480 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.679 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:16.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.946 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 7e6bef46-1686-44a0-a3c7-3a5855268d1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.947 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 03:59:16 np0005603622 nova_compute[221301]: 2026-01-31 08:59:16.947 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 03:59:17 np0005603622 nova_compute[221301]: 2026-01-31 08:59:17.081 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3455106327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:17 np0005603622 nova_compute[221301]: 2026-01-31 08:59:17.682 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.601s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:17 np0005603622 nova_compute[221301]: 2026-01-31 08:59:17.690 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:59:17 np0005603622 nova_compute[221301]: 2026-01-31 08:59:17.727 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:59:17 np0005603622 nova_compute[221301]: 2026-01-31 08:59:17.786 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 03:59:17 np0005603622 nova_compute[221301]: 2026-01-31 08:59:17.786 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:18.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.244 221324 DEBUG nova.network.neutron [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updating instance_info_cache with network_info: [{"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.281 221324 DEBUG oslo_concurrency.lockutils [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Releasing lock "refresh_cache-7e6bef46-1686-44a0-a3c7-3a5855268d1b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.288 221324 DEBUG nova.virt.libvirt.vif [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1156523753',display_name='tempest-TestServerAdvancedOps-server-1156523753',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1156523753',id=188,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='790b9059378345aca91bca183d37da86',ramdisk_id='',reservation_id='r-o65vaun9',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1774138384',owner_user_name='tempest-TestServerAdvancedOps-1774138384-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:59:10Z,user_data=None,user_id='802db4935c5046888527023acbf8d3f5',uuid=7e6bef46-1686-44a0-a3c7-3a5855268d1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.288 221324 DEBUG nova.network.os_vif_util [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converting VIF {"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.289 221324 DEBUG nova.network.os_vif_util [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.289 221324 DEBUG os_vif [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.290 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.291 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.291 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.295 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.295 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap318c754a-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.296 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap318c754a-d9, col_values=(('external_ids', {'iface-id': '318c754a-d9a0-4e76-9f79-be26ad06e2f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:ae:3e', 'vm-uuid': '7e6bef46-1686-44a0-a3c7-3a5855268d1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.297 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.297 221324 INFO os_vif [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9')#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.424 221324 DEBUG nova.objects.instance [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:18 np0005603622 kernel: tap318c754a-d9: entered promiscuous mode
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:18 np0005603622 NetworkManager[49080]: <info>  [1769849958.5082] manager: (tap318c754a-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/397)
Jan 31 03:59:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:18Z|00909|binding|INFO|Claiming lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 for this chassis.
Jan 31 03:59:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:18Z|00910|binding|INFO|318c754a-d9a0-4e76-9f79-be26ad06e2f7: Claiming fa:16:3e:32:ae:3e 10.100.0.13
Jan 31 03:59:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:18Z|00911|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 ovn-installed in OVS
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.517 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:18 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:18Z|00912|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 up in Southbound
Jan 31 03:59:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:18.520 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:ae:3e 10.100.0.13'], port_security=['fa:16:3e:32:ae:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7e6bef46-1686-44a0-a3c7-3a5855268d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1477e535-c8a3-448f-9ac2-f508e9938a2b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '790b9059378345aca91bca183d37da86', 'neutron:revision_number': '7', 'neutron:security_group_ids': '4e133553-d1da-478a-9016-9fcf44a0856b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ddedc6-3689-44db-9f7c-74899d692b4c, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=318c754a-d9a0-4e76-9f79-be26ad06e2f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:18.521 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 318c754a-d9a0-4e76-9f79-be26ad06e2f7 in datapath 1477e535-c8a3-448f-9ac2-f508e9938a2b bound to our chassis#033[00m
Jan 31 03:59:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:18.522 139793 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1477e535-c8a3-448f-9ac2-f508e9938a2b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:59:18 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:18.523 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3be640bf-6dce-4790-a493-82d115e07e66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:18 np0005603622 systemd-machined[190406]: New machine qemu-94-instance-000000bc.
Jan 31 03:59:18 np0005603622 systemd[1]: Started Virtual Machine qemu-94-instance-000000bc.
Jan 31 03:59:18 np0005603622 systemd-udevd[297993]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:59:18 np0005603622 NetworkManager[49080]: <info>  [1769849958.5725] device (tap318c754a-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:59:18 np0005603622 NetworkManager[49080]: <info>  [1769849958.5736] device (tap318c754a-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:59:18 np0005603622 podman[297969]: 2026-01-31 08:59:18.604048236 +0000 UTC m=+0.071909967 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:59:18 np0005603622 podman[297968]: 2026-01-31 08:59:18.620055994 +0000 UTC m=+0.088061188 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.782 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:18.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.970 221324 DEBUG nova.compute.manager [req-9171025d-f730-4388-943e-d0f2c3155787 req-de98448c-e51e-41ad-b8d7-66feffb12c16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.971 221324 DEBUG oslo_concurrency.lockutils [req-9171025d-f730-4388-943e-d0f2c3155787 req-de98448c-e51e-41ad-b8d7-66feffb12c16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.971 221324 DEBUG oslo_concurrency.lockutils [req-9171025d-f730-4388-943e-d0f2c3155787 req-de98448c-e51e-41ad-b8d7-66feffb12c16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.972 221324 DEBUG oslo_concurrency.lockutils [req-9171025d-f730-4388-943e-d0f2c3155787 req-de98448c-e51e-41ad-b8d7-66feffb12c16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.972 221324 DEBUG nova.compute.manager [req-9171025d-f730-4388-943e-d0f2c3155787 req-de98448c-e51e-41ad-b8d7-66feffb12c16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:18 np0005603622 nova_compute[221301]: 2026-01-31 08:59:18.972 221324 WARNING nova.compute.manager [req-9171025d-f730-4388-943e-d0f2c3155787 req-de98448c-e51e-41ad-b8d7-66feffb12c16 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.173 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 7e6bef46-1686-44a0-a3c7-3a5855268d1b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.174 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849959.1731856, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.175 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Started (Lifecycle Event)#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.186 221324 DEBUG nova.compute.manager [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.186 221324 DEBUG nova.objects.instance [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.209 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.213 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.217 221324 INFO nova.virt.libvirt.driver [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance running successfully.#033[00m
Jan 31 03:59:19 np0005603622 virtqemud[220925]: argument unsupported: QEMU guest agent is not configured
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.219 221324 DEBUG nova.virt.libvirt.guest [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.219 221324 DEBUG nova.compute.manager [None req-babb3f14-8849-435c-9ba7-1d59c66f052b 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.267 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.268 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849959.177846, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.268 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Resumed (Lifecycle Event)#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.304 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:19 np0005603622 nova_compute[221301]: 2026-01-31 08:59:19.308 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 03:59:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:20.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.263 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.614 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.614 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.615 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.615 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.615 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.616 221324 INFO nova.compute.manager [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Terminating instance#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.617 221324 DEBUG nova.compute.manager [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 03:59:20 np0005603622 kernel: tap318c754a-d9 (unregistering): left promiscuous mode
Jan 31 03:59:20 np0005603622 NetworkManager[49080]: <info>  [1769849960.6900] device (tap318c754a-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 03:59:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:20Z|00913|binding|INFO|Releasing lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 from this chassis (sb_readonly=0)
Jan 31 03:59:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:20Z|00914|binding|INFO|Setting lport 318c754a-d9a0-4e76-9f79-be26ad06e2f7 down in Southbound
Jan 31 03:59:20 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:20Z|00915|binding|INFO|Removing iface tap318c754a-d9 ovn-installed in OVS
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.695 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:20.702 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:ae:3e 10.100.0.13'], port_security=['fa:16:3e:32:ae:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7e6bef46-1686-44a0-a3c7-3a5855268d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1477e535-c8a3-448f-9ac2-f508e9938a2b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '790b9059378345aca91bca183d37da86', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4e133553-d1da-478a-9016-9fcf44a0856b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ddedc6-3689-44db-9f7c-74899d692b4c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=318c754a-d9a0-4e76-9f79-be26ad06e2f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.703 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:20.704 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 318c754a-d9a0-4e76-9f79-be26ad06e2f7 in datapath 1477e535-c8a3-448f-9ac2-f508e9938a2b unbound from our chassis#033[00m
Jan 31 03:59:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:20.705 139793 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1477e535-c8a3-448f-9ac2-f508e9938a2b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 31 03:59:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:20.707 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[50d52a3f-35f6-4fd9-adb9-fd2ad4f17270]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:20 np0005603622 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Jan 31 03:59:20 np0005603622 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000bc.scope: Consumed 1.941s CPU time.
Jan 31 03:59:20 np0005603622 systemd-machined[190406]: Machine qemu-94-instance-000000bc terminated.
Jan 31 03:59:20 np0005603622 kernel: tap318c754a-d9: entered promiscuous mode
Jan 31 03:59:20 np0005603622 kernel: tap318c754a-d9 (unregistering): left promiscuous mode
Jan 31 03:59:20 np0005603622 NetworkManager[49080]: <info>  [1769849960.8405] manager: (tap318c754a-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/398)
Jan 31 03:59:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:59:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:20.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.845 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.857 221324 INFO nova.virt.libvirt.driver [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Instance destroyed successfully.#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.857 221324 DEBUG nova.objects.instance [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lazy-loading 'resources' on Instance uuid 7e6bef46-1686-44a0-a3c7-3a5855268d1b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.910 221324 DEBUG nova.virt.libvirt.vif [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1156523753',display_name='tempest-TestServerAdvancedOps-server-1156523753',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1156523753',id=188,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:58:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='790b9059378345aca91bca183d37da86',ramdisk_id='',reservation_id='r-o65vaun9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-1774138384',owner_user_name='tempest-TestServerAdvancedOps-1774138384-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T08:59:19Z,user_data=None,user_id='802db4935c5046888527023acbf8d3f5',uuid=7e6bef46-1686-44a0-a3c7-3a5855268d1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.912 221324 DEBUG nova.network.os_vif_util [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converting VIF {"id": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "address": "fa:16:3e:32:ae:3e", "network": {"id": "1477e535-c8a3-448f-9ac2-f508e9938a2b", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-79261136-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "790b9059378345aca91bca183d37da86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap318c754a-d9", "ovs_interfaceid": "318c754a-d9a0-4e76-9f79-be26ad06e2f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.912 221324 DEBUG nova.network.os_vif_util [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.913 221324 DEBUG os_vif [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.915 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.916 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap318c754a-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.917 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.918 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:20 np0005603622 nova_compute[221301]: 2026-01-31 08:59:20.920 221324 INFO os_vif [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:ae:3e,bridge_name='br-int',has_traffic_filtering=True,id=318c754a-d9a0-4e76-9f79-be26ad06e2f7,network=Network(1477e535-c8a3-448f-9ac2-f508e9938a2b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap318c754a-d9')#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.137 221324 DEBUG nova.compute.manager [req-2da557b3-2f85-4ec4-bde2-638187da3a18 req-bf2b071e-be20-41a1-86d1-bf1a4cefea02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.138 221324 DEBUG oslo_concurrency.lockutils [req-2da557b3-2f85-4ec4-bde2-638187da3a18 req-bf2b071e-be20-41a1-86d1-bf1a4cefea02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.138 221324 DEBUG oslo_concurrency.lockutils [req-2da557b3-2f85-4ec4-bde2-638187da3a18 req-bf2b071e-be20-41a1-86d1-bf1a4cefea02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.139 221324 DEBUG oslo_concurrency.lockutils [req-2da557b3-2f85-4ec4-bde2-638187da3a18 req-bf2b071e-be20-41a1-86d1-bf1a4cefea02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.139 221324 DEBUG nova.compute.manager [req-2da557b3-2f85-4ec4-bde2-638187da3a18 req-bf2b071e-be20-41a1-86d1-bf1a4cefea02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.139 221324 WARNING nova.compute.manager [req-2da557b3-2f85-4ec4-bde2-638187da3a18 req-bf2b071e-be20-41a1-86d1-bf1a4cefea02 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.633 221324 INFO nova.virt.libvirt.driver [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Deleting instance files /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b_del#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.634 221324 INFO nova.virt.libvirt.driver [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Deletion of /var/lib/nova/instances/7e6bef46-1686-44a0-a3c7-3a5855268d1b_del complete#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.762 221324 INFO nova.compute.manager [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Took 1.14 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.763 221324 DEBUG oslo.service.loopingcall [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.764 221324 DEBUG nova.compute.manager [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 03:59:21 np0005603622 nova_compute[221301]: 2026-01-31 08:59:21.764 221324 DEBUG nova.network.neutron [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 03:59:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:22.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:22.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.432 221324 DEBUG nova.compute.manager [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.433 221324 DEBUG oslo_concurrency.lockutils [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.433 221324 DEBUG oslo_concurrency.lockutils [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.433 221324 DEBUG oslo_concurrency.lockutils [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.433 221324 DEBUG nova.compute.manager [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.433 221324 DEBUG nova.compute.manager [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-unplugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.433 221324 DEBUG nova.compute.manager [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.434 221324 DEBUG oslo_concurrency.lockutils [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.434 221324 DEBUG oslo_concurrency.lockutils [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.434 221324 DEBUG oslo_concurrency.lockutils [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.434 221324 DEBUG nova.compute.manager [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] No waiting events found dispatching network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.434 221324 WARNING nova.compute.manager [req-696c1c4c-f3a3-4a64-8f0c-a353384da261 req-a6f1cce7-3bca-4b0b-b68e-14b25e43dfbd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received unexpected event network-vif-plugged-318c754a-d9a0-4e76-9f79-be26ad06e2f7 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.856 221324 DEBUG nova.network.neutron [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:23 np0005603622 nova_compute[221301]: 2026-01-31 08:59:23.914 221324 INFO nova.compute.manager [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Took 2.15 seconds to deallocate network for instance.#033[00m
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.037 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.037 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.099 221324 DEBUG oslo_concurrency.processutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:24.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/614973072' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.526 221324 DEBUG oslo_concurrency.processutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.532 221324 DEBUG nova.compute.provider_tree [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.571 221324 DEBUG nova.scheduler.client.report [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.641 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.764 221324 INFO nova.scheduler.client.report [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Deleted allocations for instance 7e6bef46-1686-44a0-a3c7-3a5855268d1b#033[00m
Jan 31 03:59:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:24.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:24 np0005603622 nova_compute[221301]: 2026-01-31 08:59:24.901 221324 DEBUG oslo_concurrency.lockutils [None req-775d8ce4-7d7e-4a04-8c77-dcb9913fa447 802db4935c5046888527023acbf8d3f5 790b9059378345aca91bca183d37da86 - - default default] Lock "7e6bef46-1686-44a0-a3c7-3a5855268d1b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:25 np0005603622 nova_compute[221301]: 2026-01-31 08:59:25.264 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:25 np0005603622 nova_compute[221301]: 2026-01-31 08:59:25.655 221324 DEBUG nova.compute.manager [req-fa84e61b-9242-40c2-9a9d-e425ee0786f6 req-899a9622-eace-413d-8804-3dd356fb806c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Received event network-vif-deleted-318c754a-d9a0-4e76-9f79-be26ad06e2f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:25 np0005603622 nova_compute[221301]: 2026-01-31 08:59:25.917 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:26.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:26 np0005603622 nova_compute[221301]: 2026-01-31 08:59:26.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:26.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:27 np0005603622 podman[298297]: 2026-01-31 08:59:27.163224071 +0000 UTC m=+0.060052008 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True)
Jan 31 03:59:27 np0005603622 podman[298297]: 2026-01-31 08:59:27.273709139 +0000 UTC m=+0.170537096 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 03:59:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:28.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:28.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 03:59:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 03:59:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:30.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:30 np0005603622 nova_compute[221301]: 2026-01-31 08:59:30.267 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:30.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:30.918 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:30.918 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:30.918 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:30 np0005603622 nova_compute[221301]: 2026-01-31 08:59:30.918 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:32.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e380 e380: 3 total, 3 up, 3 in
Jan 31 03:59:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:32.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:34.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:34.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:35 np0005603622 nova_compute[221301]: 2026-01-31 08:59:35.269 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:35 np0005603622 nova_compute[221301]: 2026-01-31 08:59:35.855 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769849960.8542004, 7e6bef46-1686-44a0-a3c7-3a5855268d1b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:59:35 np0005603622 nova_compute[221301]: 2026-01-31 08:59:35.856 221324 INFO nova.compute.manager [-] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] VM Stopped (Lifecycle Event)#033[00m
Jan 31 03:59:35 np0005603622 nova_compute[221301]: 2026-01-31 08:59:35.891 221324 DEBUG nova.compute.manager [None req-307483fc-3c91-48fe-84e5-390bf6cfc211 - - - - - -] [instance: 7e6bef46-1686-44a0-a3c7-3a5855268d1b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 03:59:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 03:59:35 np0005603622 nova_compute[221301]: 2026-01-31 08:59:35.922 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:36.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:36.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:37 np0005603622 nova_compute[221301]: 2026-01-31 08:59:37.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 03:59:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:38.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.258 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.259 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.293 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.400 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.401 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.415 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.415 221324 INFO nova.compute.claims [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 03:59:38 np0005603622 nova_compute[221301]: 2026-01-31 08:59:38.599 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:38.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 03:59:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3277482350' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.153 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.554s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.160 221324 DEBUG nova.compute.provider_tree [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.299 221324 DEBUG nova.scheduler.client.report [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.404 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.405 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.576 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.577 221324 DEBUG nova.network.neutron [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.785 221324 INFO nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.834 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 03:59:39 np0005603622 nova_compute[221301]: 2026-01-31 08:59:39.941 221324 DEBUG nova.policy [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b97882caec944ca3b26c230fcebff775', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.024 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.025 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.026 221324 INFO nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Creating image(s)#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.056 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.094 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.122 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.126 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "77591e51c65a9b32a7dd2836400e7a772ef7a831" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.128 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "77591e51c65a9b32a7dd2836400e7a772ef7a831" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:40.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.658 221324 DEBUG nova.virt.libvirt.imagebackend [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/be36fc17-1fc9-4dc7-9df3-92082489a43b/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/be36fc17-1fc9-4dc7-9df3-92082489a43b/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 03:59:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:40.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:40 np0005603622 nova_compute[221301]: 2026-01-31 08:59:40.924 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:41 np0005603622 nova_compute[221301]: 2026-01-31 08:59:41.137 221324 DEBUG nova.network.neutron [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Successfully created port: 778471ec-ff3e-4e08-915a-231eee582d31 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 03:59:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:42.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.300 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.387 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.part --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.388 221324 DEBUG nova.virt.images [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] be36fc17-1fc9-4dc7-9df3-92082489a43b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.395 221324 DEBUG nova.privsep.utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.396 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.part /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.417 221324 DEBUG nova.network.neutron [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Successfully updated port: 778471ec-ff3e-4e08-915a-231eee582d31 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.457 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.457 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.457 221324 DEBUG nova.network.neutron [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.749 221324 DEBUG nova.compute.manager [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-changed-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.750 221324 DEBUG nova.compute.manager [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing instance network info cache due to event network-changed-778471ec-ff3e-4e08-915a-231eee582d31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.750 221324 DEBUG oslo_concurrency.lockutils [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 03:59:42 np0005603622 nova_compute[221301]: 2026-01-31 08:59:42.791 221324 DEBUG nova.network.neutron [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 03:59:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 03:59:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:42.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 03:59:43 np0005603622 nova_compute[221301]: 2026-01-31 08:59:43.481 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.part /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.converted" returned: 0 in 1.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:43 np0005603622 nova_compute[221301]: 2026-01-31 08:59:43.485 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:43 np0005603622 nova_compute[221301]: 2026-01-31 08:59:43.545 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831.converted --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:43 np0005603622 nova_compute[221301]: 2026-01-31 08:59:43.547 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "77591e51c65a9b32a7dd2836400e7a772ef7a831" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:43 np0005603622 nova_compute[221301]: 2026-01-31 08:59:43.584 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:43 np0005603622 nova_compute[221301]: 2026-01-31 08:59:43.589 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831 fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:44.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:44.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:45 np0005603622 nova_compute[221301]: 2026-01-31 08:59:45.269 221324 DEBUG nova.network.neutron [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:45 np0005603622 nova_compute[221301]: 2026-01-31 08:59:45.274 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:45 np0005603622 nova_compute[221301]: 2026-01-31 08:59:45.332 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:59:45 np0005603622 nova_compute[221301]: 2026-01-31 08:59:45.333 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance network_info: |[{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 03:59:45 np0005603622 nova_compute[221301]: 2026-01-31 08:59:45.333 221324 DEBUG oslo_concurrency.lockutils [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 03:59:45 np0005603622 nova_compute[221301]: 2026-01-31 08:59:45.334 221324 DEBUG nova.network.neutron [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 03:59:45 np0005603622 nova_compute[221301]: 2026-01-31 08:59:45.927 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:46 np0005603622 nova_compute[221301]: 2026-01-31 08:59:46.017 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831 fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:46 np0005603622 nova_compute[221301]: 2026-01-31 08:59:46.093 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] resizing rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 03:59:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:46.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:46.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:47 np0005603622 nova_compute[221301]: 2026-01-31 08:59:47.880 221324 DEBUG nova.network.neutron [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated VIF entry in instance network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 03:59:47 np0005603622 nova_compute[221301]: 2026-01-31 08:59:47.881 221324 DEBUG nova.network.neutron [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 03:59:47 np0005603622 nova_compute[221301]: 2026-01-31 08:59:47.908 221324 DEBUG oslo_concurrency.lockutils [req-f036f099-5c36-4615-9614-e4dd6e8c3d45 req-9c626182-b51e-4ed5-914e-784735ef6680 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.067 221324 DEBUG nova.objects.instance [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'migration_context' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.084 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.085 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Ensure instance console log exists: /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.085 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.086 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.086 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.088 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Start _get_guest_xml network_info=[{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T08:59:30Z,direct_url=<?>,disk_format='qcow2',id=be36fc17-1fc9-4dc7-9df3-92082489a43b,min_disk=0,min_ram=0,name='tempest-scenario-img--1945406638',owner='3d3e35e8f8144cd58baa994b70ef9668',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T08:59:32Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': 'be36fc17-1fc9-4dc7-9df3-92082489a43b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.096 221324 WARNING nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.107 221324 DEBUG nova.virt.libvirt.host [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.107 221324 DEBUG nova.virt.libvirt.host [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.112 221324 DEBUG nova.virt.libvirt.host [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.112 221324 DEBUG nova.virt.libvirt.host [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.114 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.114 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T08:59:30Z,direct_url=<?>,disk_format='qcow2',id=be36fc17-1fc9-4dc7-9df3-92082489a43b,min_disk=0,min_ram=0,name='tempest-scenario-img--1945406638',owner='3d3e35e8f8144cd58baa994b70ef9668',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T08:59:32Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.114 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.115 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.115 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.115 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.115 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.116 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.116 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.116 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.116 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.116 221324 DEBUG nova.virt.hardware [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.119 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:48.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:59:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/272999330' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.533 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.561 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:48 np0005603622 nova_compute[221301]: 2026-01-31 08:59:48.564 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:48.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 03:59:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3823787373' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 03:59:49 np0005603622 podman[298869]: 2026-01-31 08:59:49.140936487 +0000 UTC m=+0.052892457 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 03:59:49 np0005603622 podman[298868]: 2026-01-31 08:59:49.159964986 +0000 UTC m=+0.075256695 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.239 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.675s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.241 221324 DEBUG nova.virt.libvirt.vif [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:59:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2037009249',display_name='tempest-TestMinimumBasicScenario-server-2037009249',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2037009249',id=191,image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG2CSeIfukotaly0u8Scx5VRDfS/5VQoZ+GmuWQA0f4RO3nieXzBWi6xcpOtR13R9YeVfMpSFWckgvknkMWPxQDOKBZZlOnVFCYvLnmixnYBttsMr77nYzPf84/NYI6UZg==',key_name='tempest-TestMinimumBasicScenario-718281137',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-wpr0dgsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:59:39Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=fd777b00-402f-4a2e-8a77-2b61c7cf5484,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.242 221324 DEBUG nova.network.os_vif_util [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.243 221324 DEBUG nova.network.os_vif_util [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.244 221324 DEBUG nova.objects.instance [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'pci_devices' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.378 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] End _get_guest_xml xml=<domain type="kvm">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <uuid>fd777b00-402f-4a2e-8a77-2b61c7cf5484</uuid>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <name>instance-000000bf</name>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestMinimumBasicScenario-server-2037009249</nova:name>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 08:59:48</nova:creationTime>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:user uuid="b97882caec944ca3b26c230fcebff775">tempest-TestMinimumBasicScenario-1062170996-project-member</nova:user>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:project uuid="3d3e35e8f8144cd58baa994b70ef9668">tempest-TestMinimumBasicScenario-1062170996</nova:project>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="be36fc17-1fc9-4dc7-9df3-92082489a43b"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <nova:port uuid="778471ec-ff3e-4e08-915a-231eee582d31">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <system>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <entry name="serial">fd777b00-402f-4a2e-8a77-2b61c7cf5484</entry>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <entry name="uuid">fd777b00-402f-4a2e-8a77-2b61c7cf5484</entry>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </system>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <os>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  </os>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <features>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  </features>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  </clock>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  <devices>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk.config">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      </source>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      </auth>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </disk>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:82:3e:6f"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <target dev="tap778471ec-ff"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </interface>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/console.log" append="off"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </serial>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <video>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </video>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </rng>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 03:59:49 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 03:59:49 np0005603622 nova_compute[221301]:  </devices>
Jan 31 03:59:49 np0005603622 nova_compute[221301]: </domain>
Jan 31 03:59:49 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.379 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Preparing to wait for external event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.380 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.380 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.380 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.381 221324 DEBUG nova.virt.libvirt.vif [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T08:59:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2037009249',display_name='tempest-TestMinimumBasicScenario-server-2037009249',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2037009249',id=191,image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG2CSeIfukotaly0u8Scx5VRDfS/5VQoZ+GmuWQA0f4RO3nieXzBWi6xcpOtR13R9YeVfMpSFWckgvknkMWPxQDOKBZZlOnVFCYvLnmixnYBttsMr77nYzPf84/NYI6UZg==',key_name='tempest-TestMinimumBasicScenario-718281137',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-wpr0dgsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T08:59:39Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=fd777b00-402f-4a2e-8a77-2b61c7cf5484,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.382 221324 DEBUG nova.network.os_vif_util [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.382 221324 DEBUG nova.network.os_vif_util [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.383 221324 DEBUG os_vif [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.384 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.385 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.389 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.389 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap778471ec-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.390 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap778471ec-ff, col_values=(('external_ids', {'iface-id': '778471ec-ff3e-4e08-915a-231eee582d31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:3e:6f', 'vm-uuid': 'fd777b00-402f-4a2e-8a77-2b61c7cf5484'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.392 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:49 np0005603622 NetworkManager[49080]: <info>  [1769849989.3947] manager: (tap778471ec-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/399)
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.396 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.403 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.405 221324 INFO os_vif [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff')#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.533 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.533 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.534 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No VIF found with MAC fa:16:3e:82:3e:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.535 221324 INFO nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Using config drive#033[00m
Jan 31 03:59:49 np0005603622 nova_compute[221301]: 2026-01-31 08:59:49.575 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.161 221324 INFO nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Creating config drive at /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/disk.config#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.166 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_dw8snin execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:50.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.276 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.296 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp_dw8snin" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.334 221324 DEBUG nova.storage.rbd_utils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.340 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/disk.config fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.509 221324 DEBUG oslo_concurrency.processutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/disk.config fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.511 221324 INFO nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Deleting local config drive /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/disk.config because it was imported into RBD.#033[00m
Jan 31 03:59:50 np0005603622 kernel: tap778471ec-ff: entered promiscuous mode
Jan 31 03:59:50 np0005603622 NetworkManager[49080]: <info>  [1769849990.5524] manager: (tap778471ec-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/400)
Jan 31 03:59:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:50Z|00916|binding|INFO|Claiming lport 778471ec-ff3e-4e08-915a-231eee582d31 for this chassis.
Jan 31 03:59:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:50Z|00917|binding|INFO|778471ec-ff3e-4e08-915a-231eee582d31: Claiming fa:16:3e:82:3e:6f 10.100.0.9
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.555 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.559 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.580 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:3e:6f 10.100.0.9'], port_security=['fa:16:3e:82:3e:6f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fd777b00-402f-4a2e-8a77-2b61c7cf5484', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=778471ec-ff3e-4e08-915a-231eee582d31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.582 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 778471ec-ff3e-4e08-915a-231eee582d31 in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f bound to our chassis#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.584 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f#033[00m
Jan 31 03:59:50 np0005603622 systemd-udevd[298988]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 03:59:50 np0005603622 systemd-machined[190406]: New machine qemu-95-instance-000000bf.
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.598 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e90d782a-a3ea-485e-ab2b-9a01b3a2ad86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.600 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap372c2b16-d1 in ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.602 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap372c2b16-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.602 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e04abe74-0fba-4877-9522-5676cc2c7557]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 NetworkManager[49080]: <info>  [1769849990.6042] device (tap778471ec-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 03:59:50 np0005603622 NetworkManager[49080]: <info>  [1769849990.6050] device (tap778471ec-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.603 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.606 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d67c9adb-856c-4ecc-afc2-d3b57d6faa3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 systemd[1]: Started Virtual Machine qemu-95-instance-000000bf.
Jan 31 03:59:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:50Z|00918|binding|INFO|Setting lport 778471ec-ff3e-4e08-915a-231eee582d31 ovn-installed in OVS
Jan 31 03:59:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:50Z|00919|binding|INFO|Setting lport 778471ec-ff3e-4e08-915a-231eee582d31 up in Southbound
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.622 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[653e0dcf-96fd-4797-8fc1-37d72bd2c9ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.636 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a29110e1-df51-43be-b25d-546166c71718]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.661 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5b5f7d-9d97-4de0-8dd6-52b82853d593]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 NetworkManager[49080]: <info>  [1769849990.6681] manager: (tap372c2b16-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/401)
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.667 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[237f8599-5d2c-4cf4-abbc-c5a5d8c8df24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.691 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f1071149-0747-402b-a850-041e98917a9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.696 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[614654e1-867f-4929-9386-fa5680faf4bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 NetworkManager[49080]: <info>  [1769849990.7100] device (tap372c2b16-d0): carrier: link connected
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.714 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[79c2b4f0-c6df-47b5-9735-4c7d1fbbf682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.728 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7cf1f7-7065-49e9-8eb5-35092a21d1e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 275], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 924205, 'reachable_time': 23850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299021, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.740 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[833ef1c7-4521-4a19-8c24-d387930f8674]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:3847'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 924205, 'tstamp': 924205}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299022, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.755 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[40e28d9f-6221-4a0d-95a7-d97501012409]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 275], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 924205, 'reachable_time': 23850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299023, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.780 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[70f1e810-dcd7-412e-809e-37b3b0e29b66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.840 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[43ad4979-50fd-4361-98b9-59530bf342aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.842 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.842 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.842 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap372c2b16-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 NetworkManager[49080]: <info>  [1769849990.8834] manager: (tap372c2b16-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/402)
Jan 31 03:59:50 np0005603622 kernel: tap372c2b16-d0: entered promiscuous mode
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.886 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.888 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap372c2b16-d0, col_values=(('external_ids', {'iface-id': 'a077e97d-4d5f-408b-a598-1a37aa5f7556'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.889 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 ovn_controller[130109]: 2026-01-31T08:59:50Z|00920|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.891 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.892 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.893 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[95cf24a8-fc9e-4ad5-88a3-ffa82b4ac6e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.894 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 03:59:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 03:59:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 03:59:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:50.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 03:59:50 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 08:59:50.896 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'env', 'PROCESS_TAG=haproxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/372c2b16-dcb3-40d9-a594-5fc185a4de0f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.940 221324 DEBUG nova.compute.manager [req-4549cbff-6059-4c1b-bac5-01335891c070 req-425e8145-3e70-4a02-bbce-705d79ca7927 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.940 221324 DEBUG oslo_concurrency.lockutils [req-4549cbff-6059-4c1b-bac5-01335891c070 req-425e8145-3e70-4a02-bbce-705d79ca7927 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.941 221324 DEBUG oslo_concurrency.lockutils [req-4549cbff-6059-4c1b-bac5-01335891c070 req-425e8145-3e70-4a02-bbce-705d79ca7927 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.941 221324 DEBUG oslo_concurrency.lockutils [req-4549cbff-6059-4c1b-bac5-01335891c070 req-425e8145-3e70-4a02-bbce-705d79ca7927 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 03:59:50 np0005603622 nova_compute[221301]: 2026-01-31 08:59:50.941 221324 DEBUG nova.compute.manager [req-4549cbff-6059-4c1b-bac5-01335891c070 req-425e8145-3e70-4a02-bbce-705d79ca7927 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Processing event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 03:59:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:51 np0005603622 podman[299055]: 2026-01-31 08:59:51.209701298 +0000 UTC m=+0.025186195 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 03:59:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:52.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:52 np0005603622 podman[299055]: 2026-01-31 08:59:52.648924448 +0000 UTC m=+1.464409295 container create 685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 03:59:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:52.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:52 np0005603622 systemd[1]: Started libpod-conmon-685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8.scope.
Jan 31 03:59:52 np0005603622 systemd[1]: Started libcrun container.
Jan 31 03:59:52 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc7856ba62f38bae8350f411f04ab16d0c105d4924770d38696cdc196ab476cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.016 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.018 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849993.0157292, fd777b00-402f-4a2e-8a77-2b61c7cf5484 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.018 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] VM Started (Lifecycle Event)#033[00m
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.022 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.026 221324 INFO nova.virt.libvirt.driver [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance spawned successfully.#033[00m
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.026 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 03:59:53 np0005603622 podman[299055]: 2026-01-31 08:59:53.072699762 +0000 UTC m=+1.888184659 container init 685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.074 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.078 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:59:53 np0005603622 podman[299055]: 2026-01-31 08:59:53.082943926 +0000 UTC m=+1.898428763 container start 685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 03:59:53 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299107]: [NOTICE]   (299116) : New worker (299118) forked
Jan 31 03:59:53 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299107]: [NOTICE]   (299116) : Loading success.
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.133 221324 DEBUG nova.compute.manager [req-fa4f2bb3-6b5c-4472-ad55-53fd5d5e82ba req-ad09374e-dc52-49cc-8c0d-3b9099b9c112 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.134 221324 DEBUG oslo_concurrency.lockutils [req-fa4f2bb3-6b5c-4472-ad55-53fd5d5e82ba req-ad09374e-dc52-49cc-8c0d-3b9099b9c112 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.134 221324 DEBUG oslo_concurrency.lockutils [req-fa4f2bb3-6b5c-4472-ad55-53fd5d5e82ba req-ad09374e-dc52-49cc-8c0d-3b9099b9c112 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.134 221324 DEBUG oslo_concurrency.lockutils [req-fa4f2bb3-6b5c-4472-ad55-53fd5d5e82ba req-ad09374e-dc52-49cc-8c0d-3b9099b9c112 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.135 221324 DEBUG nova.compute.manager [req-fa4f2bb3-6b5c-4472-ad55-53fd5d5e82ba req-ad09374e-dc52-49cc-8c0d-3b9099b9c112 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] No waiting events found dispatching network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.135 221324 WARNING nova.compute.manager [req-fa4f2bb3-6b5c-4472-ad55-53fd5d5e82ba req-ad09374e-dc52-49cc-8c0d-3b9099b9c112 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received unexpected event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 for instance with vm_state building and task_state spawning.
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.153 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.154 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849993.0171506, fd777b00-402f-4a2e-8a77-2b61c7cf5484 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.154 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] VM Paused (Lifecycle Event)
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.160 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.161 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.162 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.162 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.163 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.163 221324 DEBUG nova.virt.libvirt.driver [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.174 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.178 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769849993.0217566, fd777b00-402f-4a2e-8a77-2b61c7cf5484 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.179 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] VM Resumed (Lifecycle Event)
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.210 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.213 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.226 221324 INFO nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Took 13.20 seconds to spawn the instance on the hypervisor.
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.226 221324 DEBUG nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.233 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.299 221324 INFO nova.compute.manager [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Took 14.94 seconds to build instance.
Jan 31 03:59:53 np0005603622 nova_compute[221301]: 2026-01-31 08:59:53.319 221324 DEBUG oslo_concurrency.lockutils [None req-b0e6030f-2157-4e99-816a-507d4dd28931 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 03:59:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:54.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:54 np0005603622 nova_compute[221301]: 2026-01-31 08:59:54.392 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:54.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:55 np0005603622 nova_compute[221301]: 2026-01-31 08:59:55.332 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 03:59:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 03:59:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 03:59:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:56.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 03:59:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:56.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:08:59:58.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 03:59:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 03:59:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:08:59:58.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 03:59:59 np0005603622 nova_compute[221301]: 2026-01-31 08:59:59.394 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:00:00 np0005603622 nova_compute[221301]: 2026-01-31 09:00:00.026 221324 DEBUG oslo_concurrency.lockutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:00:00 np0005603622 nova_compute[221301]: 2026-01-31 09:00:00.027 221324 DEBUG oslo_concurrency.lockutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:00:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 04:00:00 np0005603622 nova_compute[221301]: 2026-01-31 09:00:00.189 221324 DEBUG nova.objects.instance [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'flavor' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:00:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:00.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:00 np0005603622 nova_compute[221301]: 2026-01-31 09:00:00.336 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:00:00 np0005603622 nova_compute[221301]: 2026-01-31 09:00:00.443 221324 DEBUG oslo_concurrency.lockutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:00:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:00.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.038 221324 DEBUG oslo_concurrency.lockutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.039 221324 DEBUG oslo_concurrency.lockutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.040 221324 INFO nova.compute.manager [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Attaching volume b7cfb970-aa00-4ab7-b895-9c828f0058d5 to /dev/vdb
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.281 221324 DEBUG os_brick.utils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.285 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.301 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.301 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[ba730ab8-051d-4ab2-8744-56bf9848dadf]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.304 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.312 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.312 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[07433219-7498-4c15-9848-e2723b9b430d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.315 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.324 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.324 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c7a01f-a6b1-4e6e-96cf-72a5671ea173]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.327 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[7dce5014-36f5-420b-bf6f-2896f983daed]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.328 221324 DEBUG oslo_concurrency.processutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.350 221324 DEBUG oslo_concurrency.processutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.352 221324 DEBUG os_brick.initiator.connectors.lightos [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.352 221324 DEBUG os_brick.initiator.connectors.lightos [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.352 221324 DEBUG os_brick.initiator.connectors.lightos [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.353 221324 DEBUG os_brick.utils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] <== get_connector_properties: return (70ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 31 04:00:01 np0005603622 nova_compute[221301]: 2026-01-31 09:00:01.353 221324 DEBUG nova.virt.block_device [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating existing volume attachment record: 8f74dbe0-01a0-43a5-af76-f10e47f68a95 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 31 04:00:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:00:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:02.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:00:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:02.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:03 np0005603622 nova_compute[221301]: 2026-01-31 09:00:03.279 221324 DEBUG nova.objects.instance [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'flavor' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:00:03 np0005603622 nova_compute[221301]: 2026-01-31 09:00:03.631 221324 DEBUG nova.virt.libvirt.driver [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Attempting to attach volume b7cfb970-aa00-4ab7-b895-9c828f0058d5 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Jan 31 04:00:03 np0005603622 nova_compute[221301]: 2026-01-31 09:00:03.634 221324 DEBUG nova.virt.libvirt.guest [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 04:00:03 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-b7cfb970-aa00-4ab7-b895-9c828f0058d5">
Jan 31 04:00:03 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 04:00:03 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:  </auth>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:00:03 np0005603622 nova_compute[221301]:  <serial>b7cfb970-aa00-4ab7-b895-9c828f0058d5</serial>
Jan 31 04:00:03 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:00:03 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 31 04:00:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:04.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:04 np0005603622 nova_compute[221301]: 2026-01-31 09:00:04.270 221324 DEBUG nova.virt.libvirt.driver [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 04:00:04 np0005603622 nova_compute[221301]: 2026-01-31 09:00:04.271 221324 DEBUG nova.virt.libvirt.driver [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 04:00:04 np0005603622 nova_compute[221301]: 2026-01-31 09:00:04.271 221324 DEBUG nova.virt.libvirt.driver [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 31 04:00:04 np0005603622 nova_compute[221301]: 2026-01-31 09:00:04.271 221324 DEBUG nova.virt.libvirt.driver [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No VIF found with MAC fa:16:3e:82:3e:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 31 04:00:04 np0005603622 nova_compute[221301]: 2026-01-31 09:00:04.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:00:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:04.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e381 e381: 3 total, 3 up, 3 in
Jan 31 04:00:05 np0005603622 nova_compute[221301]: 2026-01-31 09:00:05.339 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:05 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:05Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:3e:6f 10.100.0.9
Jan 31 04:00:05 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:05Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:3e:6f 10.100.0.9
Jan 31 04:00:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e382 e382: 3 total, 3 up, 3 in
Jan 31 04:00:06 np0005603622 nova_compute[221301]: 2026-01-31 09:00:06.203 221324 DEBUG oslo_concurrency.lockutils [None req-14a8529b-e7f9-4ef7-a387-5290f68e4f12 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 5.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:06.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:06.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e383 e383: 3 total, 3 up, 3 in
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #163. Immutable memtables: 0.
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.445936) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 103] Flushing memtable with next log file: 163
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007446039, "job": 103, "event": "flush_started", "num_memtables": 1, "num_entries": 880, "num_deletes": 255, "total_data_size": 1599347, "memory_usage": 1631120, "flush_reason": "Manual Compaction"}
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 103] Level-0 flush table #164: started
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007461925, "cf_name": "default", "job": 103, "event": "table_file_creation", "file_number": 164, "file_size": 1054346, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 79553, "largest_seqno": 80428, "table_properties": {"data_size": 1050233, "index_size": 1828, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9388, "raw_average_key_size": 19, "raw_value_size": 1041785, "raw_average_value_size": 2161, "num_data_blocks": 79, "num_entries": 482, "num_filter_entries": 482, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769849954, "oldest_key_time": 1769849954, "file_creation_time": 1769850007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 164, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 103] Flush lasted 16045 microseconds, and 4828 cpu microseconds.
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.461989) [db/flush_job.cc:967] [default] [JOB 103] Level-0 flush table #164: 1054346 bytes OK
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.462013) [db/memtable_list.cc:519] [default] Level-0 commit table #164 started
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.469815) [db/memtable_list.cc:722] [default] Level-0 commit table #164: memtable #1 done
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.469850) EVENT_LOG_v1 {"time_micros": 1769850007469841, "job": 103, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.469872) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 103] Try to delete WAL files size 1594840, prev total WAL file size 1594840, number of live WAL files 2.
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000160.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.470505) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033303134' seq:72057594037927935, type:22 .. '6C6F676D0033323635' seq:0, type:0; will stop at (end)
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 104] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 103 Base level 0, inputs: [164(1029KB)], [162(11MB)]
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007470613, "job": 104, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [164], "files_L6": [162], "score": -1, "input_data_size": 13102569, "oldest_snapshot_seqno": -1}
Jan 31 04:00:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:07.562 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:00:07 np0005603622 nova_compute[221301]: 2026-01-31 09:00:07.563 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:07.564 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:00:07 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:07.564 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 104] Generated table #165: 9957 keys, 12971651 bytes, temperature: kUnknown
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007587843, "cf_name": "default", "job": 104, "event": "table_file_creation", "file_number": 165, "file_size": 12971651, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12907439, "index_size": 38243, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24901, "raw_key_size": 263335, "raw_average_key_size": 26, "raw_value_size": 12733463, "raw_average_value_size": 1278, "num_data_blocks": 1454, "num_entries": 9957, "num_filter_entries": 9957, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 165, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.588197) [db/compaction/compaction_job.cc:1663] [default] [JOB 104] Compacted 1@0 + 1@6 files to L6 => 12971651 bytes
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.592842) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.6 rd, 110.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 11.5 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(24.7) write-amplify(12.3) OK, records in: 10485, records dropped: 528 output_compression: NoCompression
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.592885) EVENT_LOG_v1 {"time_micros": 1769850007592865, "job": 104, "event": "compaction_finished", "compaction_time_micros": 117354, "compaction_time_cpu_micros": 38668, "output_level": 6, "num_output_files": 1, "total_output_size": 12971651, "num_input_records": 10485, "num_output_records": 9957, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000164.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007593158, "job": 104, "event": "table_file_deletion", "file_number": 164}
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000162.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850007594654, "job": 104, "event": "table_file_deletion", "file_number": 162}
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.470342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.594718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.594722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.594724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.594726) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:07 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:00:07.594728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:00:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:08.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:08.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:09 np0005603622 nova_compute[221301]: 2026-01-31 09:00:09.447 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e384 e384: 3 total, 3 up, 3 in
Jan 31 04:00:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:10.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:10 np0005603622 nova_compute[221301]: 2026-01-31 09:00:10.339 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:10 np0005603622 nova_compute[221301]: 2026-01-31 09:00:10.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:10 np0005603622 nova_compute[221301]: 2026-01-31 09:00:10.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:00:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:10.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:12.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:12 np0005603622 nova_compute[221301]: 2026-01-31 09:00:12.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:12 np0005603622 nova_compute[221301]: 2026-01-31 09:00:12.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:00:12 np0005603622 nova_compute[221301]: 2026-01-31 09:00:12.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:00:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e385 e385: 3 total, 3 up, 3 in
Jan 31 04:00:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:12.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:13 np0005603622 nova_compute[221301]: 2026-01-31 09:00:13.203 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:13 np0005603622 nova_compute[221301]: 2026-01-31 09:00:13.204 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:13 np0005603622 nova_compute[221301]: 2026-01-31 09:00:13.204 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:00:13 np0005603622 nova_compute[221301]: 2026-01-31 09:00:13.204 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:00:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:14.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:14 np0005603622 nova_compute[221301]: 2026-01-31 09:00:14.451 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:14.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:15 np0005603622 nova_compute[221301]: 2026-01-31 09:00:15.342 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:16.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:00:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:16.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.188 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:17 np0005603622 NetworkManager[49080]: <info>  [1769850017.1887] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/403)
Jan 31 04:00:17 np0005603622 NetworkManager[49080]: <info>  [1769850017.1909] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/404)
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:17 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:17Z|00921|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.283 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.341 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.405 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.406 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.407 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.407 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.407 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.408 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.444 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.446 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.446 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:00:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:00:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1687472447' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.842 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.938 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.939 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:00:17 np0005603622 nova_compute[221301]: 2026-01-31 09:00:17.939 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.074 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.074 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4010MB free_disk=20.806140899658203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.075 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.075 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:18.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.286 221324 DEBUG nova.compute.manager [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-changed-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.287 221324 DEBUG nova.compute.manager [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing instance network info cache due to event network-changed-778471ec-ff3e-4e08-915a-231eee582d31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.287 221324 DEBUG oslo_concurrency.lockutils [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.287 221324 DEBUG oslo_concurrency.lockutils [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.287 221324 DEBUG nova.network.neutron [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.305 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.305 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.306 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.397 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:00:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:00:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3981366244' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.820 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.824 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:00:18 np0005603622 nova_compute[221301]: 2026-01-31 09:00:18.877 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:00:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:18.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:19 np0005603622 nova_compute[221301]: 2026-01-31 09:00:19.051 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:00:19 np0005603622 nova_compute[221301]: 2026-01-31 09:00:19.051 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:19 np0005603622 nova_compute[221301]: 2026-01-31 09:00:19.454 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:20 np0005603622 podman[299201]: 2026-01-31 09:00:20.142603759 +0000 UTC m=+0.056212776 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 04:00:20 np0005603622 podman[299200]: 2026-01-31 09:00:20.171995126 +0000 UTC m=+0.088005207 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:00:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:20.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:20 np0005603622 nova_compute[221301]: 2026-01-31 09:00:20.345 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:21 np0005603622 nova_compute[221301]: 2026-01-31 09:00:21.046 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:21 np0005603622 nova_compute[221301]: 2026-01-31 09:00:21.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:21 np0005603622 nova_compute[221301]: 2026-01-31 09:00:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:22.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:22 np0005603622 nova_compute[221301]: 2026-01-31 09:00:22.342 221324 DEBUG nova.compute.manager [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-changed-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:22 np0005603622 nova_compute[221301]: 2026-01-31 09:00:22.342 221324 DEBUG nova.compute.manager [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing instance network info cache due to event network-changed-778471ec-ff3e-4e08-915a-231eee582d31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:00:22 np0005603622 nova_compute[221301]: 2026-01-31 09:00:22.343 221324 DEBUG oslo_concurrency.lockutils [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:00:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:22.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:00:23 np0005603622 nova_compute[221301]: 2026-01-31 09:00:23.269 221324 DEBUG nova.network.neutron [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated VIF entry in instance network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:00:23 np0005603622 nova_compute[221301]: 2026-01-31 09:00:23.270 221324 DEBUG nova.network.neutron [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:23 np0005603622 nova_compute[221301]: 2026-01-31 09:00:23.309 221324 DEBUG oslo_concurrency.lockutils [req-ca37bfa6-6137-4668-b856-1bc3caec8aa3 req-ed15e353-f932-4403-ade0-8df06be5a7c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:23 np0005603622 nova_compute[221301]: 2026-01-31 09:00:23.311 221324 DEBUG oslo_concurrency.lockutils [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:23 np0005603622 nova_compute[221301]: 2026-01-31 09:00:23.311 221324 DEBUG nova.network.neutron [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:00:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:24.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:24 np0005603622 nova_compute[221301]: 2026-01-31 09:00:24.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:24.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:25 np0005603622 nova_compute[221301]: 2026-01-31 09:00:25.348 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:25 np0005603622 nova_compute[221301]: 2026-01-31 09:00:25.570 221324 DEBUG nova.compute.manager [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-changed-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:25 np0005603622 nova_compute[221301]: 2026-01-31 09:00:25.571 221324 DEBUG nova.compute.manager [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing instance network info cache due to event network-changed-778471ec-ff3e-4e08-915a-231eee582d31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:00:25 np0005603622 nova_compute[221301]: 2026-01-31 09:00:25.571 221324 DEBUG oslo_concurrency.lockutils [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:25 np0005603622 nova_compute[221301]: 2026-01-31 09:00:25.610 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:26.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:26.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:28 np0005603622 nova_compute[221301]: 2026-01-31 09:00:28.005 221324 DEBUG nova.network.neutron [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated VIF entry in instance network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:00:28 np0005603622 nova_compute[221301]: 2026-01-31 09:00:28.006 221324 DEBUG nova.network.neutron [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:28 np0005603622 nova_compute[221301]: 2026-01-31 09:00:28.029 221324 DEBUG oslo_concurrency.lockutils [req-40e2c9f0-90f7-4d55-8c2e-44baf75949e0 req-e1c91d0c-e6fa-4bb3-93d0-8ff07565efb1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:28 np0005603622 nova_compute[221301]: 2026-01-31 09:00:28.031 221324 DEBUG oslo_concurrency.lockutils [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:28 np0005603622 nova_compute[221301]: 2026-01-31 09:00:28.031 221324 DEBUG nova.network.neutron [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:00:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:28.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:28.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:29 np0005603622 nova_compute[221301]: 2026-01-31 09:00:29.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:30 np0005603622 nova_compute[221301]: 2026-01-31 09:00:30.187 221324 DEBUG nova.network.neutron [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated VIF entry in instance network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:00:30 np0005603622 nova_compute[221301]: 2026-01-31 09:00:30.188 221324 DEBUG nova.network.neutron [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:30 np0005603622 nova_compute[221301]: 2026-01-31 09:00:30.215 221324 DEBUG oslo_concurrency.lockutils [req-0932fa0a-51d5-4b12-b00b-64d0ea0c828e req-55a83389-d33b-4202-846a-b5096d957029 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:00:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:30.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:00:30 np0005603622 nova_compute[221301]: 2026-01-31 09:00:30.362 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:30.919 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:30.919 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:30.920 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:30.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:32.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:32.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:33 np0005603622 nova_compute[221301]: 2026-01-31 09:00:33.515 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:34.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:34 np0005603622 nova_compute[221301]: 2026-01-31 09:00:34.463 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:34.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:35 np0005603622 nova_compute[221301]: 2026-01-31 09:00:35.363 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:36.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:36 np0005603622 nova_compute[221301]: 2026-01-31 09:00:36.437 221324 DEBUG nova.compute.manager [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-changed-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:36 np0005603622 nova_compute[221301]: 2026-01-31 09:00:36.438 221324 DEBUG nova.compute.manager [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing instance network info cache due to event network-changed-778471ec-ff3e-4e08-915a-231eee582d31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:00:36 np0005603622 nova_compute[221301]: 2026-01-31 09:00:36.438 221324 DEBUG oslo_concurrency.lockutils [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:36 np0005603622 nova_compute[221301]: 2026-01-31 09:00:36.438 221324 DEBUG oslo_concurrency.lockutils [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:36 np0005603622 nova_compute[221301]: 2026-01-31 09:00:36.439 221324 DEBUG nova.network.neutron [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:00:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:36.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:00:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:00:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:38.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:39 np0005603622 nova_compute[221301]: 2026-01-31 09:00:39.198 221324 DEBUG nova.network.neutron [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated VIF entry in instance network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:00:39 np0005603622 nova_compute[221301]: 2026-01-31 09:00:39.199 221324 DEBUG nova.network.neutron [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:39 np0005603622 nova_compute[221301]: 2026-01-31 09:00:39.220 221324 DEBUG oslo_concurrency.lockutils [req-fde3a6c4-a1ff-4e99-b91b-5c79f4d895d5 req-cec763fa-2826-4e04-adb3-9ee6094e8001 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:39 np0005603622 nova_compute[221301]: 2026-01-31 09:00:39.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e386 e386: 3 total, 3 up, 3 in
Jan 31 04:00:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:40Z|00922|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:00:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:40.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.348 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.366 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.682 221324 DEBUG oslo_concurrency.lockutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.683 221324 DEBUG oslo_concurrency.lockutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.683 221324 INFO nova.compute.manager [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Rebooting instance#033[00m
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.708 221324 DEBUG oslo_concurrency.lockutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.709 221324 DEBUG oslo_concurrency.lockutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:00:40 np0005603622 nova_compute[221301]: 2026-01-31 09:00:40.710 221324 DEBUG nova.network.neutron [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:00:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:40.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:41 np0005603622 nova_compute[221301]: 2026-01-31 09:00:41.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e387 e387: 3 total, 3 up, 3 in
Jan 31 04:00:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:00:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:42.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:00:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:00:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:42.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:00:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:43 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:00:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:44.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:44 np0005603622 nova_compute[221301]: 2026-01-31 09:00:44.471 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:44.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.137 221324 DEBUG nova.network.neutron [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.168 221324 DEBUG oslo_concurrency.lockutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.170 221324 DEBUG nova.compute.manager [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.368 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 kernel: tap778471ec-ff (unregistering): left promiscuous mode
Jan 31 04:00:45 np0005603622 NetworkManager[49080]: <info>  [1769850045.4008] device (tap778471ec-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.407 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:45Z|00923|binding|INFO|Releasing lport 778471ec-ff3e-4e08-915a-231eee582d31 from this chassis (sb_readonly=0)
Jan 31 04:00:45 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:45Z|00924|binding|INFO|Setting lport 778471ec-ff3e-4e08-915a-231eee582d31 down in Southbound
Jan 31 04:00:45 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:45Z|00925|binding|INFO|Removing iface tap778471ec-ff ovn-installed in OVS
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.410 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.417 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.428 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:3e:6f 10.100.0.9'], port_security=['fa:16:3e:82:3e:6f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fd777b00-402f-4a2e-8a77-2b61c7cf5484', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6 df68c117-592c-4147-9e69-0223735d7906', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=778471ec-ff3e-4e08-915a-231eee582d31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.429 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 778471ec-ff3e-4e08-915a-231eee582d31 in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f unbound from our chassis#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.431 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.432 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d62bccf9-0931-442b-ab2c-f5a75507c482]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.432 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace which is not needed anymore#033[00m
Jan 31 04:00:45 np0005603622 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000bf.scope: Deactivated successfully.
Jan 31 04:00:45 np0005603622 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000bf.scope: Consumed 15.309s CPU time.
Jan 31 04:00:45 np0005603622 systemd-machined[190406]: Machine qemu-95-instance-000000bf terminated.
Jan 31 04:00:45 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299107]: [NOTICE]   (299116) : haproxy version is 2.8.14-c23fe91
Jan 31 04:00:45 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299107]: [NOTICE]   (299116) : path to executable is /usr/sbin/haproxy
Jan 31 04:00:45 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299107]: [WARNING]  (299116) : Exiting Master process...
Jan 31 04:00:45 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299107]: [ALERT]    (299116) : Current worker (299118) exited with code 143 (Terminated)
Jan 31 04:00:45 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299107]: [WARNING]  (299116) : All workers exited. Exiting... (0)
Jan 31 04:00:45 np0005603622 systemd[1]: libpod-685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8.scope: Deactivated successfully.
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.570 221324 INFO nova.virt.libvirt.driver [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance destroyed successfully.#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.572 221324 DEBUG nova.objects.instance [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'resources' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:00:45 np0005603622 podman[299449]: 2026-01-31 09:00:45.57355028 +0000 UTC m=+0.061835246 container died 685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 04:00:45 np0005603622 systemd[1]: var-lib-containers-storage-overlay-fc7856ba62f38bae8350f411f04ab16d0c105d4924770d38696cdc196ab476cb-merged.mount: Deactivated successfully.
Jan 31 04:00:45 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8-userdata-shm.mount: Deactivated successfully.
Jan 31 04:00:45 np0005603622 podman[299449]: 2026-01-31 09:00:45.601728915 +0000 UTC m=+0.090013871 container cleanup 685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.602 221324 DEBUG nova.virt.libvirt.vif [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:59:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2037009249',display_name='tempest-TestMinimumBasicScenario-server-2037009249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2037009249',id=191,image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG2CSeIfukotaly0u8Scx5VRDfS/5VQoZ+GmuWQA0f4RO3nieXzBWi6xcpOtR13R9YeVfMpSFWckgvknkMWPxQDOKBZZlOnVFCYvLnmixnYBttsMr77nYzPf84/NYI6UZg==',key_name='tempest-TestMinimumBasicScenario-718281137',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:59:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-wpr0dgsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:00:45Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=fd777b00-402f-4a2e-8a77-2b61c7cf5484,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.603 221324 DEBUG nova.network.os_vif_util [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.605 221324 DEBUG nova.network.os_vif_util [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.606 221324 DEBUG os_vif [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.608 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.608 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap778471ec-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.611 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.614 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:00:45 np0005603622 systemd[1]: libpod-conmon-685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8.scope: Deactivated successfully.
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.617 221324 INFO os_vif [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff')#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.629 221324 DEBUG nova.virt.libvirt.driver [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Start _get_guest_xml network_info=[{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=be36fc17-1fc9-4dc7-9df3-92082489a43b,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': 'be36fc17-1fc9-4dc7-9df3-92082489a43b'}], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': None, 'attachment_id': '8f74dbe0-01a0-43a5-af76-f10e47f68a95', 'mount_device': '/dev/vdb', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-b7cfb970-aa00-4ab7-b895-9c828f0058d5', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'b7cfb970-aa00-4ab7-b895-9c828f0058d5', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'fd777b00-402f-4a2e-8a77-2b61c7cf5484', 'attached_at': '', 'detached_at': '', 'volume_id': 'b7cfb970-aa00-4ab7-b895-9c828f0058d5', 'serial': 'b7cfb970-aa00-4ab7-b895-9c828f0058d5'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.639 221324 WARNING nova.virt.libvirt.driver [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.644 221324 DEBUG nova.virt.libvirt.host [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.644 221324 DEBUG nova.virt.libvirt.host [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.647 221324 DEBUG nova.virt.libvirt.host [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.647 221324 DEBUG nova.virt.libvirt.host [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.648 221324 DEBUG nova.virt.libvirt.driver [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.649 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=be36fc17-1fc9-4dc7-9df3-92082489a43b,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.649 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.649 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.649 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.650 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.650 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.650 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.650 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.651 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.651 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.651 221324 DEBUG nova.virt.hardware [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.651 221324 DEBUG nova.objects.instance [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:00:45 np0005603622 podman[299490]: 2026-01-31 09:00:45.654459116 +0000 UTC m=+0.036927209 container remove 685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.658 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9eefd71c-f566-44c9-8d8c-d26c3721106a]: (4, ('Sat Jan 31 09:00:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8)\n685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8\nSat Jan 31 09:00:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8)\n685ead17eca17f36e7a80b23ae0d4df83bc859a18053556c11a20ff7c9878bc8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.660 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3b6483-bb66-49d2-90e2-e5e2e597421e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.661 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:45 np0005603622 kernel: tap372c2b16-d0: left promiscuous mode
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.668 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.671 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[50e405de-aa3c-4b4a-83e7-205a920fd293]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 nova_compute[221301]: 2026-01-31 09:00:45.673 221324 DEBUG oslo_concurrency.processutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.689 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1185909a-7a78-46bb-b2dc-5bd4b7043829]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.691 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6702656b-eb07-43c0-a1a9-1b768759ac8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.701 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fb256eb4-9cd5-46a6-94a0-0f769407d70c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 924199, 'reachable_time': 32457, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299509, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.704 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:00:45 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:45.704 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[bd22af32-5336-4927-ac01-26a519faae8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:00:45 np0005603622 systemd[1]: run-netns-ovnmeta\x2d372c2b16\x2ddcb3\x2d40d9\x2da594\x2d5fc185a4de0f.mount: Deactivated successfully.
Jan 31 04:00:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:00:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4189566669' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.104 221324 DEBUG oslo_concurrency.processutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.152 221324 DEBUG oslo_concurrency.processutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:00:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:46.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:00:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3504531332' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.540 221324 DEBUG oslo_concurrency.processutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.388s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.571 221324 DEBUG nova.virt.libvirt.vif [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:59:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2037009249',display_name='tempest-TestMinimumBasicScenario-server-2037009249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2037009249',id=191,image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG2CSeIfukotaly0u8Scx5VRDfS/5VQoZ+GmuWQA0f4RO3nieXzBWi6xcpOtR13R9YeVfMpSFWckgvknkMWPxQDOKBZZlOnVFCYvLnmixnYBttsMr77nYzPf84/NYI6UZg==',key_name='tempest-TestMinimumBasicScenario-718281137',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:59:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-wpr0dgsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:00:45Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=fd777b00-402f-4a2e-8a77-2b61c7cf5484,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.571 221324 DEBUG nova.network.os_vif_util [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.572 221324 DEBUG nova.network.os_vif_util [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.573 221324 DEBUG nova.objects.instance [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'pci_devices' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.590 221324 DEBUG nova.virt.libvirt.driver [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <uuid>fd777b00-402f-4a2e-8a77-2b61c7cf5484</uuid>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <name>instance-000000bf</name>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestMinimumBasicScenario-server-2037009249</nova:name>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:00:45</nova:creationTime>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:user uuid="b97882caec944ca3b26c230fcebff775">tempest-TestMinimumBasicScenario-1062170996-project-member</nova:user>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:project uuid="3d3e35e8f8144cd58baa994b70ef9668">tempest-TestMinimumBasicScenario-1062170996</nova:project>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="be36fc17-1fc9-4dc7-9df3-92082489a43b"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <nova:port uuid="778471ec-ff3e-4e08-915a-231eee582d31">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <entry name="serial">fd777b00-402f-4a2e-8a77-2b61c7cf5484</entry>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <entry name="uuid">fd777b00-402f-4a2e-8a77-2b61c7cf5484</entry>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/fd777b00-402f-4a2e-8a77-2b61c7cf5484_disk.config">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-b7cfb970-aa00-4ab7-b895-9c828f0058d5">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <target dev="vdb" bus="virtio"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <serial>b7cfb970-aa00-4ab7-b895-9c828f0058d5</serial>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:82:3e:6f"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <target dev="tap778471ec-ff"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484/console.log" append="off"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:00:46 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:00:46 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:00:46 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:00:46 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.591 221324 DEBUG nova.virt.libvirt.driver [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.591 221324 DEBUG nova.virt.libvirt.driver [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.591 221324 DEBUG nova.virt.libvirt.driver [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.592 221324 DEBUG nova.virt.libvirt.vif [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:59:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2037009249',display_name='tempest-TestMinimumBasicScenario-server-2037009249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2037009249',id=191,image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG2CSeIfukotaly0u8Scx5VRDfS/5VQoZ+GmuWQA0f4RO3nieXzBWi6xcpOtR13R9YeVfMpSFWckgvknkMWPxQDOKBZZlOnVFCYvLnmixnYBttsMr77nYzPf84/NYI6UZg==',key_name='tempest-TestMinimumBasicScenario-718281137',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:59:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-wpr0dgsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:00:45Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=fd777b00-402f-4a2e-8a77-2b61c7cf5484,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.592 221324 DEBUG nova.network.os_vif_util [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.593 221324 DEBUG nova.network.os_vif_util [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.593 221324 DEBUG os_vif [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.594 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.594 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.594 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.596 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.597 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap778471ec-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.597 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap778471ec-ff, col_values=(('external_ids', {'iface-id': '778471ec-ff3e-4e08-915a-231eee582d31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:3e:6f', 'vm-uuid': 'fd777b00-402f-4a2e-8a77-2b61c7cf5484'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.598 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 NetworkManager[49080]: <info>  [1769850046.5996] manager: (tap778471ec-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.602 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.604 221324 INFO os_vif [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff')#033[00m
Jan 31 04:00:46 np0005603622 kernel: tap778471ec-ff: entered promiscuous mode
Jan 31 04:00:46 np0005603622 NetworkManager[49080]: <info>  [1769850046.6526] manager: (tap778471ec-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/406)
Jan 31 04:00:46 np0005603622 systemd-udevd[299428]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:00:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:46Z|00926|binding|INFO|Claiming lport 778471ec-ff3e-4e08-915a-231eee582d31 for this chassis.
Jan 31 04:00:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:46Z|00927|binding|INFO|778471ec-ff3e-4e08-915a-231eee582d31: Claiming fa:16:3e:82:3e:6f 10.100.0.9
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.655 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.661 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:46Z|00928|binding|INFO|Setting lport 778471ec-ff3e-4e08-915a-231eee582d31 ovn-installed in OVS
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.662 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.664 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:46 np0005603622 NetworkManager[49080]: <info>  [1769850046.6653] device (tap778471ec-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:00:46 np0005603622 NetworkManager[49080]: <info>  [1769850046.6660] device (tap778471ec-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:00:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:46Z|00929|binding|INFO|Setting lport 778471ec-ff3e-4e08-915a-231eee582d31 up in Southbound
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.670 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:3e:6f 10.100.0.9'], port_security=['fa:16:3e:82:3e:6f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fd777b00-402f-4a2e-8a77-2b61c7cf5484', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6 df68c117-592c-4147-9e69-0223735d7906', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=778471ec-ff3e-4e08-915a-231eee582d31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.671 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 778471ec-ff3e-4e08-915a-231eee582d31 in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f bound to our chassis
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.672 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.682 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b11873be-48ba-4a0d-97aa-7f15a2573f1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.682 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap372c2b16-d1 in ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.685 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap372c2b16-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 31 04:00:46 np0005603622 systemd-machined[190406]: New machine qemu-96-instance-000000bf.
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.685 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1830a639-564e-404b-9552-f3f42a3adff7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.685 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3776de-9a29-4f4b-9028-a91b763f9d08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.694 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[05b5469c-2960-4ca9-9529-bb00e9fc4bab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 systemd[1]: Started Virtual Machine qemu-96-instance-000000bf.
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.704 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39ac8069-bec3-4938-ad2c-bdbf13054620]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.728 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccdc5f8-b5c0-45cf-81a7-67f1b5016211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.734 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[032f5e35-608f-43f8-9003-41e2c96821f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 NetworkManager[49080]: <info>  [1769850046.7354] manager: (tap372c2b16-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/407)
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.763 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a68df24a-19d5-4691-a656-b7dbba43eb5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.766 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ff03751a-c0fa-45ac-8e05-82dd15a55858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 NetworkManager[49080]: <info>  [1769850046.7837] device (tap372c2b16-d0): carrier: link connected
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.787 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d5727e2d-cd24-4c22-8799-8721e15249c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.800 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[89a8ba55-60ce-4e15-bbd1-10b460baead5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 278], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 929812, 'reachable_time': 28532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299616, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.813 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7b68c9-3d7d-4f5a-a176-edebfdcb086b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:3847'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 929812, 'tstamp': 929812}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299617, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.826 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[93df95d9-acfb-4c8e-a8c9-18e27658b25b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 278], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 929812, 'reachable_time': 28532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299618, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.848 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[00c7f10f-251a-48b6-95df-fbfa1dd5105d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.893 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9be5869b-cbdb-4a35-852c-7999836f38b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.895 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.895 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.896 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap372c2b16-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:00:46 np0005603622 kernel: tap372c2b16-d0: entered promiscuous mode
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.898 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:00:46 np0005603622 NetworkManager[49080]: <info>  [1769850046.8995] manager: (tap372c2b16-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.903 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap372c2b16-d0, col_values=(('external_ids', {'iface-id': 'a077e97d-4d5f-408b-a598-1a37aa5f7556'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:00:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:00:46Z|00930|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:00:46 np0005603622 nova_compute[221301]: 2026-01-31 09:00:46.916 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.917 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.918 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[caeafbac-69a0-4351-a905-a821734b2101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.918 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 31 04:00:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:46.920 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'env', 'PROCESS_TAG=haproxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/372c2b16-dcb3-40d9-a594-5fc185a4de0f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 31 04:00:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:46.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:47 np0005603622 podman[299650]: 2026-01-31 09:00:47.272653706 +0000 UTC m=+0.050034350 container create be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:00:47 np0005603622 systemd[1]: Started libpod-conmon-be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187.scope.
Jan 31 04:00:47 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:00:47 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17b65ed93ac2201ab439fa3706f08e3335d36585f96a4084dd905b795c6eba3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:00:47 np0005603622 podman[299650]: 2026-01-31 09:00:47.338421927 +0000 UTC m=+0.115802601 container init be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 04:00:47 np0005603622 podman[299650]: 2026-01-31 09:00:47.343809761 +0000 UTC m=+0.121190405 container start be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:00:47 np0005603622 podman[299650]: 2026-01-31 09:00:47.25413071 +0000 UTC m=+0.031511374 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:00:47 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299665]: [NOTICE]   (299687) : New worker (299704) forked
Jan 31 04:00:47 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299665]: [NOTICE]   (299687) : Loading success.
Jan 31 04:00:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 e388: 3 total, 3 up, 3 in
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.520 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for fd777b00-402f-4a2e-8a77-2b61c7cf5484 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.521 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850047.5190597, fd777b00-402f-4a2e-8a77-2b61c7cf5484 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.521 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] VM Resumed (Lifecycle Event)
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.524 221324 DEBUG nova.compute.manager [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.529 221324 INFO nova.virt.libvirt.driver [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance rebooted successfully.
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.530 221324 DEBUG nova.compute.manager [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.576 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.580 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.624 221324 DEBUG nova.compute.manager [req-8f4f4cdf-32f7-4d56-9570-ebb0a7e49cc4 req-73153ee2-df1c-47fd-a851-e0c0910bc006 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-unplugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.625 221324 DEBUG oslo_concurrency.lockutils [req-8f4f4cdf-32f7-4d56-9570-ebb0a7e49cc4 req-73153ee2-df1c-47fd-a851-e0c0910bc006 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.625 221324 DEBUG oslo_concurrency.lockutils [req-8f4f4cdf-32f7-4d56-9570-ebb0a7e49cc4 req-73153ee2-df1c-47fd-a851-e0c0910bc006 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.625 221324 DEBUG oslo_concurrency.lockutils [req-8f4f4cdf-32f7-4d56-9570-ebb0a7e49cc4 req-73153ee2-df1c-47fd-a851-e0c0910bc006 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.625 221324 DEBUG nova.compute.manager [req-8f4f4cdf-32f7-4d56-9570-ebb0a7e49cc4 req-73153ee2-df1c-47fd-a851-e0c0910bc006 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] No waiting events found dispatching network-vif-unplugged-778471ec-ff3e-4e08-915a-231eee582d31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.626 221324 WARNING nova.compute.manager [req-8f4f4cdf-32f7-4d56-9570-ebb0a7e49cc4 req-73153ee2-df1c-47fd-a851-e0c0910bc006 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received unexpected event network-vif-unplugged-778471ec-ff3e-4e08-915a-231eee582d31 for instance with vm_state active and task_state reboot_started_hard.
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.633 221324 DEBUG oslo_concurrency.lockutils [None req-3fe5457d-396a-46a6-997a-6c17ba3a7410 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.950s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.636 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850047.5206246, fd777b00-402f-4a2e-8a77-2b61c7cf5484 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.637 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] VM Started (Lifecycle Event)
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.676 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:00:47 np0005603622 nova_compute[221301]: 2026-01-31 09:00:47.680 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 04:00:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:48.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:48 np0005603622 nova_compute[221301]: 2026-01-31 09:00:48.551 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:49.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.797 221324 DEBUG nova.compute.manager [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.797 221324 DEBUG oslo_concurrency.lockutils [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.798 221324 DEBUG oslo_concurrency.lockutils [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.798 221324 DEBUG oslo_concurrency.lockutils [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.798 221324 DEBUG nova.compute.manager [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] No waiting events found dispatching network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.798 221324 WARNING nova.compute.manager [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received unexpected event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.798 221324 DEBUG nova.compute.manager [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.798 221324 DEBUG oslo_concurrency.lockutils [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.799 221324 DEBUG oslo_concurrency.lockutils [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.799 221324 DEBUG oslo_concurrency.lockutils [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.799 221324 DEBUG nova.compute.manager [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] No waiting events found dispatching network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:00:49 np0005603622 nova_compute[221301]: 2026-01-31 09:00:49.799 221324 WARNING nova.compute.manager [req-465684f9-5d65-42fb-9012-1ecdc639d69c req-51d8af69-c35c-4fbb-965d-3fb6d1cce88a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received unexpected event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:00:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:50.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:50 np0005603622 nova_compute[221301]: 2026-01-31 09:00:50.372 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:51.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:51 np0005603622 podman[299742]: 2026-01-31 09:00:51.130411261 +0000 UTC m=+0.054911451 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 04:00:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:51 np0005603622 podman[299741]: 2026-01-31 09:00:51.191696351 +0000 UTC m=+0.114292180 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 04:00:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:51.594 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:00:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:51.596 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.597 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.599 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.974 221324 DEBUG nova.compute.manager [req-28bb18ec-627f-40eb-81bb-7ac05d075172 req-79d607b2-39f7-44a2-9252-2ffe409104bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.975 221324 DEBUG oslo_concurrency.lockutils [req-28bb18ec-627f-40eb-81bb-7ac05d075172 req-79d607b2-39f7-44a2-9252-2ffe409104bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.975 221324 DEBUG oslo_concurrency.lockutils [req-28bb18ec-627f-40eb-81bb-7ac05d075172 req-79d607b2-39f7-44a2-9252-2ffe409104bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.975 221324 DEBUG oslo_concurrency.lockutils [req-28bb18ec-627f-40eb-81bb-7ac05d075172 req-79d607b2-39f7-44a2-9252-2ffe409104bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.976 221324 DEBUG nova.compute.manager [req-28bb18ec-627f-40eb-81bb-7ac05d075172 req-79d607b2-39f7-44a2-9252-2ffe409104bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] No waiting events found dispatching network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:00:51 np0005603622 nova_compute[221301]: 2026-01-31 09:00:51.976 221324 WARNING nova.compute.manager [req-28bb18ec-627f-40eb-81bb-7ac05d075172 req-79d607b2-39f7-44a2-9252-2ffe409104bf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received unexpected event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:00:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:52.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.397 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.398 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.398 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.399 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.399 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.399 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.471 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.471 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Image id be36fc17-1fc9-4dc7-9df3-92082489a43b yields fingerprint 77591e51c65a9b32a7dd2836400e7a772ef7a831 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.472 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] image be36fc17-1fc9-4dc7-9df3-92082489a43b at (/var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831): checking#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.472 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] image be36fc17-1fc9-4dc7-9df3-92082489a43b at (/var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.476 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.477 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] fd777b00-402f-4a2e-8a77-2b61c7cf5484 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.477 221324 WARNING nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.477 221324 WARNING nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.477 221324 WARNING nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.478 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Active base files: /var/lib/nova/instances/_base/77591e51c65a9b32a7dd2836400e7a772ef7a831#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.478 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Removable base files: /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813 /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.479 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.479 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/04f9862a6688592352a3324f53b6597ee300b813#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.479 221324 INFO nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/365f9823d2619ef09948bdeed685488da63755b5#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.479 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.479 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 31 04:00:52 np0005603622 nova_compute[221301]: 2026-01-31 09:00:52.480 221324 DEBUG nova.virt.libvirt.imagecache [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 31 04:00:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:53.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:54 np0005603622 nova_compute[221301]: 2026-01-31 09:00:54.085 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:54.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:55.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:55 np0005603622 nova_compute[221301]: 2026-01-31 09:00:55.375 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:55 np0005603622 nova_compute[221301]: 2026-01-31 09:00:55.748 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:00:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:56.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:00:56 np0005603622 nova_compute[221301]: 2026-01-31 09:00:56.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:00:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:57.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:00:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:00:58.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:00:58 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:00:58.598 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:00:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:00:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:00:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:00:59.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:00 np0005603622 ovn_controller[130109]: 2026-01-31T09:01:00Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:3e:6f 10.100.0.9
Jan 31 04:01:00 np0005603622 nova_compute[221301]: 2026-01-31 09:01:00.376 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:00.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:01.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:01 np0005603622 nova_compute[221301]: 2026-01-31 09:01:01.644 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:02.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:03.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:04.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:05.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:05 np0005603622 nova_compute[221301]: 2026-01-31 09:01:05.378 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:06.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:06 np0005603622 nova_compute[221301]: 2026-01-31 09:01:06.647 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:01:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:07.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:01:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:08.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:09.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:10 np0005603622 nova_compute[221301]: 2026-01-31 09:01:10.380 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:10.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:11.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:11 np0005603622 nova_compute[221301]: 2026-01-31 09:01:11.480 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:11 np0005603622 nova_compute[221301]: 2026-01-31 09:01:11.480 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:01:11 np0005603622 nova_compute[221301]: 2026-01-31 09:01:11.650 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:12 np0005603622 nova_compute[221301]: 2026-01-31 09:01:12.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:12 np0005603622 nova_compute[221301]: 2026-01-31 09:01:12.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:01:12 np0005603622 nova_compute[221301]: 2026-01-31 09:01:12.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:01:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:12.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:12 np0005603622 nova_compute[221301]: 2026-01-31 09:01:12.867 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:01:12 np0005603622 nova_compute[221301]: 2026-01-31 09:01:12.867 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:01:12 np0005603622 nova_compute[221301]: 2026-01-31 09:01:12.868 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:01:12 np0005603622 nova_compute[221301]: 2026-01-31 09:01:12.868 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:01:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:13.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:14.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:15.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:15 np0005603622 nova_compute[221301]: 2026-01-31 09:01:15.382 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:16.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:16 np0005603622 nova_compute[221301]: 2026-01-31 09:01:16.652 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:17.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.775 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.794 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.794 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.794 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.795 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.795 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.795 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.815 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.816 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.816 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.816 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:01:17 np0005603622 nova_compute[221301]: 2026-01-31 09:01:17.816 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4242174710' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.214 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.298 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.298 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.298 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000bf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:01:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:18.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.430 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.432 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4102MB free_disk=20.921878814697266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.521 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.521 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.522 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:01:18 np0005603622 nova_compute[221301]: 2026-01-31 09:01:18.575 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1495222277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:19 np0005603622 nova_compute[221301]: 2026-01-31 09:01:19.014 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:19 np0005603622 nova_compute[221301]: 2026-01-31 09:01:19.020 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:01:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:19.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:19 np0005603622 nova_compute[221301]: 2026-01-31 09:01:19.060 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:01:19 np0005603622 nova_compute[221301]: 2026-01-31 09:01:19.086 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:01:19 np0005603622 nova_compute[221301]: 2026-01-31 09:01:19.087 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:20 np0005603622 nova_compute[221301]: 2026-01-31 09:01:20.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:20.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:21.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:21 np0005603622 nova_compute[221301]: 2026-01-31 09:01:21.082 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:21 np0005603622 nova_compute[221301]: 2026-01-31 09:01:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:21 np0005603622 nova_compute[221301]: 2026-01-31 09:01:21.706 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:22 np0005603622 podman[299841]: 2026-01-31 09:01:22.136900304 +0000 UTC m=+0.053057972 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:01:22 np0005603622 podman[299840]: 2026-01-31 09:01:22.150661452 +0000 UTC m=+0.072117772 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:01:22 np0005603622 nova_compute[221301]: 2026-01-31 09:01:22.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:22.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:23.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:23 np0005603622 nova_compute[221301]: 2026-01-31 09:01:23.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:24.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:25.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:25 np0005603622 nova_compute[221301]: 2026-01-31 09:01:25.418 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:26 np0005603622 nova_compute[221301]: 2026-01-31 09:01:26.000 221324 DEBUG nova.compute.manager [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-changed-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:26 np0005603622 nova_compute[221301]: 2026-01-31 09:01:26.001 221324 DEBUG nova.compute.manager [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing instance network info cache due to event network-changed-778471ec-ff3e-4e08-915a-231eee582d31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:01:26 np0005603622 nova_compute[221301]: 2026-01-31 09:01:26.001 221324 DEBUG oslo_concurrency.lockutils [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:01:26 np0005603622 nova_compute[221301]: 2026-01-31 09:01:26.002 221324 DEBUG oslo_concurrency.lockutils [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:01:26 np0005603622 nova_compute[221301]: 2026-01-31 09:01:26.002 221324 DEBUG nova.network.neutron [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:01:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:26.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:26 np0005603622 nova_compute[221301]: 2026-01-31 09:01:26.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:27.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:28.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:29.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:30 np0005603622 nova_compute[221301]: 2026-01-31 09:01:30.197 221324 DEBUG nova.network.neutron [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated VIF entry in instance network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:01:30 np0005603622 nova_compute[221301]: 2026-01-31 09:01:30.197 221324 DEBUG nova.network.neutron [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:30 np0005603622 nova_compute[221301]: 2026-01-31 09:01:30.419 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:30.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:30 np0005603622 nova_compute[221301]: 2026-01-31 09:01:30.507 221324 DEBUG oslo_concurrency.lockutils [req-cb8984f7-81dd-4bed-b04d-a295c72c8fd9 req-46abfbb6-87dd-4466-9872-163124c3a8cb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:01:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:30.920 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:30.920 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:30.921 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:31.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:31 np0005603622 nova_compute[221301]: 2026-01-31 09:01:31.712 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:01:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:32.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:01:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:33.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:34.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:35.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:35 np0005603622 nova_compute[221301]: 2026-01-31 09:01:35.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:36.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:36 np0005603622 nova_compute[221301]: 2026-01-31 09:01:36.553 221324 DEBUG nova.compute.manager [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-changed-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:36 np0005603622 nova_compute[221301]: 2026-01-31 09:01:36.553 221324 DEBUG nova.compute.manager [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing instance network info cache due to event network-changed-778471ec-ff3e-4e08-915a-231eee582d31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:01:36 np0005603622 nova_compute[221301]: 2026-01-31 09:01:36.554 221324 DEBUG oslo_concurrency.lockutils [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:01:36 np0005603622 nova_compute[221301]: 2026-01-31 09:01:36.554 221324 DEBUG oslo_concurrency.lockutils [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:01:36 np0005603622 nova_compute[221301]: 2026-01-31 09:01:36.554 221324 DEBUG nova.network.neutron [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Refreshing network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:01:36 np0005603622 nova_compute[221301]: 2026-01-31 09:01:36.714 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:37.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:38.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:39 np0005603622 nova_compute[221301]: 2026-01-31 09:01:39.075 221324 DEBUG nova.network.neutron [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updated VIF entry in instance network info cache for port 778471ec-ff3e-4e08-915a-231eee582d31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:01:39 np0005603622 nova_compute[221301]: 2026-01-31 09:01:39.076 221324 DEBUG nova.network.neutron [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [{"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:39.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:39 np0005603622 nova_compute[221301]: 2026-01-31 09:01:39.348 221324 DEBUG oslo_concurrency.lockutils [req-6e5e10c6-6907-4849-a5f8-feda616f17eb req-98686f67-02c1-4623-89dd-808068147ff5 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-fd777b00-402f-4a2e-8a77-2b61c7cf5484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:01:40 np0005603622 nova_compute[221301]: 2026-01-31 09:01:40.425 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:40.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:41.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:41 np0005603622 nova_compute[221301]: 2026-01-31 09:01:41.426 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:41 np0005603622 nova_compute[221301]: 2026-01-31 09:01:41.426 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:01:41 np0005603622 nova_compute[221301]: 2026-01-31 09:01:41.719 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:42 np0005603622 nova_compute[221301]: 2026-01-31 09:01:42.420 221324 DEBUG oslo_concurrency.lockutils [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:42 np0005603622 nova_compute[221301]: 2026-01-31 09:01:42.421 221324 DEBUG oslo_concurrency.lockutils [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:42.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:42 np0005603622 nova_compute[221301]: 2026-01-31 09:01:42.498 221324 INFO nova.compute.manager [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Detaching volume b7cfb970-aa00-4ab7-b895-9c828f0058d5#033[00m
Jan 31 04:01:42 np0005603622 nova_compute[221301]: 2026-01-31 09:01:42.987 221324 INFO nova.virt.block_device [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Attempting to driver detach volume b7cfb970-aa00-4ab7-b895-9c828f0058d5 from mountpoint /dev/vdb#033[00m
Jan 31 04:01:42 np0005603622 nova_compute[221301]: 2026-01-31 09:01:42.998 221324 DEBUG nova.virt.libvirt.driver [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Attempting to detach device vdb from instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 04:01:42 np0005603622 nova_compute[221301]: 2026-01-31 09:01:42.998 221324 DEBUG nova.virt.libvirt.guest [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 04:01:42 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:01:42 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-b7cfb970-aa00-4ab7-b895-9c828f0058d5">
Jan 31 04:01:42 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:01:42 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:01:42 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:01:42 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:01:42 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  <serial>b7cfb970-aa00-4ab7-b895-9c828f0058d5</serial>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:01:43 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.106 221324 INFO nova.virt.libvirt.driver [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully detached device vdb from instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 from the persistent domain config.#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.106 221324 DEBUG nova.virt.libvirt.driver [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.107 221324 DEBUG nova.virt.libvirt.guest [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-b7cfb970-aa00-4ab7-b895-9c828f0058d5">
Jan 31 04:01:43 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  <serial>b7cfb970-aa00-4ab7-b895-9c828f0058d5</serial>
Jan 31 04:01:43 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 31 04:01:43 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:01:43 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 04:01:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:43.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.164 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.215 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769850103.2145088, fd777b00-402f-4a2e-8a77-2b61c7cf5484 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.216 221324 DEBUG nova.virt.libvirt.driver [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.218 221324 INFO nova.virt.libvirt.driver [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully detached device vdb from instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 from the live domain config.#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.761 221324 DEBUG nova.objects.instance [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'flavor' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:01:43 np0005603622 nova_compute[221301]: 2026-01-31 09:01:43.827 221324 DEBUG oslo_concurrency.lockutils [None req-38399795-6908-4ddd-8227-225b7d018486 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:44.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:01:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:01:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:01:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:45.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:45 np0005603622 nova_compute[221301]: 2026-01-31 09:01:45.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:01:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:46.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.700 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.701 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.701 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.702 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.702 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.703 221324 INFO nova.compute.manager [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Terminating instance
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.704 221324 DEBUG nova.compute.manager [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.722 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:46 np0005603622 kernel: tap778471ec-ff (unregistering): left promiscuous mode
Jan 31 04:01:46 np0005603622 NetworkManager[49080]: <info>  [1769850106.8363] device (tap778471ec-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:01:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:01:46Z|00931|binding|INFO|Releasing lport 778471ec-ff3e-4e08-915a-231eee582d31 from this chassis (sb_readonly=0)
Jan 31 04:01:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:01:46Z|00932|binding|INFO|Setting lport 778471ec-ff3e-4e08-915a-231eee582d31 down in Southbound
Jan 31 04:01:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:01:46Z|00933|binding|INFO|Removing iface tap778471ec-ff ovn-installed in OVS
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.846 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.874 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:46.876 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:3e:6f 10.100.0.9'], port_security=['fa:16:3e:82:3e:6f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fd777b00-402f-4a2e-8a77-2b61c7cf5484', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=778471ec-ff3e-4e08-915a-231eee582d31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 04:01:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:46.877 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 778471ec-ff3e-4e08-915a-231eee582d31 in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f unbound from our chassis
Jan 31 04:01:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:46.879 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 31 04:01:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:46.881 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[07624a8a-fa4a-42a7-8b17-34668adbbe6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:46 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:46.881 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace which is not needed anymore
Jan 31 04:01:46 np0005603622 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000bf.scope: Deactivated successfully.
Jan 31 04:01:46 np0005603622 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000bf.scope: Consumed 15.925s CPU time.
Jan 31 04:01:46 np0005603622 systemd-machined[190406]: Machine qemu-96-instance-000000bf terminated.
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.936 221324 INFO nova.virt.libvirt.driver [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Instance destroyed successfully.
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.937 221324 DEBUG nova.objects.instance [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'resources' on Instance uuid fd777b00-402f-4a2e-8a77-2b61c7cf5484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.961 221324 DEBUG nova.virt.libvirt.vif [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T08:59:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2037009249',display_name='tempest-TestMinimumBasicScenario-server-2037009249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2037009249',id=191,image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG2CSeIfukotaly0u8Scx5VRDfS/5VQoZ+GmuWQA0f4RO3nieXzBWi6xcpOtR13R9YeVfMpSFWckgvknkMWPxQDOKBZZlOnVFCYvLnmixnYBttsMr77nYzPf84/NYI6UZg==',key_name='tempest-TestMinimumBasicScenario-718281137',keypairs=<?>,launch_index=0,launched_at=2026-01-31T08:59:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-wpr0dgsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='be36fc17-1fc9-4dc7-9df3-92082489a43b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:00:47Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=fd777b00-402f-4a2e-8a77-2b61c7cf5484,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.962 221324 DEBUG nova.network.os_vif_util [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "778471ec-ff3e-4e08-915a-231eee582d31", "address": "fa:16:3e:82:3e:6f", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap778471ec-ff", "ovs_interfaceid": "778471ec-ff3e-4e08-915a-231eee582d31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.962 221324 DEBUG nova.network.os_vif_util [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.962 221324 DEBUG os_vif [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.964 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.964 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap778471ec-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.966 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.967 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:46 np0005603622 nova_compute[221301]: 2026-01-31 09:01:46.969 221324 INFO os_vif [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:3e:6f,bridge_name='br-int',has_traffic_filtering=True,id=778471ec-ff3e-4e08-915a-231eee582d31,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap778471ec-ff')
Jan 31 04:01:47 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299665]: [NOTICE]   (299687) : haproxy version is 2.8.14-c23fe91
Jan 31 04:01:47 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299665]: [NOTICE]   (299687) : path to executable is /usr/sbin/haproxy
Jan 31 04:01:47 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299665]: [WARNING]  (299687) : Exiting Master process...
Jan 31 04:01:47 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299665]: [ALERT]    (299687) : Current worker (299704) exited with code 143 (Terminated)
Jan 31 04:01:47 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[299665]: [WARNING]  (299687) : All workers exited. Exiting... (0)
Jan 31 04:01:47 np0005603622 systemd[1]: libpod-be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187.scope: Deactivated successfully.
Jan 31 04:01:47 np0005603622 podman[300058]: 2026-01-31 09:01:47.028647181 +0000 UTC m=+0.059867184 container died be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:01:47 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187-userdata-shm.mount: Deactivated successfully.
Jan 31 04:01:47 np0005603622 systemd[1]: var-lib-containers-storage-overlay-17b65ed93ac2201ab439fa3706f08e3335d36585f96a4084dd905b795c6eba3a-merged.mount: Deactivated successfully.
Jan 31 04:01:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:47.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:47 np0005603622 podman[300058]: 2026-01-31 09:01:47.142953781 +0000 UTC m=+0.174173824 container cleanup be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 31 04:01:47 np0005603622 systemd[1]: libpod-conmon-be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187.scope: Deactivated successfully.
Jan 31 04:01:47 np0005603622 podman[300104]: 2026-01-31 09:01:47.235644543 +0000 UTC m=+0.071300850 container remove be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.242 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[24a36752-b818-4ca4-832c-c0642afe823d]: (4, ('Sat Jan 31 09:01:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187)\nbe33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187\nSat Jan 31 09:01:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (be33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187)\nbe33c3733aaa2578f704286b7b7823835623e2357131dde19db50382e042c187\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.245 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[02c8b83f-7b22-4b23-8e49-0684901406f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.247 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:01:47 np0005603622 kernel: tap372c2b16-d0: left promiscuous mode
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.256 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.260 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15ea99f2-fe9c-4aaf-bebd-d70baf103196]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.273 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c045347-fa86-41e9-bdc8-300d144cc721]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.275 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8b0c4a-ef29-4178-8539-7d23526dd754]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.286 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4759e0-c399-4551-96ec-d495189abed4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 929806, 'reachable_time': 18520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300120, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.290 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 31 04:01:47 np0005603622 systemd[1]: run-netns-ovnmeta\x2d372c2b16\x2ddcb3\x2d40d9\x2da594\x2d5fc185a4de0f.mount: Deactivated successfully.
Jan 31 04:01:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:47.291 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[b0570f6e-0886-409e-b524-cf26702df4d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.639 221324 DEBUG nova.compute.manager [req-d316e5e8-3350-46da-a8b3-608a4fc84066 req-162c9503-2392-4d8c-866e-3dc804844c0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-unplugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.640 221324 DEBUG oslo_concurrency.lockutils [req-d316e5e8-3350-46da-a8b3-608a4fc84066 req-162c9503-2392-4d8c-866e-3dc804844c0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.640 221324 DEBUG oslo_concurrency.lockutils [req-d316e5e8-3350-46da-a8b3-608a4fc84066 req-162c9503-2392-4d8c-866e-3dc804844c0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.641 221324 DEBUG oslo_concurrency.lockutils [req-d316e5e8-3350-46da-a8b3-608a4fc84066 req-162c9503-2392-4d8c-866e-3dc804844c0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.641 221324 DEBUG nova.compute.manager [req-d316e5e8-3350-46da-a8b3-608a4fc84066 req-162c9503-2392-4d8c-866e-3dc804844c0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] No waiting events found dispatching network-vif-unplugged-778471ec-ff3e-4e08-915a-231eee582d31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:01:47 np0005603622 nova_compute[221301]: 2026-01-31 09:01:47.641 221324 DEBUG nova.compute.manager [req-d316e5e8-3350-46da-a8b3-608a4fc84066 req-162c9503-2392-4d8c-866e-3dc804844c0e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-unplugged-778471ec-ff3e-4e08-915a-231eee582d31 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 31 04:01:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:48.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:48 np0005603622 nova_compute[221301]: 2026-01-31 09:01:48.745 221324 INFO nova.virt.libvirt.driver [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Deleting instance files /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484_del#033[00m
Jan 31 04:01:48 np0005603622 nova_compute[221301]: 2026-01-31 09:01:48.746 221324 INFO nova.virt.libvirt.driver [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Deletion of /var/lib/nova/instances/fd777b00-402f-4a2e-8a77-2b61c7cf5484_del complete#033[00m
Jan 31 04:01:48 np0005603622 nova_compute[221301]: 2026-01-31 09:01:48.909 221324 INFO nova.compute.manager [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Took 2.21 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:01:48 np0005603622 nova_compute[221301]: 2026-01-31 09:01:48.910 221324 DEBUG oslo.service.loopingcall [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:01:48 np0005603622 nova_compute[221301]: 2026-01-31 09:01:48.910 221324 DEBUG nova.compute.manager [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:01:48 np0005603622 nova_compute[221301]: 2026-01-31 09:01:48.910 221324 DEBUG nova.network.neutron [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:01:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:01:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:49.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:01:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:01:49 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.076 221324 DEBUG nova.compute.manager [req-e8ca49ba-83fb-4c42-8f0e-43da629c85d2 req-32f9f01d-1081-4492-a9be-17b28d193134 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.077 221324 DEBUG oslo_concurrency.lockutils [req-e8ca49ba-83fb-4c42-8f0e-43da629c85d2 req-32f9f01d-1081-4492-a9be-17b28d193134 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.077 221324 DEBUG oslo_concurrency.lockutils [req-e8ca49ba-83fb-4c42-8f0e-43da629c85d2 req-32f9f01d-1081-4492-a9be-17b28d193134 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.077 221324 DEBUG oslo_concurrency.lockutils [req-e8ca49ba-83fb-4c42-8f0e-43da629c85d2 req-32f9f01d-1081-4492-a9be-17b28d193134 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.078 221324 DEBUG nova.compute.manager [req-e8ca49ba-83fb-4c42-8f0e-43da629c85d2 req-32f9f01d-1081-4492-a9be-17b28d193134 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] No waiting events found dispatching network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.078 221324 WARNING nova.compute.manager [req-e8ca49ba-83fb-4c42-8f0e-43da629c85d2 req-32f9f01d-1081-4492-a9be-17b28d193134 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received unexpected event network-vif-plugged-778471ec-ff3e-4e08-915a-231eee582d31 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:50.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.607 221324 DEBUG nova.compute.manager [req-2f96c23d-5f92-431f-b966-4bfb71e2c344 req-c505325b-ae02-4a47-a62d-a3714a4244eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Received event network-vif-deleted-778471ec-ff3e-4e08-915a-231eee582d31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.608 221324 INFO nova.compute.manager [req-2f96c23d-5f92-431f-b966-4bfb71e2c344 req-c505325b-ae02-4a47-a62d-a3714a4244eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Neutron deleted interface 778471ec-ff3e-4e08-915a-231eee582d31; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.609 221324 DEBUG nova.network.neutron [req-2f96c23d-5f92-431f-b966-4bfb71e2c344 req-c505325b-ae02-4a47-a62d-a3714a4244eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.621 221324 DEBUG nova.network.neutron [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.746 221324 INFO nova.compute.manager [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Took 1.84 seconds to deallocate network for instance.#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.757 221324 DEBUG nova.compute.manager [req-2f96c23d-5f92-431f-b966-4bfb71e2c344 req-c505325b-ae02-4a47-a62d-a3714a4244eb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Detach interface failed, port_id=778471ec-ff3e-4e08-915a-231eee582d31, reason: Instance fd777b00-402f-4a2e-8a77-2b61c7cf5484 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.881 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.881 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:01:50 np0005603622 nova_compute[221301]: 2026-01-31 09:01:50.945 221324 DEBUG oslo_concurrency.processutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:01:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:51.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:01:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/340076743' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.358 221324 DEBUG oslo_concurrency.processutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.366 221324 DEBUG nova.compute.provider_tree [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.412 221324 DEBUG nova.scheduler.client.report [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.454 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.532 221324 INFO nova.scheduler.client.report [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Deleted allocations for instance fd777b00-402f-4a2e-8a77-2b61c7cf5484#033[00m
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.722 221324 DEBUG oslo_concurrency.lockutils [None req-3367e3ff-2118-43e9-97ce-f52913aac347 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "fd777b00-402f-4a2e-8a77-2b61c7cf5484" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.800 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:51.800 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:01:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:01:51.801 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:01:51 np0005603622 nova_compute[221301]: 2026-01-31 09:01:51.967 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:52.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:53 np0005603622 podman[300197]: 2026-01-31 09:01:53.140448438 +0000 UTC m=+0.060210223 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:01:53 np0005603622 podman[300196]: 2026-01-31 09:01:53.172020903 +0000 UTC m=+0.091836830 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:01:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e389 e389: 3 total, 3 up, 3 in
Jan 31 04:01:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:54.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:55.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:55 np0005603622 nova_compute[221301]: 2026-01-31 09:01:55.431 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e389 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:01:56 np0005603622 nova_compute[221301]: 2026-01-31 09:01:56.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:56 np0005603622 nova_compute[221301]: 2026-01-31 09:01:56.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:01:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:56.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:56 np0005603622 nova_compute[221301]: 2026-01-31 09:01:56.589 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:01:56 np0005603622 nova_compute[221301]: 2026-01-31 09:01:56.969 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:01:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:57.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e390 e390: 3 total, 3 up, 3 in
Jan 31 04:01:58 np0005603622 nova_compute[221301]: 2026-01-31 09:01:58.153 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:01:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:01:58.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:01:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:01:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:01:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:01:59.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:00 np0005603622 nova_compute[221301]: 2026-01-31 09:02:00.434 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:00.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:01.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:01.804 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:01 np0005603622 nova_compute[221301]: 2026-01-31 09:02:01.936 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850106.9343383, fd777b00-402f-4a2e-8a77-2b61c7cf5484 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:01 np0005603622 nova_compute[221301]: 2026-01-31 09:02:01.937 221324 INFO nova.compute.manager [-] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:02:01 np0005603622 nova_compute[221301]: 2026-01-31 09:02:01.971 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:01 np0005603622 nova_compute[221301]: 2026-01-31 09:02:01.992 221324 DEBUG nova.compute.manager [None req-f6529de1-9179-4eb7-8fe0-c13af4c73ddb - - - - - -] [instance: fd777b00-402f-4a2e-8a77-2b61c7cf5484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:02:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.1 total, 600.0 interval
Cumulative writes: 61K writes, 243K keys, 61K commit groups, 1.0 writes per commit group, ingest: 0.23 GB, 0.04 MB/s
Cumulative WAL: 61K writes, 22K syncs, 2.75 writes per sync, written: 0.23 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 7872 writes, 30K keys, 7872 commit groups, 1.0 writes per commit group, ingest: 29.88 MB, 0.05 MB/s
Interval WAL: 7871 writes, 3051 syncs, 2.58 writes per sync, written: 0.03 GB, 0.05 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction,
Jan 31 04:02:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:02.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e391 e391: 3 total, 3 up, 3 in
Jan 31 04:02:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:03.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.380 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.381 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.436 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:02:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:04.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.619 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.619 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.630 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.631 221324 INFO nova.compute.claims [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:02:04 np0005603622 nova_compute[221301]: 2026-01-31 09:02:04.848 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:05.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3682377593' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.274 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.281 221324 DEBUG nova.compute.provider_tree [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.313 221324 DEBUG nova.scheduler.client.report [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.387 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.387 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.950 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:02:05 np0005603622 nova_compute[221301]: 2026-01-31 09:02:05.951 221324 DEBUG nova.network.neutron [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:02:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e391 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:06 np0005603622 nova_compute[221301]: 2026-01-31 09:02:06.184 221324 DEBUG nova.policy [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b97882caec944ca3b26c230fcebff775', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:02:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:06.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:06 np0005603622 nova_compute[221301]: 2026-01-31 09:02:06.499 221324 INFO nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:02:06 np0005603622 nova_compute[221301]: 2026-01-31 09:02:06.636 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:02:06 np0005603622 nova_compute[221301]: 2026-01-31 09:02:06.974 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:07.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.203 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.205 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.205 221324 INFO nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Creating image(s)#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.237 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.272 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.313 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.318 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "4b94976e1d9a0c9bec88a9c53727eacadd37cabb" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.319 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "4b94976e1d9a0c9bec88a9c53727eacadd37cabb" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:07 np0005603622 nova_compute[221301]: 2026-01-31 09:02:07.661 221324 DEBUG nova.virt.libvirt.imagebackend [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/4ac04f66-def8-4f84-bf0b-3cd9f30ead2e/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/4ac04f66-def8-4f84-bf0b-3cd9f30ead2e/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 04:02:08 np0005603622 nova_compute[221301]: 2026-01-31 09:02:08.106 221324 DEBUG nova.network.neutron [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Successfully created port: 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:02:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:08.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 e392: 3 total, 3 up, 3 in
Jan 31 04:02:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:09.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:09 np0005603622 nova_compute[221301]: 2026-01-31 09:02:09.617 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:09 np0005603622 nova_compute[221301]: 2026-01-31 09:02:09.681 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:09 np0005603622 nova_compute[221301]: 2026-01-31 09:02:09.682 221324 DEBUG nova.virt.images [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] 4ac04f66-def8-4f84-bf0b-3cd9f30ead2e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 31 04:02:09 np0005603622 nova_compute[221301]: 2026-01-31 09:02:09.698 221324 DEBUG nova.privsep.utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 31 04:02:09 np0005603622 nova_compute[221301]: 2026-01-31 09:02:09.700 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.part /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:10 np0005603622 nova_compute[221301]: 2026-01-31 09:02:10.439 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:10.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:11.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:11 np0005603622 nova_compute[221301]: 2026-01-31 09:02:11.280 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.part /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.converted" returned: 0 in 1.580s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:11 np0005603622 nova_compute[221301]: 2026-01-31 09:02:11.284 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:11 np0005603622 nova_compute[221301]: 2026-01-31 09:02:11.346 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb.converted --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:11 np0005603622 nova_compute[221301]: 2026-01-31 09:02:11.348 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "4b94976e1d9a0c9bec88a9c53727eacadd37cabb" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:11 np0005603622 nova_compute[221301]: 2026-01-31 09:02:11.645 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:11 np0005603622 nova_compute[221301]: 2026-01-31 09:02:11.650 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:11 np0005603622 nova_compute[221301]: 2026-01-31 09:02:11.976 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:12 np0005603622 nova_compute[221301]: 2026-01-31 09:02:12.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:12 np0005603622 nova_compute[221301]: 2026-01-31 09:02:12.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:02:12 np0005603622 nova_compute[221301]: 2026-01-31 09:02:12.412 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/4b94976e1d9a0c9bec88a9c53727eacadd37cabb 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.762s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:12.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:12 np0005603622 nova_compute[221301]: 2026-01-31 09:02:12.513 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] resizing rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.023 221324 DEBUG nova.objects.instance [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'migration_context' on Instance uuid 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.155 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.155 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Ensure instance console log exists: /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.156 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.157 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.157 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:02:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:13.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.604 221324 DEBUG nova.network.neutron [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Successfully updated port: 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.895 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 31 04:02:13 np0005603622 nova_compute[221301]: 2026-01-31 09:02:13.896 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:02:14 np0005603622 nova_compute[221301]: 2026-01-31 09:02:14.148 221324 DEBUG nova.compute.manager [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:14 np0005603622 nova_compute[221301]: 2026-01-31 09:02:14.149 221324 DEBUG nova.compute.manager [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing instance network info cache due to event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:02:14 np0005603622 nova_compute[221301]: 2026-01-31 09:02:14.149 221324 DEBUG oslo_concurrency.lockutils [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:14 np0005603622 nova_compute[221301]: 2026-01-31 09:02:14.149 221324 DEBUG oslo_concurrency.lockutils [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:14 np0005603622 nova_compute[221301]: 2026-01-31 09:02:14.149 221324 DEBUG nova.network.neutron [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:02:14 np0005603622 nova_compute[221301]: 2026-01-31 09:02:14.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:14.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:14 np0005603622 nova_compute[221301]: 2026-01-31 09:02:14.831 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:15.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:15 np0005603622 nova_compute[221301]: 2026-01-31 09:02:15.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:16 np0005603622 nova_compute[221301]: 2026-01-31 09:02:16.327 221324 DEBUG nova.network.neutron [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:02:16 np0005603622 nova_compute[221301]: 2026-01-31 09:02:16.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:16 np0005603622 nova_compute[221301]: 2026-01-31 09:02:16.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:02:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:16.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:02:16 np0005603622 nova_compute[221301]: 2026-01-31 09:02:16.980 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:02:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:17.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:02:17 np0005603622 nova_compute[221301]: 2026-01-31 09:02:17.269 221324 DEBUG nova.network.neutron [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:17 np0005603622 nova_compute[221301]: 2026-01-31 09:02:17.324 221324 DEBUG oslo_concurrency.lockutils [req-81c70391-c31a-4bf4-b3d6-22cd7a1d647e req-ddb35406-9d98-44e3-be0c-876541d0f2c6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:17 np0005603622 nova_compute[221301]: 2026-01-31 09:02:17.325 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:17 np0005603622 nova_compute[221301]: 2026-01-31 09:02:17.326 221324 DEBUG nova.network.neutron [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:02:17 np0005603622 nova_compute[221301]: 2026-01-31 09:02:17.933 221324 DEBUG nova.network.neutron [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:02:18 np0005603622 nova_compute[221301]: 2026-01-31 09:02:18.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:18 np0005603622 nova_compute[221301]: 2026-01-31 09:02:18.501 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:18 np0005603622 nova_compute[221301]: 2026-01-31 09:02:18.502 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:18.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:18 np0005603622 nova_compute[221301]: 2026-01-31 09:02:18.503 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:18 np0005603622 nova_compute[221301]: 2026-01-31 09:02:18.503 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:02:18 np0005603622 nova_compute[221301]: 2026-01-31 09:02:18.503 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/962253559' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:18 np0005603622 nova_compute[221301]: 2026-01-31 09:02:18.941 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.163 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.164 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4287MB free_disk=20.921737670898438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.164 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.165 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:19.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.350 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.351 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.351 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.378 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.399 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.400 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.436 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.479 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.566 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:19 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4008166292' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.986 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:19 np0005603622 nova_compute[221301]: 2026-01-31 09:02:19.993 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:02:20 np0005603622 nova_compute[221301]: 2026-01-31 09:02:20.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:20.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.053 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.116 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.116 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.137 221324 DEBUG nova.network.neutron [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:21.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.184 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.184 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance network_info: |[{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.189 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Start _get_guest_xml network_info=[{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T09:01:54Z,direct_url=<?>,disk_format='qcow2',id=4ac04f66-def8-4f84-bf0b-3cd9f30ead2e,min_disk=0,min_ram=0,name='tempest-scenario-img--576066856',owner='3d3e35e8f8144cd58baa994b70ef9668',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T09:01:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '4ac04f66-def8-4f84-bf0b-3cd9f30ead2e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.194 221324 WARNING nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.199 221324 DEBUG nova.virt.libvirt.host [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.200 221324 DEBUG nova.virt.libvirt.host [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.205 221324 DEBUG nova.virt.libvirt.host [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.205 221324 DEBUG nova.virt.libvirt.host [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.207 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.207 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T09:01:54Z,direct_url=<?>,disk_format='qcow2',id=4ac04f66-def8-4f84-bf0b-3cd9f30ead2e,min_disk=0,min_ram=0,name='tempest-scenario-img--576066856',owner='3d3e35e8f8144cd58baa994b70ef9668',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T09:01:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.208 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.208 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.208 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.209 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.209 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.209 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.210 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.210 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.210 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.211 221324 DEBUG nova.virt.hardware [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.214 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.562 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.563 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.602 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:02:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1944266769' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.658 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.682 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.687 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.753 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.754 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.767 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.768 221324 INFO nova.compute.claims [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:02:21 np0005603622 nova_compute[221301]: 2026-01-31 09:02:21.982 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:22 np0005603622 nova_compute[221301]: 2026-01-31 09:02:22.042 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:22 np0005603622 nova_compute[221301]: 2026-01-31 09:02:22.118 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:22.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3093516945' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:22 np0005603622 nova_compute[221301]: 2026-01-31 09:02:22.905 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:22 np0005603622 nova_compute[221301]: 2026-01-31 09:02:22.907 221324 DEBUG nova.virt.libvirt.vif [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:02:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-658263724',display_name='tempest-TestMinimumBasicScenario-server-658263724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-658263724',id=196,image_ref='4ac04f66-def8-4f84-bf0b-3cd9f30ead2e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGA+WuywXL+55/vx7uj7PTx1cz1/ItP/f4HuT/p0jrGyBXYVxdICigpiBOn5njD5HFLs9ouHXHjfSOv771OgNM07YTfPB0CvsdhhrfkbZLffGXg9UUWxiE7KGrMJRQCH+g==',key_name='tempest-TestMinimumBasicScenario-2070658277',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-gck4f39w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac04f66-def8-4f84-bf0b-3cd9f30ead2e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:02:06Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:02:22 np0005603622 nova_compute[221301]: 2026-01-31 09:02:22.907 221324 DEBUG nova.network.os_vif_util [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:22 np0005603622 nova_compute[221301]: 2026-01-31 09:02:22.909 221324 DEBUG nova.network.os_vif_util [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:04:1c,bridge_name='br-int',has_traffic_filtering=True,id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3154ceb6-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:22 np0005603622 nova_compute[221301]: 2026-01-31 09:02:22.910 221324 DEBUG nova.objects.instance [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'pci_devices' on Instance uuid 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/253071788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:23 np0005603622 nova_compute[221301]: 2026-01-31 09:02:23.025 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.983s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:23 np0005603622 nova_compute[221301]: 2026-01-31 09:02:23.033 221324 DEBUG nova.compute.provider_tree [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:02:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:23.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:23 np0005603622 nova_compute[221301]: 2026-01-31 09:02:23.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:23 np0005603622 nova_compute[221301]: 2026-01-31 09:02:23.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:02:24 np0005603622 podman[300574]: 2026-01-31 09:02:24.130246904 +0000 UTC m=+0.046939748 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:02:24 np0005603622 podman[300573]: 2026-01-31 09:02:24.166431952 +0000 UTC m=+0.080543497 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.379 221324 DEBUG nova.scheduler.client.report [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.385 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <uuid>83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6</uuid>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <name>instance-000000c4</name>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestMinimumBasicScenario-server-658263724</nova:name>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:02:21</nova:creationTime>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:user uuid="b97882caec944ca3b26c230fcebff775">tempest-TestMinimumBasicScenario-1062170996-project-member</nova:user>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:project uuid="3d3e35e8f8144cd58baa994b70ef9668">tempest-TestMinimumBasicScenario-1062170996</nova:project>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="4ac04f66-def8-4f84-bf0b-3cd9f30ead2e"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <nova:port uuid="3154ceb6-2afa-43f3-bb3a-ddf285ae43eb">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <entry name="serial">83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6</entry>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <entry name="uuid">83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6</entry>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk.config">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:f7:04:1c"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <target dev="tap3154ceb6-2a"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/console.log" append="off"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:02:24 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:02:24 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:02:24 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:02:24 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.386 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Preparing to wait for external event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.387 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.387 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.387 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.388 221324 DEBUG nova.virt.libvirt.vif [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:02:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-658263724',display_name='tempest-TestMinimumBasicScenario-server-658263724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-658263724',id=196,image_ref='4ac04f66-def8-4f84-bf0b-3cd9f30ead2e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGA+WuywXL+55/vx7uj7PTx1cz1/ItP/f4HuT/p0jrGyBXYVxdICigpiBOn5njD5HFLs9ouHXHjfSOv771OgNM07YTfPB0CvsdhhrfkbZLffGXg9UUWxiE7KGrMJRQCH+g==',key_name='tempest-TestMinimumBasicScenario-2070658277',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-gck4f39w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac04f66-def8-4f84-bf0b-3cd9f30ead2e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:02:06Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.388 221324 DEBUG nova.network.os_vif_util [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.389 221324 DEBUG nova.network.os_vif_util [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:04:1c,bridge_name='br-int',has_traffic_filtering=True,id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3154ceb6-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.389 221324 DEBUG os_vif [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:04:1c,bridge_name='br-int',has_traffic_filtering=True,id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3154ceb6-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.390 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.391 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.394 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.395 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3154ceb6-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.395 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3154ceb6-2a, col_values=(('external_ids', {'iface-id': '3154ceb6-2afa-43f3-bb3a-ddf285ae43eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:04:1c', 'vm-uuid': '83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.397 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:24 np0005603622 NetworkManager[49080]: <info>  [1769850144.4003] manager: (tap3154ceb6-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/409)
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.400 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.405 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.405 221324 INFO os_vif [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:04:1c,bridge_name='br-int',has_traffic_filtering=True,id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3154ceb6-2a')#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.436 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.437 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:02:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:24.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.524 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.524 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.524 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No VIF found with MAC fa:16:3e:f7:04:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.525 221324 INFO nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Using config drive#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.549 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.641 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.641 221324 DEBUG nova.network.neutron [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.699 221324 INFO nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.742 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:02:24 np0005603622 nova_compute[221301]: 2026-01-31 09:02:24.818 221324 INFO nova.virt.block_device [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Booting with volume snapshot 714b73f4-e266-4c1a-8a63-cf91334700b6 at /dev/vda#033[00m
Jan 31 04:02:25 np0005603622 nova_compute[221301]: 2026-01-31 09:02:25.051 221324 DEBUG nova.policy [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc42b92a5dd34d32b6b184bdc7acb092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76ce367a834b49dfb5b436848118b860', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:02:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:25.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:25 np0005603622 nova_compute[221301]: 2026-01-31 09:02:25.398 221324 INFO nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Creating config drive at /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/disk.config#033[00m
Jan 31 04:02:25 np0005603622 nova_compute[221301]: 2026-01-31 09:02:25.401 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmparma1qkm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:25 np0005603622 nova_compute[221301]: 2026-01-31 09:02:25.445 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:25 np0005603622 nova_compute[221301]: 2026-01-31 09:02:25.532 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmparma1qkm" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:25 np0005603622 nova_compute[221301]: 2026-01-31 09:02:25.561 221324 DEBUG nova.storage.rbd_utils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] rbd image 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:25 np0005603622 nova_compute[221301]: 2026-01-31 09:02:25.567 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/disk.config 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #166. Immutable memtables: 0.
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.159752) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 105] Flushing memtable with next log file: 166
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146159805, "job": 105, "event": "flush_started", "num_memtables": 1, "num_entries": 1813, "num_deletes": 255, "total_data_size": 3911132, "memory_usage": 3967328, "flush_reason": "Manual Compaction"}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 105] Level-0 flush table #167: started
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146176498, "cf_name": "default", "job": 105, "event": "table_file_creation", "file_number": 167, "file_size": 2554761, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 80433, "largest_seqno": 82241, "table_properties": {"data_size": 2547338, "index_size": 4301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16335, "raw_average_key_size": 20, "raw_value_size": 2532193, "raw_average_value_size": 3185, "num_data_blocks": 189, "num_entries": 795, "num_filter_entries": 795, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850008, "oldest_key_time": 1769850008, "file_creation_time": 1769850146, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 167, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 105] Flush lasted 16800 microseconds, and 5665 cpu microseconds.
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.176554) [db/flush_job.cc:967] [default] [JOB 105] Level-0 flush table #167: 2554761 bytes OK
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.176573) [db/memtable_list.cc:519] [default] Level-0 commit table #167 started
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.179253) [db/memtable_list.cc:722] [default] Level-0 commit table #167: memtable #1 done
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.179350) EVENT_LOG_v1 {"time_micros": 1769850146179334, "job": 105, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.179387) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 105] Try to delete WAL files size 3902914, prev total WAL file size 3902914, number of live WAL files 2.
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000163.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.180506) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037303238' seq:72057594037927935, type:22 .. '7061786F730037323830' seq:0, type:0; will stop at (end)
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 106] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 105 Base level 0, inputs: [167(2494KB)], [165(12MB)]
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146180571, "job": 106, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [167], "files_L6": [165], "score": -1, "input_data_size": 15526412, "oldest_snapshot_seqno": -1}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 106] Generated table #168: 10225 keys, 13698466 bytes, temperature: kUnknown
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146266244, "cf_name": "default", "job": 106, "event": "table_file_creation", "file_number": 168, "file_size": 13698466, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13631632, "index_size": 40161, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 269724, "raw_average_key_size": 26, "raw_value_size": 13452169, "raw_average_value_size": 1315, "num_data_blocks": 1532, "num_entries": 10225, "num_filter_entries": 10225, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850146, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 168, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.266521) [db/compaction/compaction_job.cc:1663] [default] [JOB 106] Compacted 1@0 + 1@6 files to L6 => 13698466 bytes
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.268787) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.0 rd, 159.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 12.4 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(11.4) write-amplify(5.4) OK, records in: 10752, records dropped: 527 output_compression: NoCompression
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.268835) EVENT_LOG_v1 {"time_micros": 1769850146268819, "job": 106, "event": "compaction_finished", "compaction_time_micros": 85783, "compaction_time_cpu_micros": 31321, "output_level": 6, "num_output_files": 1, "total_output_size": 13698466, "num_input_records": 10752, "num_output_records": 10225, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000167.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146269531, "job": 106, "event": "table_file_deletion", "file_number": 167}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000165.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850146271700, "job": 106, "event": "table_file_deletion", "file_number": 165}
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.180427) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.271801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.271806) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.271807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.271809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:02:26.271810) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:02:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:26.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:26 np0005603622 nova_compute[221301]: 2026-01-31 09:02:26.808 221324 DEBUG oslo_concurrency.processutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/disk.config 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.241s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:26 np0005603622 nova_compute[221301]: 2026-01-31 09:02:26.809 221324 INFO nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Deleting local config drive /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6/disk.config because it was imported into RBD.#033[00m
Jan 31 04:02:26 np0005603622 kernel: tap3154ceb6-2a: entered promiscuous mode
Jan 31 04:02:26 np0005603622 NetworkManager[49080]: <info>  [1769850146.8651] manager: (tap3154ceb6-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/410)
Jan 31 04:02:26 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:26Z|00934|binding|INFO|Claiming lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for this chassis.
Jan 31 04:02:26 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:26Z|00935|binding|INFO|3154ceb6-2afa-43f3-bb3a-ddf285ae43eb: Claiming fa:16:3e:f7:04:1c 10.100.0.10
Jan 31 04:02:26 np0005603622 nova_compute[221301]: 2026-01-31 09:02:26.866 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:26Z|00936|binding|INFO|Setting lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb ovn-installed in OVS
Jan 31 04:02:26 np0005603622 nova_compute[221301]: 2026-01-31 09:02:26.871 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603622 nova_compute[221301]: 2026-01-31 09:02:26.871 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603622 nova_compute[221301]: 2026-01-31 09:02:26.873 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:26 np0005603622 systemd-machined[190406]: New machine qemu-97-instance-000000c4.
Jan 31 04:02:26 np0005603622 systemd[1]: Started Virtual Machine qemu-97-instance-000000c4.
Jan 31 04:02:26 np0005603622 systemd-udevd[300694]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:02:26 np0005603622 NetworkManager[49080]: <info>  [1769850146.9385] device (tap3154ceb6-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:02:26 np0005603622 NetworkManager[49080]: <info>  [1769850146.9399] device (tap3154ceb6-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:02:27 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:27Z|00937|binding|INFO|Setting lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb up in Southbound
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.104 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:04:1c 10.100.0.10'], port_security=['fa:16:3e:f7:04:1c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.106 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f bound to our chassis#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.108 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.121 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5928c0c4-ec59-40b6-9407-d80dab4a1ea3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.122 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap372c2b16-d1 in ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.124 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap372c2b16-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.124 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cb70e1a2-b0b8-45cf-9003-ea8610e76d3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.125 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[18535222-eac6-43a1-aac7-ccfa998f43f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.140 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4e2f16-eedd-40a9-9e6c-c8ef3e69a704]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:02:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.0 total, 600.0 interval#012Cumulative writes: 16K writes, 82K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.03 MB/s#012Cumulative WAL: 16K writes, 16K syncs, 1.00 writes per sync, written: 0.16 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1720 writes, 8581 keys, 1720 commit groups, 1.0 writes per commit group, ingest: 16.43 MB, 0.03 MB/s#012Interval WAL: 1720 writes, 1720 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     35.9      2.88              0.33        53    0.054       0      0       0.0       0.0#012  L6      1/0   13.06 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.2     83.8     71.7      7.47              1.70        52    0.144    390K    28K       0.0       0.0#012 Sum      1/0   13.06 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.2     60.5     61.8     10.35              2.03       105    0.099    390K    28K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.1     49.6     50.9      1.90              0.25        14    0.136     72K   3662       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0     83.8     71.7      7.47              1.70        52    0.144    390K    28K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     36.0      2.88              0.33        52    0.055       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.0 total, 600.0 interval#012Flush(GB): cumulative 0.101, interval 0.013#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.62 GB write, 0.11 MB/s write, 0.61 GB read, 0.10 MB/s read, 10.3 seconds#012Interval compaction: 0.09 GB write, 0.16 MB/s write, 0.09 GB read, 0.16 MB/s read, 1.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 68.61 MB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 0 last_secs: 0.000474 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(3893,65.79 MB,21.6404%) FilterBlock(105,1.05 MB,0.346289%) IndexBlock(105,1.77 MB,0.582128%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.166 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6a0c53-7ef4-4140-9164-99df81b3677a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:27.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.195 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[61a3bb33-7d31-47d9-b889-f03efa74f785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.200 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6a843ded-dc18-4048-986a-5e1bd26ced2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 NetworkManager[49080]: <info>  [1769850147.2014] manager: (tap372c2b16-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/411)
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.229 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9db3500c-dada-4c2a-9cd4-11be22a0c82f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.233 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ba1ba3-7e4f-45b3-b96f-0e1c839abf57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 NetworkManager[49080]: <info>  [1769850147.2510] device (tap372c2b16-d0): carrier: link connected
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.252 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8d2285-6d21-42a0-ae39-a407bd83cf52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.268 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a39a9f9a-7eb7-4ba2-9fc3-c51ecc8a6711]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 281], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 939859, 'reachable_time': 36581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300745, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.282 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[41469caf-f20c-4104-ad70-5f936cd158df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:3847'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 939859, 'tstamp': 939859}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 300746, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.296 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2aedd182-406f-4840-803c-995518066950]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 281], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 939859, 'reachable_time': 36581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 300747, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.317 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd4a226-15b9-44b8-b0b0-80261cea9e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.359 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[23d98bfd-dd9e-4667-bf88-eba2f36f596d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.360 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.360 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.361 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap372c2b16-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:27 np0005603622 NetworkManager[49080]: <info>  [1769850147.3635] manager: (tap372c2b16-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/412)
Jan 31 04:02:27 np0005603622 nova_compute[221301]: 2026-01-31 09:02:27.362 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:27 np0005603622 kernel: tap372c2b16-d0: entered promiscuous mode
Jan 31 04:02:27 np0005603622 nova_compute[221301]: 2026-01-31 09:02:27.365 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.365 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap372c2b16-d0, col_values=(('external_ids', {'iface-id': 'a077e97d-4d5f-408b-a598-1a37aa5f7556'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:27 np0005603622 nova_compute[221301]: 2026-01-31 09:02:27.366 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:27 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:27Z|00938|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:02:27 np0005603622 nova_compute[221301]: 2026-01-31 09:02:27.375 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.376 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.377 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6f832b61-ef0a-4bb5-9ccb-ab68a7cd1ccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.377 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:02:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:27.378 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'env', 'PROCESS_TAG=haproxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/372c2b16-dcb3-40d9-a594-5fc185a4de0f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:02:27 np0005603622 podman[300797]: 2026-01-31 09:02:27.691408946 +0000 UTC m=+0.052494756 container create 2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:02:27 np0005603622 systemd[1]: Started libpod-conmon-2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933.scope.
Jan 31 04:02:27 np0005603622 podman[300797]: 2026-01-31 09:02:27.661500286 +0000 UTC m=+0.022586096 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:02:27 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:02:27 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66103a895ade2f7563253136d89e67217ebbee68dd4d10b936a4e6870a9fa82e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:02:27 np0005603622 podman[300797]: 2026-01-31 09:02:27.786817861 +0000 UTC m=+0.147903701 container init 2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 31 04:02:27 np0005603622 podman[300797]: 2026-01-31 09:02:27.791748563 +0000 UTC m=+0.152834373 container start 2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 04:02:27 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [NOTICE]   (300821) : New worker (300823) forked
Jan 31 04:02:27 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [NOTICE]   (300821) : Loading success.
Jan 31 04:02:27 np0005603622 nova_compute[221301]: 2026-01-31 09:02:27.885 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850147.8844004, 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:27 np0005603622 nova_compute[221301]: 2026-01-31 09:02:27.886 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] VM Started (Lifecycle Event)#033[00m
Jan 31 04:02:28 np0005603622 nova_compute[221301]: 2026-01-31 09:02:28.313 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:28 np0005603622 nova_compute[221301]: 2026-01-31 09:02:28.318 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850147.8846989, 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:28 np0005603622 nova_compute[221301]: 2026-01-31 09:02:28.319 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:02:28 np0005603622 nova_compute[221301]: 2026-01-31 09:02:28.402 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:28 np0005603622 nova_compute[221301]: 2026-01-31 09:02:28.408 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:02:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:28.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:28 np0005603622 nova_compute[221301]: 2026-01-31 09:02:28.521 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.021 221324 DEBUG nova.network.neutron [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Successfully created port: 0b483969-3772-416a-8ac5-352deae8b96e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:02:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:29.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.399 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.458 221324 DEBUG nova.compute.manager [req-8b4715b4-eeec-4d84-a7cd-57894d85a7ae req-3d50629b-b467-488a-b01e-5714a9ffd8c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.459 221324 DEBUG oslo_concurrency.lockutils [req-8b4715b4-eeec-4d84-a7cd-57894d85a7ae req-3d50629b-b467-488a-b01e-5714a9ffd8c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.460 221324 DEBUG oslo_concurrency.lockutils [req-8b4715b4-eeec-4d84-a7cd-57894d85a7ae req-3d50629b-b467-488a-b01e-5714a9ffd8c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.461 221324 DEBUG oslo_concurrency.lockutils [req-8b4715b4-eeec-4d84-a7cd-57894d85a7ae req-3d50629b-b467-488a-b01e-5714a9ffd8c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.461 221324 DEBUG nova.compute.manager [req-8b4715b4-eeec-4d84-a7cd-57894d85a7ae req-3d50629b-b467-488a-b01e-5714a9ffd8c2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Processing event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.463 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.468 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850149.4678597, 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.468 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.475 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.479 221324 INFO nova.virt.libvirt.driver [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance spawned successfully.#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.480 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.634 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.641 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.646 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.647 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.647 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.648 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.648 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.649 221324 DEBUG nova.virt.libvirt.driver [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:29 np0005603622 nova_compute[221301]: 2026-01-31 09:02:29.934 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:02:30 np0005603622 nova_compute[221301]: 2026-01-31 09:02:30.191 221324 INFO nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Took 22.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:02:30 np0005603622 nova_compute[221301]: 2026-01-31 09:02:30.192 221324 DEBUG nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:30 np0005603622 nova_compute[221301]: 2026-01-31 09:02:30.448 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:30.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:30 np0005603622 nova_compute[221301]: 2026-01-31 09:02:30.585 221324 INFO nova.compute.manager [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Took 26.04 seconds to build instance.#033[00m
Jan 31 04:02:30 np0005603622 nova_compute[221301]: 2026-01-31 09:02:30.787 221324 DEBUG oslo_concurrency.lockutils [None req-62e84c02-272b-483c-837d-08258706eefc b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:30.921 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:30.922 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:30.922 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.179 221324 DEBUG nova.network.neutron [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Successfully updated port: 0b483969-3772-416a-8ac5-352deae8b96e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:02:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:31.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.439 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "refresh_cache-8828ce70-2ab6-41e3-ad2a-51031d11bf31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.439 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquired lock "refresh_cache-8828ce70-2ab6-41e3-ad2a-51031d11bf31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.439 221324 DEBUG nova.network.neutron [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.845 221324 DEBUG nova.compute.manager [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.845 221324 DEBUG oslo_concurrency.lockutils [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.846 221324 DEBUG oslo_concurrency.lockutils [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.846 221324 DEBUG oslo_concurrency.lockutils [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.846 221324 DEBUG nova.compute.manager [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] No waiting events found dispatching network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.846 221324 WARNING nova.compute.manager [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received unexpected event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for instance with vm_state active and task_state None.#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.847 221324 DEBUG nova.compute.manager [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received event network-changed-0b483969-3772-416a-8ac5-352deae8b96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.847 221324 DEBUG nova.compute.manager [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Refreshing instance network info cache due to event network-changed-0b483969-3772-416a-8ac5-352deae8b96e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:02:31 np0005603622 nova_compute[221301]: 2026-01-31 09:02:31.847 221324 DEBUG oslo_concurrency.lockutils [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-8828ce70-2ab6-41e3-ad2a-51031d11bf31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:32.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:32 np0005603622 nova_compute[221301]: 2026-01-31 09:02:32.994 221324 DEBUG nova.network.neutron [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:02:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:33.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.650 221324 DEBUG os_brick.utils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.653 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.662 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.662 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[c6914437-b472-4a8c-ae70-deebb9ab1974]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.664 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.671 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.671 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[60a00895-a4c3-4115-921e-280d2210b634]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.673 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.683 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.683 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[6617a1b5-1abb-4e88-8141-b4fdea3106e8]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.685 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd7150f-86b9-49a3-9024-926503cb1781]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.686 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.707 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "nvme version" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.712 221324 DEBUG os_brick.initiator.connectors.lightos [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.713 221324 DEBUG os_brick.initiator.connectors.lightos [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.713 221324 DEBUG os_brick.initiator.connectors.lightos [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.714 221324 DEBUG os_brick.utils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] <== get_connector_properties: return (62ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:02:33 np0005603622 nova_compute[221301]: 2026-01-31 09:02:33.714 221324 DEBUG nova.virt.block_device [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Updating existing volume attachment record: cd9bf7d3-fc9f-40e2-b948-4891744e03c7 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:02:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:34.282 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:02:34 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:34.284 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:02:34 np0005603622 nova_compute[221301]: 2026-01-31 09:02:34.285 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:34 np0005603622 nova_compute[221301]: 2026-01-31 09:02:34.401 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:34.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2367798098' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:35.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:35 np0005603622 nova_compute[221301]: 2026-01-31 09:02:35.388 221324 DEBUG nova.network.neutron [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Updating instance_info_cache with network_info: [{"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:35 np0005603622 nova_compute[221301]: 2026-01-31 09:02:35.443 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Releasing lock "refresh_cache-8828ce70-2ab6-41e3-ad2a-51031d11bf31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:35 np0005603622 nova_compute[221301]: 2026-01-31 09:02:35.444 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Instance network_info: |[{"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:02:35 np0005603622 nova_compute[221301]: 2026-01-31 09:02:35.445 221324 DEBUG oslo_concurrency.lockutils [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-8828ce70-2ab6-41e3-ad2a-51031d11bf31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:35 np0005603622 nova_compute[221301]: 2026-01-31 09:02:35.445 221324 DEBUG nova.network.neutron [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Refreshing network info cache for port 0b483969-3772-416a-8ac5-352deae8b96e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:02:35 np0005603622 nova_compute[221301]: 2026-01-31 09:02:35.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:36.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:37.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.704 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.705 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.706 221324 INFO nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Creating image(s)#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.706 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.706 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Ensure instance console log exists: /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.706 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.707 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.707 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.709 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Start _get_guest_xml network_info=[{"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'boot_index': 0, 'attachment_id': 'cd9bf7d3-fc9f-40e2-b948-4891744e03c7', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-d9130530-fbc3-4fb7-957a-e45c619660f3', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'd9130530-fbc3-4fb7-957a-e45c619660f3', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '8828ce70-2ab6-41e3-ad2a-51031d11bf31', 'attached_at': '', 'detached_at': '', 'volume_id': 'd9130530-fbc3-4fb7-957a-e45c619660f3', 'serial': 'd9130530-fbc3-4fb7-957a-e45c619660f3'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.714 221324 WARNING nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.729 221324 DEBUG nova.virt.libvirt.host [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.729 221324 DEBUG nova.virt.libvirt.host [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.736 221324 DEBUG nova.virt.libvirt.host [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.736 221324 DEBUG nova.virt.libvirt.host [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.738 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.738 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.738 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.739 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.740 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.740 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.741 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.741 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.742 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.742 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.742 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.742 221324 DEBUG nova.virt.hardware [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.780 221324 DEBUG nova.storage.rbd_utils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 8828ce70-2ab6-41e3-ad2a-51031d11bf31_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:37 np0005603622 nova_compute[221301]: 2026-01-31 09:02:37.786 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3502741021' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:38 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:38.286 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.339 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.386 221324 DEBUG nova.virt.libvirt.vif [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:02:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1187114013',display_name='tempest-TestVolumeBootPattern-server-1187114013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1187114013',id=197,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-r0fbqy36',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_ce
rts=None,updated_at=2026-01-31T09:02:24Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=8828ce70-2ab6-41e3-ad2a-51031d11bf31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.386 221324 DEBUG nova.network.os_vif_util [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.387 221324 DEBUG nova.network.os_vif_util [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:91:28,bridge_name='br-int',has_traffic_filtering=True,id=0b483969-3772-416a-8ac5-352deae8b96e,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b483969-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.388 221324 DEBUG nova.objects.instance [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8828ce70-2ab6-41e3-ad2a-51031d11bf31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.426 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <uuid>8828ce70-2ab6-41e3-ad2a-51031d11bf31</uuid>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <name>instance-000000c5</name>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestVolumeBootPattern-server-1187114013</nova:name>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:02:37</nova:creationTime>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:user uuid="dc42b92a5dd34d32b6b184bdc7acb092">tempest-TestVolumeBootPattern-1392945362-project-member</nova:user>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:project uuid="76ce367a834b49dfb5b436848118b860">tempest-TestVolumeBootPattern-1392945362</nova:project>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <nova:port uuid="0b483969-3772-416a-8ac5-352deae8b96e">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <entry name="serial">8828ce70-2ab6-41e3-ad2a-51031d11bf31</entry>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <entry name="uuid">8828ce70-2ab6-41e3-ad2a-51031d11bf31</entry>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/8828ce70-2ab6-41e3-ad2a-51031d11bf31_disk.config">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-d9130530-fbc3-4fb7-957a-e45c619660f3">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <serial>d9130530-fbc3-4fb7-957a-e45c619660f3</serial>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:2d:91:28"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <target dev="tap0b483969-37"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/console.log" append="off"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:02:38 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:02:38 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:02:38 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:02:38 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.427 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Preparing to wait for external event network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.428 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.428 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.429 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.429 221324 DEBUG nova.virt.libvirt.vif [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:02:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1187114013',display_name='tempest-TestVolumeBootPattern-server-1187114013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1187114013',id=197,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-r0fbqy36',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,
trusted_certs=None,updated_at=2026-01-31T09:02:24Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=8828ce70-2ab6-41e3-ad2a-51031d11bf31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.431 221324 DEBUG nova.network.os_vif_util [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.431 221324 DEBUG nova.network.os_vif_util [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:91:28,bridge_name='br-int',has_traffic_filtering=True,id=0b483969-3772-416a-8ac5-352deae8b96e,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b483969-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.432 221324 DEBUG os_vif [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:91:28,bridge_name='br-int',has_traffic_filtering=True,id=0b483969-3772-416a-8ac5-352deae8b96e,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b483969-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.433 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.434 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.438 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.438 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b483969-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.439 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b483969-37, col_values=(('external_ids', {'iface-id': '0b483969-3772-416a-8ac5-352deae8b96e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:91:28', 'vm-uuid': '8828ce70-2ab6-41e3-ad2a-51031d11bf31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:38 np0005603622 NetworkManager[49080]: <info>  [1769850158.4420] manager: (tap0b483969-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.448 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.451 221324 INFO os_vif [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:91:28,bridge_name='br-int',has_traffic_filtering=True,id=0b483969-3772-416a-8ac5-352deae8b96e,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b483969-37')#033[00m
Jan 31 04:02:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:38.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.625 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.626 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.627 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No VIF found with MAC fa:16:3e:2d:91:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.627 221324 INFO nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Using config drive#033[00m
Jan 31 04:02:38 np0005603622 nova_compute[221301]: 2026-01-31 09:02:38.665 221324 DEBUG nova.storage.rbd_utils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 8828ce70-2ab6-41e3-ad2a-51031d11bf31_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:39.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.390 221324 DEBUG nova.network.neutron [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Updated VIF entry in instance network info cache for port 0b483969-3772-416a-8ac5-352deae8b96e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.390 221324 DEBUG nova.network.neutron [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Updating instance_info_cache with network_info: [{"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.421 221324 DEBUG oslo_concurrency.lockutils [req-f96d8ecb-347c-49a4-b849-0fbfda46e2cd req-a5fecd57-ff1d-464e-b0bc-5ca45d609907 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-8828ce70-2ab6-41e3-ad2a-51031d11bf31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.725 221324 INFO nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Creating config drive at /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/disk.config#033[00m
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.734 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpe2uen3zt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.871 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpe2uen3zt" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.916 221324 DEBUG nova.storage.rbd_utils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 8828ce70-2ab6-41e3-ad2a-51031d11bf31_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:02:39 np0005603622 nova_compute[221301]: 2026-01-31 09:02:39.920 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/disk.config 8828ce70-2ab6-41e3-ad2a-51031d11bf31_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.240 221324 DEBUG oslo_concurrency.processutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/disk.config 8828ce70-2ab6-41e3-ad2a-51031d11bf31_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.241 221324 INFO nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Deleting local config drive /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31/disk.config because it was imported into RBD.#033[00m
Jan 31 04:02:40 np0005603622 kernel: tap0b483969-37: entered promiscuous mode
Jan 31 04:02:40 np0005603622 NetworkManager[49080]: <info>  [1769850160.2971] manager: (tap0b483969-37): new Tun device (/org/freedesktop/NetworkManager/Devices/414)
Jan 31 04:02:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:40Z|00939|binding|INFO|Claiming lport 0b483969-3772-416a-8ac5-352deae8b96e for this chassis.
Jan 31 04:02:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:40Z|00940|binding|INFO|0b483969-3772-416a-8ac5-352deae8b96e: Claiming fa:16:3e:2d:91:28 10.100.0.3
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.302 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:40Z|00941|binding|INFO|Setting lport 0b483969-3772-416a-8ac5-352deae8b96e ovn-installed in OVS
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.307 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.309 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:40 np0005603622 systemd-udevd[300952]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:02:40 np0005603622 NetworkManager[49080]: <info>  [1769850160.3324] device (tap0b483969-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:02:40 np0005603622 NetworkManager[49080]: <info>  [1769850160.3331] device (tap0b483969-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:02:40 np0005603622 systemd-machined[190406]: New machine qemu-98-instance-000000c5.
Jan 31 04:02:40 np0005603622 systemd[1]: Started Virtual Machine qemu-98-instance-000000c5.
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:40.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.929 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850160.9293766, 8828ce70-2ab6-41e3-ad2a-51031d11bf31 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:40 np0005603622 nova_compute[221301]: 2026-01-31 09:02:40.930 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] VM Started (Lifecycle Event)#033[00m
Jan 31 04:02:41 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:41Z|00942|binding|INFO|Setting lport 0b483969-3772-416a-8ac5-352deae8b96e up in Southbound
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.194 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:91:28 10.100.0.3'], port_security=['fa:16:3e:2d:91:28 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8828ce70-2ab6-41e3-ad2a-51031d11bf31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b400fbac-34cb-4e36-b84a-e8d6447b9bc2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=0b483969-3772-416a-8ac5-352deae8b96e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.197 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 0b483969-3772-416a-8ac5-352deae8b96e in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 bound to our chassis#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.201 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 650eb345-8346-4e8f-8e83-eeb0117654f6#033[00m
Jan 31 04:02:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:41.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.205 221324 DEBUG oslo_concurrency.lockutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.205 221324 DEBUG oslo_concurrency.lockutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.209 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eadd9750-ba92-4f51-af55-fd0f0358da83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.210 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap650eb345-81 in ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.212 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap650eb345-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.212 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe470950-54ed-466c-b66e-b230dff72f41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.213 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[04820d83-8e6e-4aae-954f-85c28f6d8aa9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.224 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[cb24ef31-4611-4c13-a2ac-b0976de857b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.238 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.243 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850160.929503, 8828ce70-2ab6-41e3-ad2a-51031d11bf31 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.243 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.250 221324 DEBUG nova.objects.instance [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'flavor' on Instance uuid 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.252 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6316197f-42cb-4428-8290-fc939cab7f1c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.277 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.281 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.281 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7306d521-c814-45b8-b27e-7d9559a9014c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.285 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[34b2c2ab-65e9-4229-88d6-c23450567ebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 NetworkManager[49080]: <info>  [1769850161.2875] manager: (tap650eb345-80): new Veth device (/org/freedesktop/NetworkManager/Devices/415)
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.316 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b6b337-53e6-4a10-a5d4-2fa6f02eb162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.320 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.320 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[ca820860-3fbf-4b86-be5f-23bb790ff153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.332 221324 DEBUG oslo_concurrency.lockutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:41 np0005603622 NetworkManager[49080]: <info>  [1769850161.3385] device (tap650eb345-80): carrier: link connected
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.342 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e2aeedce-bc51-46f3-b96f-f44e4a84eb96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.359 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[697adc41-a1b0-4e47-9766-56d9b19329cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 941267, 'reachable_time': 25466, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301029, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.375 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[777e7e9a-f686-4447-9dbc-6c2383b2d1de]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:27ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 941267, 'tstamp': 941267}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 301030, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.396 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2584e01d-2f0f-4e7f-8a4b-550d4d7d736e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 941267, 'reachable_time': 25466, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 301031, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.446 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[62d52388-3760-483b-9e23-4970a2424c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.496 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f9c216-0140-4224-a7e1-42749d69c17f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.498 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.499 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650eb345-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.502 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:41 np0005603622 kernel: tap650eb345-80: entered promiscuous mode
Jan 31 04:02:41 np0005603622 NetworkManager[49080]: <info>  [1769850161.5035] manager: (tap650eb345-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.504 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.506 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap650eb345-80, col_values=(('external_ids', {'iface-id': '74bde109-0188-4ce3-87c3-02a3eb853dc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:41 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:41Z|00943|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.508 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.510 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.511 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6276f986-3685-453e-8de9-5b92089e0fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:41 np0005603622 nova_compute[221301]: 2026-01-31 09:02:41.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.512 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:02:41 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:41.513 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'env', 'PROCESS_TAG=haproxy-650eb345-8346-4e8f-8e83-eeb0117654f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/650eb345-8346-4e8f-8e83-eeb0117654f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:02:41 np0005603622 podman[301064]: 2026-01-31 09:02:41.921631458 +0000 UTC m=+0.063575903 container create d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:02:41 np0005603622 systemd[1]: Started libpod-conmon-d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa.scope.
Jan 31 04:02:41 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:02:41 np0005603622 podman[301064]: 2026-01-31 09:02:41.883018284 +0000 UTC m=+0.024962719 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:02:41 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b7c91b4548673c9dcacb82ccdf6bfb535f8cb269555f174e4afe37e96d59302/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:02:42 np0005603622 podman[301064]: 2026-01-31 09:02:42.004026513 +0000 UTC m=+0.145971028 container init d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 04:02:42 np0005603622 podman[301064]: 2026-01-31 09:02:42.008064321 +0000 UTC m=+0.150008776 container start d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.015 221324 DEBUG oslo_concurrency.lockutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.015 221324 DEBUG oslo_concurrency.lockutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.016 221324 INFO nova.compute.manager [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Attaching volume 881e1975-bbbe-4bd4-b026-dc1919960757 to /dev/vdb#033[00m
Jan 31 04:02:42 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[301079]: [NOTICE]   (301083) : New worker (301085) forked
Jan 31 04:02:42 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[301079]: [NOTICE]   (301083) : Loading success.
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.120 221324 DEBUG nova.compute.manager [req-3cf2835e-6cdf-4c52-a8dc-441c81de40b9 req-72f4de16-57e1-44aa-8a95-82e537b4ad86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received event network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.121 221324 DEBUG oslo_concurrency.lockutils [req-3cf2835e-6cdf-4c52-a8dc-441c81de40b9 req-72f4de16-57e1-44aa-8a95-82e537b4ad86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.121 221324 DEBUG oslo_concurrency.lockutils [req-3cf2835e-6cdf-4c52-a8dc-441c81de40b9 req-72f4de16-57e1-44aa-8a95-82e537b4ad86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.122 221324 DEBUG oslo_concurrency.lockutils [req-3cf2835e-6cdf-4c52-a8dc-441c81de40b9 req-72f4de16-57e1-44aa-8a95-82e537b4ad86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.122 221324 DEBUG nova.compute.manager [req-3cf2835e-6cdf-4c52-a8dc-441c81de40b9 req-72f4de16-57e1-44aa-8a95-82e537b4ad86 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Processing event network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.123 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.128 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850162.1281564, 8828ce70-2ab6-41e3-ad2a-51031d11bf31 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.129 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.131 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.135 221324 INFO nova.virt.libvirt.driver [-] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Instance spawned successfully.#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.135 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.153 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.157 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.172 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.173 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.174 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.174 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.175 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.175 221324 DEBUG nova.virt.libvirt.driver [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.208 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.305 221324 DEBUG os_brick.utils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.307 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.317 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.318 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0bdd47-4104-454a-a122-ce394f758e8b]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.319 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.328 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.328 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[d09a779d-00cc-48a3-b7e8-77bf76b85435]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.330 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.338 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.338 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[3886960c-c03d-4fa1-99be-b4efb8e87025]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.340 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[d9068aac-7feb-4ed4-b99e-bbc7af5a791b]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.342 221324 DEBUG oslo_concurrency.processutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.369 221324 INFO nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Took 4.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.369 221324 DEBUG nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.371 221324 DEBUG oslo_concurrency.processutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.374 221324 DEBUG os_brick.initiator.connectors.lightos [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.374 221324 DEBUG os_brick.initiator.connectors.lightos [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.374 221324 DEBUG os_brick.initiator.connectors.lightos [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.375 221324 DEBUG os_brick.utils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] <== get_connector_properties: return (68ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.375 221324 DEBUG nova.virt.block_device [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating existing volume attachment record: dc3bb77a-cc57-4f7c-aec0-f53e1801fccd _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:02:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:42.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.639 221324 INFO nova.compute.manager [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Took 20.92 seconds to build instance.#033[00m
Jan 31 04:02:42 np0005603622 nova_compute[221301]: 2026-01-31 09:02:42.891 221324 DEBUG oslo_concurrency.lockutils [None req-421b71e9-b55e-4d5d-aff8-805fc433b7b3 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:43 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:43Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:04:1c 10.100.0.10
Jan 31 04:02:43 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:43Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:04:1c 10.100.0.10
Jan 31 04:02:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:43.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:43 np0005603622 nova_compute[221301]: 2026-01-31 09:02:43.441 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:02:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3200619428' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.060 221324 DEBUG nova.objects.instance [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'flavor' on Instance uuid 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.191 221324 DEBUG nova.virt.libvirt.driver [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Attempting to attach volume 881e1975-bbbe-4bd4-b026-dc1919960757 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.199 221324 DEBUG nova.virt.libvirt.guest [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 04:02:44 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-881e1975-bbbe-4bd4-b026-dc1919960757">
Jan 31 04:02:44 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 04:02:44 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:  </auth>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:02:44 np0005603622 nova_compute[221301]:  <serial>881e1975-bbbe-4bd4-b026-dc1919960757</serial>
Jan 31 04:02:44 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:02:44 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 04:02:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:44.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.588 221324 DEBUG nova.virt.libvirt.driver [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.589 221324 DEBUG nova.virt.libvirt.driver [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.589 221324 DEBUG nova.virt.libvirt.driver [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.590 221324 DEBUG nova.virt.libvirt.driver [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] No VIF found with MAC fa:16:3e:f7:04:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.956 221324 DEBUG nova.compute.manager [req-67e7729e-4e3d-470d-8beb-96158c2ec1db req-f7b870e5-72e9-436c-be9e-3dff6eb72638 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received event network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.956 221324 DEBUG oslo_concurrency.lockutils [req-67e7729e-4e3d-470d-8beb-96158c2ec1db req-f7b870e5-72e9-436c-be9e-3dff6eb72638 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.957 221324 DEBUG oslo_concurrency.lockutils [req-67e7729e-4e3d-470d-8beb-96158c2ec1db req-f7b870e5-72e9-436c-be9e-3dff6eb72638 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.957 221324 DEBUG oslo_concurrency.lockutils [req-67e7729e-4e3d-470d-8beb-96158c2ec1db req-f7b870e5-72e9-436c-be9e-3dff6eb72638 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.957 221324 DEBUG nova.compute.manager [req-67e7729e-4e3d-470d-8beb-96158c2ec1db req-f7b870e5-72e9-436c-be9e-3dff6eb72638 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] No waiting events found dispatching network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:44 np0005603622 nova_compute[221301]: 2026-01-31 09:02:44.958 221324 WARNING nova.compute.manager [req-67e7729e-4e3d-470d-8beb-96158c2ec1db req-f7b870e5-72e9-436c-be9e-3dff6eb72638 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received unexpected event network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e for instance with vm_state active and task_state None.#033[00m
Jan 31 04:02:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 04:02:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:45.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 04:02:45 np0005603622 nova_compute[221301]: 2026-01-31 09:02:45.292 221324 DEBUG oslo_concurrency.lockutils [None req-804348cc-c285-45cf-b18c-0c6c8a71f8c7 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:45 np0005603622 nova_compute[221301]: 2026-01-31 09:02:45.454 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:46.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:02:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:47.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:48.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.641 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.641 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.642 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.642 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.642 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.643 221324 INFO nova.compute.manager [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Terminating instance#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.645 221324 DEBUG nova.compute.manager [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:02:48 np0005603622 kernel: tap0b483969-37 (unregistering): left promiscuous mode
Jan 31 04:02:48 np0005603622 NetworkManager[49080]: <info>  [1769850168.6836] device (tap0b483969-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:02:48 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:48Z|00944|binding|INFO|Releasing lport 0b483969-3772-416a-8ac5-352deae8b96e from this chassis (sb_readonly=0)
Jan 31 04:02:48 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:48Z|00945|binding|INFO|Setting lport 0b483969-3772-416a-8ac5-352deae8b96e down in Southbound
Jan 31 04:02:48 np0005603622 ovn_controller[130109]: 2026-01-31T09:02:48Z|00946|binding|INFO|Removing iface tap0b483969-37 ovn-installed in OVS
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.691 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:48 np0005603622 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000c5.scope: Deactivated successfully.
Jan 31 04:02:48 np0005603622 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000c5.scope: Consumed 7.413s CPU time.
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.727 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:91:28 10.100.0.3'], port_security=['fa:16:3e:2d:91:28 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8828ce70-2ab6-41e3-ad2a-51031d11bf31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b400fbac-34cb-4e36-b84a-e8d6447b9bc2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=0b483969-3772-416a-8ac5-352deae8b96e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:02:48 np0005603622 systemd-machined[190406]: Machine qemu-98-instance-000000c5 terminated.
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.728 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 0b483969-3772-416a-8ac5-352deae8b96e in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 unbound from our chassis#033[00m
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.729 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 650eb345-8346-4e8f-8e83-eeb0117654f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.731 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7c6824b6-c687-4279-b617-8190d3ddcfb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.731 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace which is not needed anymore#033[00m
Jan 31 04:02:48 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[301079]: [NOTICE]   (301083) : haproxy version is 2.8.14-c23fe91
Jan 31 04:02:48 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[301079]: [NOTICE]   (301083) : path to executable is /usr/sbin/haproxy
Jan 31 04:02:48 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[301079]: [WARNING]  (301083) : Exiting Master process...
Jan 31 04:02:48 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[301079]: [ALERT]    (301083) : Current worker (301085) exited with code 143 (Terminated)
Jan 31 04:02:48 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[301079]: [WARNING]  (301083) : All workers exited. Exiting... (0)
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.880 221324 INFO nova.virt.libvirt.driver [-] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Instance destroyed successfully.#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.880 221324 DEBUG nova.objects.instance [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'resources' on Instance uuid 8828ce70-2ab6-41e3-ad2a-51031d11bf31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:02:48 np0005603622 systemd[1]: libpod-d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa.scope: Deactivated successfully.
Jan 31 04:02:48 np0005603622 podman[301145]: 2026-01-31 09:02:48.889560093 +0000 UTC m=+0.052373973 container died d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 04:02:48 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa-userdata-shm.mount: Deactivated successfully.
Jan 31 04:02:48 np0005603622 systemd[1]: var-lib-containers-storage-overlay-1b7c91b4548673c9dcacb82ccdf6bfb535f8cb269555f174e4afe37e96d59302-merged.mount: Deactivated successfully.
Jan 31 04:02:48 np0005603622 podman[301145]: 2026-01-31 09:02:48.925524126 +0000 UTC m=+0.088338026 container cleanup d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.927 221324 DEBUG nova.virt.libvirt.vif [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:02:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1187114013',display_name='tempest-TestVolumeBootPattern-server-1187114013',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1187114013',id=197,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:02:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-r0fbqy36',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1392945362',o
wner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:02:42Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=8828ce70-2ab6-41e3-ad2a-51031d11bf31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.928 221324 DEBUG nova.network.os_vif_util [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "0b483969-3772-416a-8ac5-352deae8b96e", "address": "fa:16:3e:2d:91:28", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b483969-37", "ovs_interfaceid": "0b483969-3772-416a-8ac5-352deae8b96e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.929 221324 DEBUG nova.network.os_vif_util [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:91:28,bridge_name='br-int',has_traffic_filtering=True,id=0b483969-3772-416a-8ac5-352deae8b96e,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b483969-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.930 221324 DEBUG os_vif [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:91:28,bridge_name='br-int',has_traffic_filtering=True,id=0b483969-3772-416a-8ac5-352deae8b96e,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b483969-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.932 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:48 np0005603622 systemd[1]: libpod-conmon-d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa.scope: Deactivated successfully.
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.933 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b483969-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.935 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.936 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:48 np0005603622 nova_compute[221301]: 2026-01-31 09:02:48.940 221324 INFO os_vif [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:91:28,bridge_name='br-int',has_traffic_filtering=True,id=0b483969-3772-416a-8ac5-352deae8b96e,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b483969-37')#033[00m
Jan 31 04:02:48 np0005603622 podman[301187]: 2026-01-31 09:02:48.991950824 +0000 UTC m=+0.047916553 container remove d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.995 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[14e5ef37-b13c-466b-acf0-363ad5fcdde1]: (4, ('Sat Jan 31 09:02:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa)\nd72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa\nSat Jan 31 09:02:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (d72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa)\nd72204e9f92ccfc503d6eece96507a799e0436f194aafddcf526ca0cbd8fe5aa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.997 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[567649e7-421e-4f06-9dbc-20b4081afd10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:48 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:48.998 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:02:49 np0005603622 nova_compute[221301]: 2026-01-31 09:02:49.000 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:49 np0005603622 kernel: tap650eb345-80: left promiscuous mode
Jan 31 04:02:49 np0005603622 nova_compute[221301]: 2026-01-31 09:02:49.007 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:49.011 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[224ebba5-a426-4054-b093-74641ee81aa1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:49.028 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[50c6b0e1-26c7-4332-9133-7e11b1f2f1a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:49.030 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[067d3b2a-6b32-4aee-a6d3-cff22d2c294c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:49.044 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1149681b-2bd4-445b-b020-8f34e3178440]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 941261, 'reachable_time': 18763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301220, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:49 np0005603622 systemd[1]: run-netns-ovnmeta\x2d650eb345\x2d8346\x2d4e8f\x2d8e83\x2deeb0117654f6.mount: Deactivated successfully.
Jan 31 04:02:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:49.048 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:02:49 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:02:49.048 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[a7bbb987-5223-4917-b403-064f8e7dcfe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:02:49 np0005603622 nova_compute[221301]: 2026-01-31 09:02:49.141 221324 INFO nova.virt.libvirt.driver [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Deleting instance files /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31_del#033[00m
Jan 31 04:02:49 np0005603622 nova_compute[221301]: 2026-01-31 09:02:49.142 221324 INFO nova.virt.libvirt.driver [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Deletion of /var/lib/nova/instances/8828ce70-2ab6-41e3-ad2a-51031d11bf31_del complete#033[00m
Jan 31 04:02:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:49.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:50 np0005603622 nova_compute[221301]: 2026-01-31 09:02:50.424 221324 INFO nova.compute.manager [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Took 1.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:02:50 np0005603622 nova_compute[221301]: 2026-01-31 09:02:50.424 221324 DEBUG oslo.service.loopingcall [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:02:50 np0005603622 nova_compute[221301]: 2026-01-31 09:02:50.426 221324 DEBUG nova.compute.manager [-] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:02:50 np0005603622 nova_compute[221301]: 2026-01-31 09:02:50.426 221324 DEBUG nova.network.neutron [-] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:02:50 np0005603622 nova_compute[221301]: 2026-01-31 09:02:50.456 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:50.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:50 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:51.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:02:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:51 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:02:52 np0005603622 nova_compute[221301]: 2026-01-31 09:02:52.493 221324 DEBUG nova.network.neutron [-] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:52 np0005603622 nova_compute[221301]: 2026-01-31 09:02:52.538 221324 DEBUG nova.compute.manager [req-74c55939-c22d-49e5-9548-b47cdd9a1086 req-30dbdc1b-3ebd-4048-afc0-76939629f8db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received event network-vif-deleted-0b483969-3772-416a-8ac5-352deae8b96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:52 np0005603622 nova_compute[221301]: 2026-01-31 09:02:52.539 221324 INFO nova.compute.manager [req-74c55939-c22d-49e5-9548-b47cdd9a1086 req-30dbdc1b-3ebd-4048-afc0-76939629f8db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Neutron deleted interface 0b483969-3772-416a-8ac5-352deae8b96e; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 04:02:52 np0005603622 nova_compute[221301]: 2026-01-31 09:02:52.539 221324 DEBUG nova.network.neutron [req-74c55939-c22d-49e5-9548-b47cdd9a1086 req-30dbdc1b-3ebd-4048-afc0-76939629f8db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:02:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:02:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:52.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:02:52 np0005603622 nova_compute[221301]: 2026-01-31 09:02:52.923 221324 INFO nova.compute.manager [-] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Took 2.50 seconds to deallocate network for instance.#033[00m
Jan 31 04:02:52 np0005603622 nova_compute[221301]: 2026-01-31 09:02:52.934 221324 DEBUG nova.compute.manager [req-74c55939-c22d-49e5-9548-b47cdd9a1086 req-30dbdc1b-3ebd-4048-afc0-76939629f8db fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Detach interface failed, port_id=0b483969-3772-416a-8ac5-352deae8b96e, reason: Instance 8828ce70-2ab6-41e3-ad2a-51031d11bf31 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 04:02:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:53.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.336 221324 INFO nova.compute.manager [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Took 0.41 seconds to detach 1 volumes for instance.#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.338 221324 DEBUG nova.compute.manager [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Deleting volume: d9130530-fbc3-4fb7-957a-e45c619660f3 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.766 221324 DEBUG nova.compute.manager [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received event network-vif-unplugged-0b483969-3772-416a-8ac5-352deae8b96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.767 221324 DEBUG oslo_concurrency.lockutils [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.767 221324 DEBUG oslo_concurrency.lockutils [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.767 221324 DEBUG oslo_concurrency.lockutils [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.768 221324 DEBUG nova.compute.manager [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] No waiting events found dispatching network-vif-unplugged-0b483969-3772-416a-8ac5-352deae8b96e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.768 221324 DEBUG nova.compute.manager [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received event network-vif-unplugged-0b483969-3772-416a-8ac5-352deae8b96e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.768 221324 DEBUG nova.compute.manager [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received event network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.769 221324 DEBUG oslo_concurrency.lockutils [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.769 221324 DEBUG oslo_concurrency.lockutils [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.769 221324 DEBUG oslo_concurrency.lockutils [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.769 221324 DEBUG nova.compute.manager [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] No waiting events found dispatching network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.770 221324 WARNING nova.compute.manager [req-cbbc6bf2-6638-431c-b0b0-4af3a4baf2a2 req-20ea2a2b-c5dc-41eb-a938-63a80cc8bc23 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Received unexpected event network-vif-plugged-0b483969-3772-416a-8ac5-352deae8b96e for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.821 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.822 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:02:53 np0005603622 nova_compute[221301]: 2026-01-31 09:02:53.936 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:54 np0005603622 nova_compute[221301]: 2026-01-31 09:02:54.023 221324 DEBUG oslo_concurrency.processutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:02:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:02:54 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4153791856' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:02:54 np0005603622 nova_compute[221301]: 2026-01-31 09:02:54.534 221324 DEBUG oslo_concurrency.processutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:02:54 np0005603622 nova_compute[221301]: 2026-01-31 09:02:54.542 221324 DEBUG nova.compute.provider_tree [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:02:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:54.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:54 np0005603622 nova_compute[221301]: 2026-01-31 09:02:54.657 221324 DEBUG nova.scheduler.client.report [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:02:54 np0005603622 nova_compute[221301]: 2026-01-31 09:02:54.759 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:54 np0005603622 nova_compute[221301]: 2026-01-31 09:02:54.841 221324 INFO nova.scheduler.client.report [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Deleted allocations for instance 8828ce70-2ab6-41e3-ad2a-51031d11bf31#033[00m
Jan 31 04:02:55 np0005603622 nova_compute[221301]: 2026-01-31 09:02:55.067 221324 DEBUG oslo_concurrency.lockutils [None req-38586c82-e15e-4ef1-bc64-a79e6001414b dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "8828ce70-2ab6-41e3-ad2a-51031d11bf31" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:02:55 np0005603622 nova_compute[221301]: 2026-01-31 09:02:55.103 221324 DEBUG nova.compute.manager [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:02:55 np0005603622 nova_compute[221301]: 2026-01-31 09:02:55.104 221324 DEBUG nova.compute.manager [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing instance network info cache due to event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:02:55 np0005603622 nova_compute[221301]: 2026-01-31 09:02:55.104 221324 DEBUG oslo_concurrency.lockutils [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:02:55 np0005603622 nova_compute[221301]: 2026-01-31 09:02:55.105 221324 DEBUG oslo_concurrency.lockutils [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:02:55 np0005603622 nova_compute[221301]: 2026-01-31 09:02:55.105 221324 DEBUG nova.network.neutron [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:02:55 np0005603622 podman[301375]: 2026-01-31 09:02:55.157080088 +0000 UTC m=+0.077142886 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 04:02:55 np0005603622 podman[301376]: 2026-01-31 09:02:55.165168185 +0000 UTC m=+0.084440562 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Jan 31 04:02:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:55.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:02:55 np0005603622 nova_compute[221301]: 2026-01-31 09:02:55.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:02:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:56.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:56 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:02:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:57.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:02:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:02:58.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:02:58 np0005603622 nova_compute[221301]: 2026-01-31 09:02:58.938 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:02:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:02:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:02:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:02:59.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:00 np0005603622 nova_compute[221301]: 2026-01-31 09:03:00.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:00.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:01.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.493 221324 DEBUG nova.network.neutron [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updated VIF entry in instance network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.494 221324 DEBUG nova.network.neutron [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e393 e393: 3 total, 3 up, 3 in
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.582 221324 DEBUG oslo_concurrency.lockutils [req-af2ed64e-7544-4016-bcb8-c4e373206e8b req-3deb1c71-9d16-4546-93b5-53c0e050abcc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.814 221324 DEBUG nova.compute.manager [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.815 221324 DEBUG nova.compute.manager [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing instance network info cache due to event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.815 221324 DEBUG oslo_concurrency.lockutils [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.816 221324 DEBUG oslo_concurrency.lockutils [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:01 np0005603622 nova_compute[221301]: 2026-01-31 09:03:01.816 221324 DEBUG nova.network.neutron [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:03:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:02.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:03:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:03.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:03:03 np0005603622 nova_compute[221301]: 2026-01-31 09:03:03.877 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850168.876161, 8828ce70-2ab6-41e3-ad2a-51031d11bf31 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:03:03 np0005603622 nova_compute[221301]: 2026-01-31 09:03:03.878 221324 INFO nova.compute.manager [-] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:03:03 np0005603622 nova_compute[221301]: 2026-01-31 09:03:03.928 221324 DEBUG nova.compute.manager [None req-4d340556-8e8a-4d40-9bf0-6188362ad810 - - - - - -] [instance: 8828ce70-2ab6-41e3-ad2a-51031d11bf31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:03:03 np0005603622 nova_compute[221301]: 2026-01-31 09:03:03.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.037 221324 DEBUG nova.compute.manager [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.037 221324 DEBUG nova.compute.manager [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing instance network info cache due to event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.038 221324 DEBUG oslo_concurrency.lockutils [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.449 221324 DEBUG nova.network.neutron [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updated VIF entry in instance network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.449 221324 DEBUG nova.network.neutron [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.475 221324 DEBUG oslo_concurrency.lockutils [req-53ba20b7-357b-4b99-a663-808e8f76a612 req-fd932f12-e93d-4107-bf77-f741ce7cad31 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.475 221324 DEBUG oslo_concurrency.lockutils [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:04 np0005603622 nova_compute[221301]: 2026-01-31 09:03:04.476 221324 DEBUG nova.network.neutron [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:03:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:04.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:05.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:05 np0005603622 nova_compute[221301]: 2026-01-31 09:03:05.467 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e393 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:06.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:03:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:07.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:03:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 e394: 3 total, 3 up, 3 in
Jan 31 04:03:07 np0005603622 nova_compute[221301]: 2026-01-31 09:03:07.976 221324 DEBUG nova.network.neutron [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updated VIF entry in instance network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:03:07 np0005603622 nova_compute[221301]: 2026-01-31 09:03:07.977 221324 DEBUG nova.network.neutron [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:08 np0005603622 nova_compute[221301]: 2026-01-31 09:03:08.005 221324 DEBUG oslo_concurrency.lockutils [req-008aeff9-e5c2-473b-849b-7b0c2bd66ad6 req-9b92b8d1-1d4e-4c68-8728-70269b2f047f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:08.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:08 np0005603622 nova_compute[221301]: 2026-01-31 09:03:08.941 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:03:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:09.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:03:10 np0005603622 nova_compute[221301]: 2026-01-31 09:03:10.468 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:10.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:11.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:12.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #169. Immutable memtables: 0.
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.591888) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 107] Flushing memtable with next log file: 169
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192591988, "job": 107, "event": "flush_started", "num_memtables": 1, "num_entries": 766, "num_deletes": 253, "total_data_size": 1371209, "memory_usage": 1390520, "flush_reason": "Manual Compaction"}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 107] Level-0 flush table #170: started
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192603754, "cf_name": "default", "job": 107, "event": "table_file_creation", "file_number": 170, "file_size": 643817, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 82246, "largest_seqno": 83007, "table_properties": {"data_size": 640521, "index_size": 1139, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 9075, "raw_average_key_size": 21, "raw_value_size": 633443, "raw_average_value_size": 1480, "num_data_blocks": 49, "num_entries": 428, "num_filter_entries": 428, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850147, "oldest_key_time": 1769850147, "file_creation_time": 1769850192, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 170, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 107] Flush lasted 11910 microseconds, and 4691 cpu microseconds.
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.603807) [db/flush_job.cc:967] [default] [JOB 107] Level-0 flush table #170: 643817 bytes OK
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.603833) [db/memtable_list.cc:519] [default] Level-0 commit table #170 started
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.607523) [db/memtable_list.cc:722] [default] Level-0 commit table #170: memtable #1 done
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.607565) EVENT_LOG_v1 {"time_micros": 1769850192607539, "job": 107, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.607589) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 107] Try to delete WAL files size 1367142, prev total WAL file size 1367857, number of live WAL files 2.
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000166.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.608417) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032373536' seq:72057594037927935, type:22 .. '6D6772737461740033303039' seq:0, type:0; will stop at (end)
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 108] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 107 Base level 0, inputs: [170(628KB)], [168(13MB)]
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192608493, "job": 108, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [170], "files_L6": [168], "score": -1, "input_data_size": 14342283, "oldest_snapshot_seqno": -1}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 108] Generated table #171: 10145 keys, 10722750 bytes, temperature: kUnknown
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192693241, "cf_name": "default", "job": 108, "event": "table_file_creation", "file_number": 171, "file_size": 10722750, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10660634, "index_size": 35635, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25413, "raw_key_size": 268321, "raw_average_key_size": 26, "raw_value_size": 10486876, "raw_average_value_size": 1033, "num_data_blocks": 1343, "num_entries": 10145, "num_filter_entries": 10145, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850192, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 171, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.693682) [db/compaction/compaction_job.cc:1663] [default] [JOB 108] Compacted 1@0 + 1@6 files to L6 => 10722750 bytes
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.696149) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 168.9 rd, 126.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 13.1 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(38.9) write-amplify(16.7) OK, records in: 10653, records dropped: 508 output_compression: NoCompression
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.696246) EVENT_LOG_v1 {"time_micros": 1769850192696230, "job": 108, "event": "compaction_finished", "compaction_time_micros": 84916, "compaction_time_cpu_micros": 42758, "output_level": 6, "num_output_files": 1, "total_output_size": 10722750, "num_input_records": 10653, "num_output_records": 10145, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000170.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192696519, "job": 108, "event": "table_file_deletion", "file_number": 170}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000168.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850192699410, "job": 108, "event": "table_file_deletion", "file_number": 168}
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.608263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.699469) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.699476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.699479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.699482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:12 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:03:12.699484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:03:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:13.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:13 np0005603622 nova_compute[221301]: 2026-01-31 09:03:13.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:13 np0005603622 nova_compute[221301]: 2026-01-31 09:03:13.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:03:13 np0005603622 nova_compute[221301]: 2026-01-31 09:03:13.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:03:13 np0005603622 nova_compute[221301]: 2026-01-31 09:03:13.944 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:14.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:15.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:15 np0005603622 nova_compute[221301]: 2026-01-31 09:03:15.370 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:15 np0005603622 nova_compute[221301]: 2026-01-31 09:03:15.370 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:15 np0005603622 nova_compute[221301]: 2026-01-31 09:03:15.370 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:03:15 np0005603622 nova_compute[221301]: 2026-01-31 09:03:15.371 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:03:15 np0005603622 nova_compute[221301]: 2026-01-31 09:03:15.470 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:16 np0005603622 nova_compute[221301]: 2026-01-31 09:03:16.565 221324 DEBUG nova.compute.manager [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:16 np0005603622 nova_compute[221301]: 2026-01-31 09:03:16.565 221324 DEBUG nova.compute.manager [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing instance network info cache due to event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:03:16 np0005603622 nova_compute[221301]: 2026-01-31 09:03:16.566 221324 DEBUG oslo_concurrency.lockutils [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:16.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:17.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:18.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:18 np0005603622 nova_compute[221301]: 2026-01-31 09:03:18.948 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:03:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:19.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:03:20 np0005603622 nova_compute[221301]: 2026-01-31 09:03:20.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:20.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.027 221324 DEBUG oslo_concurrency.lockutils [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.027 221324 DEBUG oslo_concurrency.lockutils [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.028 221324 INFO nova.compute.manager [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Rebooting instance#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.062 221324 DEBUG oslo_concurrency.lockutils [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:03:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:21.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.499 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.529 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.530 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.530 221324 DEBUG oslo_concurrency.lockutils [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.531 221324 DEBUG nova.network.neutron [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.533 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.534 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.534 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.535 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.535 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.536 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.789 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.790 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.790 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.790 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:03:21 np0005603622 nova_compute[221301]: 2026-01-31 09:03:21.791 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:03:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1713911538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.251 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.337 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.337 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.338 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.492 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.493 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4099MB free_disk=20.890331268310547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.493 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:22 np0005603622 nova_compute[221301]: 2026-01-31 09:03:22.494 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:22.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.107 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.108 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.108 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.194 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:03:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:23.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:03:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4132110621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.661 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.666 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.876 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.920 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.921 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.427s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:23 np0005603622 nova_compute[221301]: 2026-01-31 09:03:23.951 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:24.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:25.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:25 np0005603622 podman[301513]: 2026-01-31 09:03:25.379294828 +0000 UTC m=+0.085880680 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 31 04:03:25 np0005603622 podman[301512]: 2026-01-31 09:03:25.403971619 +0000 UTC m=+0.110186541 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.424 221324 DEBUG nova.network.neutron [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updated VIF entry in instance network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.425 221324 DEBUG nova.network.neutron [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.476 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.480 221324 DEBUG oslo_concurrency.lockutils [req-44b8b324-01bb-416b-8606-06c3bf8df852 req-95dd4518-81c9-420e-9472-dd309ae5c422 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.481 221324 DEBUG oslo_concurrency.lockutils [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.481 221324 DEBUG nova.network.neutron [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.783 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.783 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:25 np0005603622 nova_compute[221301]: 2026-01-31 09:03:25.783 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:03:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:26.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:27.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:27 np0005603622 nova_compute[221301]: 2026-01-31 09:03:27.962 221324 DEBUG nova.network.neutron [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:27 np0005603622 nova_compute[221301]: 2026-01-31 09:03:27.993 221324 DEBUG oslo_concurrency.lockutils [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:27 np0005603622 nova_compute[221301]: 2026-01-31 09:03:27.995 221324 DEBUG nova.compute.manager [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:03:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:28.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:28 np0005603622 nova_compute[221301]: 2026-01-31 09:03:28.953 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:29.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:30 np0005603622 nova_compute[221301]: 2026-01-31 09:03:30.478 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:30.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:30.922 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:30.923 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:30.924 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:30 np0005603622 kernel: tap3154ceb6-2a (unregistering): left promiscuous mode
Jan 31 04:03:30 np0005603622 NetworkManager[49080]: <info>  [1769850210.9977] device (tap3154ceb6-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:03:30 np0005603622 nova_compute[221301]: 2026-01-31 09:03:30.998 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.004 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:31Z|00947|binding|INFO|Releasing lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb from this chassis (sb_readonly=0)
Jan 31 04:03:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:31Z|00948|binding|INFO|Setting lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb down in Southbound
Jan 31 04:03:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:31Z|00949|binding|INFO|Removing iface tap3154ceb6-2a ovn-installed in OVS
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.010 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.017 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.044 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:04:1c 10.100.0.10'], port_security=['fa:16:3e:f7:04:1c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6 daf6a038-8629-4623-8484-ce1c70772b88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.047 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f unbound from our chassis#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.051 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:03:31 np0005603622 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000c4.scope: Deactivated successfully.
Jan 31 04:03:31 np0005603622 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000c4.scope: Consumed 15.427s CPU time.
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.052 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d4c918-e215-402b-95fa-beaed97bcb1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.054 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace which is not needed anymore#033[00m
Jan 31 04:03:31 np0005603622 systemd-machined[190406]: Machine qemu-97-instance-000000c4 terminated.
Jan 31 04:03:31 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [NOTICE]   (300821) : haproxy version is 2.8.14-c23fe91
Jan 31 04:03:31 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [NOTICE]   (300821) : path to executable is /usr/sbin/haproxy
Jan 31 04:03:31 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [WARNING]  (300821) : Exiting Master process...
Jan 31 04:03:31 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [WARNING]  (300821) : Exiting Master process...
Jan 31 04:03:31 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [ALERT]    (300821) : Current worker (300823) exited with code 143 (Terminated)
Jan 31 04:03:31 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[300812]: [WARNING]  (300821) : All workers exited. Exiting... (0)
Jan 31 04:03:31 np0005603622 systemd[1]: libpod-2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933.scope: Deactivated successfully.
Jan 31 04:03:31 np0005603622 podman[301579]: 2026-01-31 09:03:31.207571654 +0000 UTC m=+0.062936626 container died 2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:03:31 np0005603622 kernel: tap3154ceb6-2a: entered promiscuous mode
Jan 31 04:03:31 np0005603622 NetworkManager[49080]: <info>  [1769850211.2252] manager: (tap3154ceb6-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/417)
Jan 31 04:03:31 np0005603622 kernel: tap3154ceb6-2a (unregistering): left promiscuous mode
Jan 31 04:03:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:31Z|00950|binding|INFO|Claiming lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for this chassis.
Jan 31 04:03:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:31Z|00951|binding|INFO|3154ceb6-2afa-43f3-bb3a-ddf285ae43eb: Claiming fa:16:3e:f7:04:1c 10.100.0.10
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.232 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:31Z|00952|binding|INFO|Releasing lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb from this chassis (sb_readonly=0)
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.243 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.245 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:04:1c 10.100.0.10'], port_security=['fa:16:3e:f7:04:1c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6 daf6a038-8629-4623-8484-ce1c70772b88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:03:31 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933-userdata-shm.mount: Deactivated successfully.
Jan 31 04:03:31 np0005603622 systemd[1]: var-lib-containers-storage-overlay-66103a895ade2f7563253136d89e67217ebbee68dd4d10b936a4e6870a9fa82e-merged.mount: Deactivated successfully.
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.260 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:04:1c 10.100.0.10'], port_security=['fa:16:3e:f7:04:1c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6 daf6a038-8629-4623-8484-ce1c70772b88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:03:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:31.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:31 np0005603622 podman[301579]: 2026-01-31 09:03:31.283661921 +0000 UTC m=+0.139026903 container cleanup 2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:03:31 np0005603622 systemd[1]: libpod-conmon-2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933.scope: Deactivated successfully.
Jan 31 04:03:31 np0005603622 podman[301620]: 2026-01-31 09:03:31.367824964 +0000 UTC m=+0.062839913 container remove 2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.372 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d0bf8f7a-e004-4e0f-8be7-8192d491d5be]: (4, ('Sat Jan 31 09:03:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933)\n2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933\nSat Jan 31 09:03:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933)\n2ea3e28c3cbbc3b0478574e36db5d5778fe3bf81d25d924d173ce1d2edf3c933\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.374 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cf13b724-119f-447b-a273-93f4c501ecca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.376 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:31 np0005603622 kernel: tap372c2b16-d0: left promiscuous mode
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.378 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.386 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.389 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0b86bd-1d90-4ae1-a958-047db511a6fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.406 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bae273bd-9739-4dee-9c8c-a6b717988b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.408 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd6ae45-a2ea-4cd0-9099-7801c9f534d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.431 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[01b202e8-0391-467d-9f0d-b64fe12505c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 939853, 'reachable_time': 15215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301639, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 systemd[1]: run-netns-ovnmeta\x2d372c2b16\x2ddcb3\x2d40d9\x2da594\x2d5fc185a4de0f.mount: Deactivated successfully.
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.437 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.438 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[41173366-9dcd-4b48-8cd6-6d6007204c05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.439 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f unbound from our chassis#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.441 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.442 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff36a5c-b900-48fc-8ded-43fa7b6418f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.443 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f unbound from our chassis#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.445 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:03:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:31.445 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7118e33e-4605-4b9d-8b94-060f53b4c82a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.529 221324 DEBUG nova.compute.manager [req-1412526a-33f0-438a-8eb3-7d827c983ecc req-2006a820-e670-4ecb-8a47-edfebd92b5d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-unplugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.530 221324 DEBUG oslo_concurrency.lockutils [req-1412526a-33f0-438a-8eb3-7d827c983ecc req-2006a820-e670-4ecb-8a47-edfebd92b5d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.530 221324 DEBUG oslo_concurrency.lockutils [req-1412526a-33f0-438a-8eb3-7d827c983ecc req-2006a820-e670-4ecb-8a47-edfebd92b5d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.530 221324 DEBUG oslo_concurrency.lockutils [req-1412526a-33f0-438a-8eb3-7d827c983ecc req-2006a820-e670-4ecb-8a47-edfebd92b5d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.531 221324 DEBUG nova.compute.manager [req-1412526a-33f0-438a-8eb3-7d827c983ecc req-2006a820-e670-4ecb-8a47-edfebd92b5d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] No waiting events found dispatching network-vif-unplugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:03:31 np0005603622 nova_compute[221301]: 2026-01-31 09:03:31.531 221324 WARNING nova.compute.manager [req-1412526a-33f0-438a-8eb3-7d827c983ecc req-2006a820-e670-4ecb-8a47-edfebd92b5d1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received unexpected event network-vif-unplugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for instance with vm_state active and task_state reboot_started.#033[00m
Jan 31 04:03:32 np0005603622 nova_compute[221301]: 2026-01-31 09:03:32.256 221324 INFO nova.virt.libvirt.driver [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance shutdown successfully.#033[00m
Jan 31 04:03:32 np0005603622 kernel: tap3154ceb6-2a: entered promiscuous mode
Jan 31 04:03:32 np0005603622 systemd-udevd[301557]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:03:32 np0005603622 NetworkManager[49080]: <info>  [1769850212.3250] manager: (tap3154ceb6-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/418)
Jan 31 04:03:32 np0005603622 nova_compute[221301]: 2026-01-31 09:03:32.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:32Z|00953|binding|INFO|Claiming lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for this chassis.
Jan 31 04:03:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:32Z|00954|binding|INFO|3154ceb6-2afa-43f3-bb3a-ddf285ae43eb: Claiming fa:16:3e:f7:04:1c 10.100.0.10
Jan 31 04:03:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:32Z|00955|binding|INFO|Setting lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb ovn-installed in OVS
Jan 31 04:03:32 np0005603622 NetworkManager[49080]: <info>  [1769850212.3396] device (tap3154ceb6-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:03:32 np0005603622 nova_compute[221301]: 2026-01-31 09:03:32.340 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:32 np0005603622 NetworkManager[49080]: <info>  [1769850212.3409] device (tap3154ceb6-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:03:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:32Z|00956|binding|INFO|Setting lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb up in Southbound
Jan 31 04:03:32 np0005603622 nova_compute[221301]: 2026-01-31 09:03:32.343 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.346 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:04:1c 10.100.0.10'], port_security=['fa:16:3e:f7:04:1c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6 daf6a038-8629-4623-8484-ce1c70772b88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.348 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f bound to our chassis#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.351 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f#033[00m
Jan 31 04:03:32 np0005603622 systemd-machined[190406]: New machine qemu-99-instance-000000c4.
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.368 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[62f2b6ff-a795-487a-8ebc-a4069703091e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.369 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap372c2b16-d1 in ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.371 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap372c2b16-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.371 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6592b342-fe19-4229-8e16-dd896746f277]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.372 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a780f5ee-b2e2-4d3c-bc3d-6f07866b4c92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 systemd[1]: Started Virtual Machine qemu-99-instance-000000c4.
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.384 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7e68d4-12a8-4703-85d8-acacd1c33980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.399 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb79728-1730-4dc6-b708-04e8df7f8cb2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.431 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e4d3a8-14c0-4e47-aec0-fac2c1b0fb01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.435 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c7290e26-1636-493d-b908-f020c382d8db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 NetworkManager[49080]: <info>  [1769850212.4373] manager: (tap372c2b16-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/419)
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.463 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f28a621f-4654-4b31-9bec-42a326a97f65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.466 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[80d52825-f5a1-4f21-9b3d-cdf77dffe6a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 NetworkManager[49080]: <info>  [1769850212.4844] device (tap372c2b16-d0): carrier: link connected
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.491 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0a234262-5f99-430f-9770-8a440384837a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.506 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9798b1ba-23d7-4638-a35b-bef65f948847]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 287], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 946382, 'reachable_time': 29913, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301683, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.525 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[059910b0-81b8-4b0c-b922-98a54cce6735]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:3847'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 946382, 'tstamp': 946382}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 301684, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.540 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a1d31b-d3d5-4b5c-92d5-c69489c86077]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap372c2b16-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:38:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 287], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 946382, 'reachable_time': 29913, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 301685, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.567 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbc3b98-d436-4d8b-b951-08272a794b38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.611 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea7e99f-4ec9-4368-bcfd-ecc78983d1f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.612 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.613 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.613 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap372c2b16-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:32 np0005603622 kernel: tap372c2b16-d0: entered promiscuous mode
Jan 31 04:03:32 np0005603622 NetworkManager[49080]: <info>  [1769850212.6162] manager: (tap372c2b16-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.617 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap372c2b16-d0, col_values=(('external_ids', {'iface-id': 'a077e97d-4d5f-408b-a598-1a37aa5f7556'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:03:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:32 np0005603622 nova_compute[221301]: 2026-01-31 09:03:32.617 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:32.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:32Z|00957|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.620 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.621 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4ca5da-4cb2-446f-a16a-241fb605adf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.621 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/372c2b16-dcb3-40d9-a594-5fc185a4de0f.pid.haproxy
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 372c2b16-dcb3-40d9-a594-5fc185a4de0f
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:03:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:32.622 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'env', 'PROCESS_TAG=haproxy-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/372c2b16-dcb3-40d9-a594-5fc185a4de0f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:03:32 np0005603622 nova_compute[221301]: 2026-01-31 09:03:32.627 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:32 np0005603622 podman[301717]: 2026-01-31 09:03:32.953078953 +0000 UTC m=+0.049497326 container create b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:03:32 np0005603622 systemd[1]: Started libpod-conmon-b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8.scope.
Jan 31 04:03:33 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:03:33 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bca3c81a2fd506142a53e084a21e25512dd6bc041a1f307cce8f97b9d0ad12f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:03:33 np0005603622 podman[301717]: 2026-01-31 09:03:32.927272222 +0000 UTC m=+0.023690635 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:03:33 np0005603622 podman[301717]: 2026-01-31 09:03:33.029586261 +0000 UTC m=+0.126004644 container init b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:03:33 np0005603622 podman[301717]: 2026-01-31 09:03:33.036031763 +0000 UTC m=+0.132450126 container start b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:03:33 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[301732]: [NOTICE]   (301736) : New worker (301738) forked
Jan 31 04:03:33 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[301732]: [NOTICE]   (301736) : Loading success.
Jan 31 04:03:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:33.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.652 221324 DEBUG nova.virt.libvirt.host [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Removed pending event for 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.653 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850213.6515732, 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.653 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] VM Resumed (Lifecycle Event)
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.659 221324 INFO nova.virt.libvirt.driver [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance running successfully.
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.660 221324 INFO nova.virt.libvirt.driver [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance soft rebooted successfully.
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.661 221324 DEBUG nova.compute.manager [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.707 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.712 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.749 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] During sync_power_state the instance has a pending task (reboot_started). Skip.
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.750 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850213.651783, 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.751 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] VM Started (Lifecycle Event)
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.760 221324 DEBUG oslo_concurrency.lockutils [None req-c8705e60-fcdb-4ed0-b148-42705fe2b5f0 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 12.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.820 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.825 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 31 04:03:33 np0005603622 nova_compute[221301]: 2026-01-31 09:03:33.955 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:34.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:35.184 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:35.186 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 04:03:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:35.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.358 221324 DEBUG nova.compute.manager [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.358 221324 DEBUG oslo_concurrency.lockutils [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.359 221324 DEBUG oslo_concurrency.lockutils [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.359 221324 DEBUG oslo_concurrency.lockutils [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.359 221324 DEBUG nova.compute.manager [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] No waiting events found dispatching network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.360 221324 WARNING nova.compute.manager [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received unexpected event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for instance with vm_state active and task_state None.
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.360 221324 DEBUG nova.compute.manager [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.361 221324 DEBUG oslo_concurrency.lockutils [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.361 221324 DEBUG oslo_concurrency.lockutils [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.361 221324 DEBUG oslo_concurrency.lockutils [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.362 221324 DEBUG nova.compute.manager [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] No waiting events found dispatching network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.362 221324 WARNING nova.compute.manager [req-a2780688-6b66-4af4-8911-4ade6cd582f1 req-73e0dcc5-b5f6-4c8e-beb4-89ce36ea878a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received unexpected event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for instance with vm_state active and task_state None.
Jan 31 04:03:35 np0005603622 nova_compute[221301]: 2026-01-31 09:03:35.481 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:36.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:37.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:37 np0005603622 nova_compute[221301]: 2026-01-31 09:03:37.579 221324 DEBUG nova.compute.manager [req-9c2b8eb5-7822-47eb-a43c-02a6428759a2 req-b263e8e3-d13f-43b5-9cf1-644362478e7e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:03:37 np0005603622 nova_compute[221301]: 2026-01-31 09:03:37.579 221324 DEBUG oslo_concurrency.lockutils [req-9c2b8eb5-7822-47eb-a43c-02a6428759a2 req-b263e8e3-d13f-43b5-9cf1-644362478e7e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:03:37 np0005603622 nova_compute[221301]: 2026-01-31 09:03:37.579 221324 DEBUG oslo_concurrency.lockutils [req-9c2b8eb5-7822-47eb-a43c-02a6428759a2 req-b263e8e3-d13f-43b5-9cf1-644362478e7e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:03:37 np0005603622 nova_compute[221301]: 2026-01-31 09:03:37.580 221324 DEBUG oslo_concurrency.lockutils [req-9c2b8eb5-7822-47eb-a43c-02a6428759a2 req-b263e8e3-d13f-43b5-9cf1-644362478e7e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:03:37 np0005603622 nova_compute[221301]: 2026-01-31 09:03:37.580 221324 DEBUG nova.compute.manager [req-9c2b8eb5-7822-47eb-a43c-02a6428759a2 req-b263e8e3-d13f-43b5-9cf1-644362478e7e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] No waiting events found dispatching network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 31 04:03:37 np0005603622 nova_compute[221301]: 2026-01-31 09:03:37.580 221324 WARNING nova.compute.manager [req-9c2b8eb5-7822-47eb-a43c-02a6428759a2 req-b263e8e3-d13f-43b5-9cf1-644362478e7e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received unexpected event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for instance with vm_state active and task_state None.
Jan 31 04:03:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:38.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:38 np0005603622 nova_compute[221301]: 2026-01-31 09:03:38.958 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:39 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:03:39.188 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:03:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:39.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:40 np0005603622 nova_compute[221301]: 2026-01-31 09:03:40.483 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:40.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:41.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:42.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:43.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:43 np0005603622 nova_compute[221301]: 2026-01-31 09:03:43.961 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:44 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:44Z|00958|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:03:44 np0005603622 nova_compute[221301]: 2026-01-31 09:03:44.127 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:44.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:45.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:45 np0005603622 nova_compute[221301]: 2026-01-31 09:03:45.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:03:45 np0005603622 nova_compute[221301]: 2026-01-31 09:03:45.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:45 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:45Z|00959|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:03:45 np0005603622 nova_compute[221301]: 2026-01-31 09:03:45.630 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:46Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:04:1c 10.100.0.10
Jan 31 04:03:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:03:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:46.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:03:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:47.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:48.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:48 np0005603622 nova_compute[221301]: 2026-01-31 09:03:48.964 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:49.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:50 np0005603622 nova_compute[221301]: 2026-01-31 09:03:50.490 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:50 np0005603622 nova_compute[221301]: 2026-01-31 09:03:50.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:50.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:51.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:52.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:53.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:53 np0005603622 nova_compute[221301]: 2026-01-31 09:03:53.966 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:03:54 np0005603622 nova_compute[221301]: 2026-01-31 09:03:54.463 221324 DEBUG nova.compute.manager [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 31 04:03:54 np0005603622 nova_compute[221301]: 2026-01-31 09:03:54.463 221324 DEBUG nova.compute.manager [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing instance network info cache due to event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 31 04:03:54 np0005603622 nova_compute[221301]: 2026-01-31 09:03:54.463 221324 DEBUG oslo_concurrency.lockutils [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 31 04:03:54 np0005603622 nova_compute[221301]: 2026-01-31 09:03:54.464 221324 DEBUG oslo_concurrency.lockutils [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 31 04:03:54 np0005603622 nova_compute[221301]: 2026-01-31 09:03:54.464 221324 DEBUG nova.network.neutron [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 31 04:03:54 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:54Z|00960|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:03:54 np0005603622 nova_compute[221301]: 2026-01-31 09:03:54.570 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:03:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:54.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:03:54 np0005603622 ovn_controller[130109]: 2026-01-31T09:03:54Z|00961|binding|INFO|Releasing lport a077e97d-4d5f-408b-a598-1a37aa5f7556 from this chassis (sb_readonly=0)
Jan 31 04:03:54 np0005603622 nova_compute[221301]: 2026-01-31 09:03:54.673 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:55.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:55 np0005603622 nova_compute[221301]: 2026-01-31 09:03:55.493 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:56 np0005603622 podman[301810]: 2026-01-31 09:03:56.143233377 +0000 UTC m=+0.058476115 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:03:56 np0005603622 podman[301809]: 2026-01-31 09:03:56.179484438 +0000 UTC m=+0.096968056 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:03:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:03:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:56.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:57.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:57 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:03:57 np0005603622 nova_compute[221301]: 2026-01-31 09:03:57.899 221324 DEBUG nova.network.neutron [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updated VIF entry in instance network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:03:57 np0005603622 nova_compute[221301]: 2026-01-31 09:03:57.901 221324 DEBUG nova.network.neutron [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:03:57 np0005603622 nova_compute[221301]: 2026-01-31 09:03:57.944 221324 DEBUG oslo_concurrency.lockutils [req-457de33a-9fb0-478c-80c3-396241c0af9f req-972ca15b-ec90-4be6-8102-846a817729d6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:03:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:03:58.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:03:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:03:58 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:03:58 np0005603622 nova_compute[221301]: 2026-01-31 09:03:58.968 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:03:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:03:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:03:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:03:59.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:00 np0005603622 nova_compute[221301]: 2026-01-31 09:04:00.496 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:00.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:01.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:02 np0005603622 nova_compute[221301]: 2026-01-31 09:04:02.095 221324 DEBUG nova.compute.manager [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:02 np0005603622 nova_compute[221301]: 2026-01-31 09:04:02.095 221324 DEBUG nova.compute.manager [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing instance network info cache due to event network-changed-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:04:02 np0005603622 nova_compute[221301]: 2026-01-31 09:04:02.096 221324 DEBUG oslo_concurrency.lockutils [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:04:02 np0005603622 nova_compute[221301]: 2026-01-31 09:04:02.096 221324 DEBUG oslo_concurrency.lockutils [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:04:02 np0005603622 nova_compute[221301]: 2026-01-31 09:04:02.096 221324 DEBUG nova.network.neutron [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Refreshing network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:04:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:02.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:03.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:03 np0005603622 nova_compute[221301]: 2026-01-31 09:04:03.972 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:04:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:04.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:04:04 np0005603622 nova_compute[221301]: 2026-01-31 09:04:04.811 221324 DEBUG oslo_concurrency.lockutils [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:04 np0005603622 nova_compute[221301]: 2026-01-31 09:04:04.812 221324 DEBUG oslo_concurrency.lockutils [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:04 np0005603622 nova_compute[221301]: 2026-01-31 09:04:04.832 221324 INFO nova.compute.manager [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Detaching volume 881e1975-bbbe-4bd4-b026-dc1919960757#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.019 221324 DEBUG nova.network.neutron [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updated VIF entry in instance network info cache for port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.020 221324 DEBUG nova.network.neutron [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [{"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.036 221324 DEBUG oslo_concurrency.lockutils [req-3568cc95-2c8a-4b7f-8116-a681c2473efc req-0892714d-e348-424b-98d2-93d8441a9b7a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.201 221324 INFO nova.virt.block_device [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Attempting to driver detach volume 881e1975-bbbe-4bd4-b026-dc1919960757 from mountpoint /dev/vdb#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.214 221324 DEBUG nova.virt.libvirt.driver [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Attempting to detach device vdb from instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.215 221324 DEBUG nova.virt.libvirt.guest [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-881e1975-bbbe-4bd4-b026-dc1919960757">
Jan 31 04:04:05 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <serial>881e1975-bbbe-4bd4-b026-dc1919960757</serial>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:04:05 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.229 221324 INFO nova.virt.libvirt.driver [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully detached device vdb from instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 from the persistent domain config.#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.229 221324 DEBUG nova.virt.libvirt.driver [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.230 221324 DEBUG nova.virt.libvirt.guest [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-881e1975-bbbe-4bd4-b026-dc1919960757">
Jan 31 04:04:05 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <serial>881e1975-bbbe-4bd4-b026-dc1919960757</serial>
Jan 31 04:04:05 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 04:04:05 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:04:05 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 04:04:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:05.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.448 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769850245.448292, 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.450 221324 DEBUG nova.virt.libvirt.driver [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.453 221324 INFO nova.virt.libvirt.driver [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully detached device vdb from instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 from the live domain config.#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.498 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.942 221324 DEBUG nova.objects.instance [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'flavor' on Instance uuid 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:04:05 np0005603622 nova_compute[221301]: 2026-01-31 09:04:05.994 221324 DEBUG oslo_concurrency.lockutils [None req-80db54dd-0e26-4537-8e41-72168bdb08b4 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:06.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:07.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:08.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:08 np0005603622 nova_compute[221301]: 2026-01-31 09:04:08.975 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:09.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:10 np0005603622 nova_compute[221301]: 2026-01-31 09:04:10.500 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:04:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:04:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:10.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:11.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:12 np0005603622 nova_compute[221301]: 2026-01-31 09:04:12.582 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:12 np0005603622 nova_compute[221301]: 2026-01-31 09:04:12.582 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:12 np0005603622 nova_compute[221301]: 2026-01-31 09:04:12.583 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:12 np0005603622 nova_compute[221301]: 2026-01-31 09:04:12.583 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:12 np0005603622 nova_compute[221301]: 2026-01-31 09:04:12.583 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:12 np0005603622 nova_compute[221301]: 2026-01-31 09:04:12.584 221324 INFO nova.compute.manager [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Terminating instance#033[00m
Jan 31 04:04:12 np0005603622 nova_compute[221301]: 2026-01-31 09:04:12.585 221324 DEBUG nova.compute.manager [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:04:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:04:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:12.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:04:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:13.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:13 np0005603622 kernel: tap3154ceb6-2a (unregistering): left promiscuous mode
Jan 31 04:04:13 np0005603622 NetworkManager[49080]: <info>  [1769850253.3756] device (tap3154ceb6-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:04:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:13Z|00962|binding|INFO|Releasing lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb from this chassis (sb_readonly=0)
Jan 31 04:04:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:13Z|00963|binding|INFO|Setting lport 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb down in Southbound
Jan 31 04:04:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:13Z|00964|binding|INFO|Removing iface tap3154ceb6-2a ovn-installed in OVS
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.395 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.399 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:04:1c 10.100.0.10'], port_security=['fa:16:3e:f7:04:1c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d3e35e8f8144cd58baa994b70ef9668', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0f67f4e6-5dee-4537-8a43-5daf64a11ff6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bc6b884-4c13-493f-b55d-7e323b6a5c45, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.402 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb in datapath 372c2b16-dcb3-40d9-a594-5fc185a4de0f unbound from our chassis#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.406 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 372c2b16-dcb3-40d9-a594-5fc185a4de0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.408 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9700ebef-e7d3-45f8-8e5b-381506c412d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.409 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f namespace which is not needed anymore#033[00m
Jan 31 04:04:13 np0005603622 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000c4.scope: Deactivated successfully.
Jan 31 04:04:13 np0005603622 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000c4.scope: Consumed 14.593s CPU time.
Jan 31 04:04:13 np0005603622 systemd-machined[190406]: Machine qemu-99-instance-000000c4 terminated.
Jan 31 04:04:13 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[301732]: [NOTICE]   (301736) : haproxy version is 2.8.14-c23fe91
Jan 31 04:04:13 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[301732]: [NOTICE]   (301736) : path to executable is /usr/sbin/haproxy
Jan 31 04:04:13 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[301732]: [WARNING]  (301736) : Exiting Master process...
Jan 31 04:04:13 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[301732]: [ALERT]    (301736) : Current worker (301738) exited with code 143 (Terminated)
Jan 31 04:04:13 np0005603622 neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f[301732]: [WARNING]  (301736) : All workers exited. Exiting... (0)
Jan 31 04:04:13 np0005603622 systemd[1]: libpod-b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8.scope: Deactivated successfully.
Jan 31 04:04:13 np0005603622 podman[302062]: 2026-01-31 09:04:13.531979553 +0000 UTC m=+0.045183420 container died b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:04:13 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8-userdata-shm.mount: Deactivated successfully.
Jan 31 04:04:13 np0005603622 systemd[1]: var-lib-containers-storage-overlay-8bca3c81a2fd506142a53e084a21e25512dd6bc041a1f307cce8f97b9d0ad12f-merged.mount: Deactivated successfully.
Jan 31 04:04:13 np0005603622 podman[302062]: 2026-01-31 09:04:13.570482544 +0000 UTC m=+0.083686401 container cleanup b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:04:13 np0005603622 systemd[1]: libpod-conmon-b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8.scope: Deactivated successfully.
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.608 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.612 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.654 221324 INFO nova.virt.libvirt.driver [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Instance destroyed successfully.#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.655 221324 DEBUG nova.objects.instance [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lazy-loading 'resources' on Instance uuid 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.683 221324 DEBUG nova.virt.libvirt.vif [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:02:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-658263724',display_name='tempest-TestMinimumBasicScenario-server-658263724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-658263724',id=196,image_ref='4ac04f66-def8-4f84-bf0b-3cd9f30ead2e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGA+WuywXL+55/vx7uj7PTx1cz1/ItP/f4HuT/p0jrGyBXYVxdICigpiBOn5njD5HFLs9ouHXHjfSOv771OgNM07YTfPB0CvsdhhrfkbZLffGXg9UUWxiE7KGrMJRQCH+g==',key_name='tempest-TestMinimumBasicScenario-2070658277',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:02:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d3e35e8f8144cd58baa994b70ef9668',ramdisk_id='',reservation_id='r-gck4f39w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac04f66-def8-4f84-bf0b-3cd9f30ead2e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1062170996',owner_user_name='tempest-TestMinimumBasicScenario-1062170996-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:03:33Z,user_data=None,user_id='b97882caec944ca3b26c230fcebff775',uuid=83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.684 221324 DEBUG nova.network.os_vif_util [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converting VIF {"id": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "address": "fa:16:3e:f7:04:1c", "network": {"id": "372c2b16-dcb3-40d9-a594-5fc185a4de0f", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1025491448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d3e35e8f8144cd58baa994b70ef9668", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3154ceb6-2a", "ovs_interfaceid": "3154ceb6-2afa-43f3-bb3a-ddf285ae43eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.685 221324 DEBUG nova.network.os_vif_util [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:04:1c,bridge_name='br-int',has_traffic_filtering=True,id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3154ceb6-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.686 221324 DEBUG os_vif [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:04:1c,bridge_name='br-int',has_traffic_filtering=True,id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3154ceb6-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.687 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.687 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3154ceb6-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.689 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.692 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 podman[302094]: 2026-01-31 09:04:13.692907262 +0000 UTC m=+0.096818853 container remove b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.695 221324 INFO os_vif [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:04:1c,bridge_name='br-int',has_traffic_filtering=True,id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb,network=Network(372c2b16-dcb3-40d9-a594-5fc185a4de0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3154ceb6-2a')#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.697 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[932a1fed-fed7-4871-b27b-4622a126c782]: (4, ('Sat Jan 31 09:04:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8)\nb908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8\nSat Jan 31 09:04:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f (b908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8)\nb908eab7fe815a27d30fa029b5b7e2dcc740170af6ae33715aab40865d52a9c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.699 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[19334785-9899-4b91-96ab-377665e151cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.700 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap372c2b16-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:13 np0005603622 kernel: tap372c2b16-d0: left promiscuous mode
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.710 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad7f97a-627c-407d-87b5-6a1f582ea2ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:13 np0005603622 nova_compute[221301]: 2026-01-31 09:04:13.719 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.725 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[201f426f-72f4-487e-b199-c7befa0ede53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.727 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c175345a-978f-4162-9b10-aa07f4662827]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.740 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[74dcdb8e-a7a5-47c7-8156-e1c949bbd9cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 946376, 'reachable_time': 21286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302134, 'error': None, 'target': 'ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:13 np0005603622 systemd[1]: run-netns-ovnmeta\x2d372c2b16\x2ddcb3\x2d40d9\x2da594\x2d5fc185a4de0f.mount: Deactivated successfully.
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.742 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-372c2b16-dcb3-40d9-a594-5fc185a4de0f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:04:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:13.742 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[947c8565-634d-408b-bcc7-d93d7204e9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:14 np0005603622 nova_compute[221301]: 2026-01-31 09:04:14.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:14 np0005603622 nova_compute[221301]: 2026-01-31 09:04:14.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:04:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:14.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:04:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/468143087' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:04:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:04:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/468143087' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:04:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:15.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:15 np0005603622 nova_compute[221301]: 2026-01-31 09:04:15.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:15 np0005603622 nova_compute[221301]: 2026-01-31 09:04:15.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:04:15 np0005603622 nova_compute[221301]: 2026-01-31 09:04:15.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:04:15 np0005603622 nova_compute[221301]: 2026-01-31 09:04:15.425 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 04:04:15 np0005603622 nova_compute[221301]: 2026-01-31 09:04:15.426 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:04:15 np0005603622 nova_compute[221301]: 2026-01-31 09:04:15.426 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:15 np0005603622 nova_compute[221301]: 2026-01-31 09:04:15.503 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:16.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:17.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:17 np0005603622 nova_compute[221301]: 2026-01-31 09:04:17.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e395 e395: 3 total, 3 up, 3 in
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.357 221324 INFO nova.virt.libvirt.driver [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Deleting instance files /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_del#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.359 221324 INFO nova.virt.libvirt.driver [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Deletion of /var/lib/nova/instances/83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6_del complete#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.645 221324 DEBUG nova.compute.manager [req-2080cd9e-36f6-47cf-99fd-02f2b47057fd req-659e9b22-69f5-42a2-bdb8-0e9d18722bbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-unplugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.645 221324 DEBUG oslo_concurrency.lockutils [req-2080cd9e-36f6-47cf-99fd-02f2b47057fd req-659e9b22-69f5-42a2-bdb8-0e9d18722bbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.646 221324 DEBUG oslo_concurrency.lockutils [req-2080cd9e-36f6-47cf-99fd-02f2b47057fd req-659e9b22-69f5-42a2-bdb8-0e9d18722bbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.646 221324 DEBUG oslo_concurrency.lockutils [req-2080cd9e-36f6-47cf-99fd-02f2b47057fd req-659e9b22-69f5-42a2-bdb8-0e9d18722bbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.646 221324 DEBUG nova.compute.manager [req-2080cd9e-36f6-47cf-99fd-02f2b47057fd req-659e9b22-69f5-42a2-bdb8-0e9d18722bbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] No waiting events found dispatching network-vif-unplugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.646 221324 DEBUG nova.compute.manager [req-2080cd9e-36f6-47cf-99fd-02f2b47057fd req-659e9b22-69f5-42a2-bdb8-0e9d18722bbe fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-unplugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:04:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:18.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.690 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.817 221324 INFO nova.compute.manager [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Took 6.23 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.818 221324 DEBUG oslo.service.loopingcall [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.818 221324 DEBUG nova.compute.manager [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:04:18 np0005603622 nova_compute[221301]: 2026-01-31 09:04:18.819 221324 DEBUG nova.network.neutron [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:04:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:19 np0005603622 nova_compute[221301]: 2026-01-31 09:04:19.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:20 np0005603622 nova_compute[221301]: 2026-01-31 09:04:20.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:20.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:20 np0005603622 nova_compute[221301]: 2026-01-31 09:04:20.863 221324 DEBUG nova.compute.manager [req-fc56234d-e2c0-47c4-9d00-fb713bf964d8 req-ad62e20a-f15a-465e-bcb3-eba138e3366e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:20 np0005603622 nova_compute[221301]: 2026-01-31 09:04:20.863 221324 DEBUG oslo_concurrency.lockutils [req-fc56234d-e2c0-47c4-9d00-fb713bf964d8 req-ad62e20a-f15a-465e-bcb3-eba138e3366e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:20 np0005603622 nova_compute[221301]: 2026-01-31 09:04:20.864 221324 DEBUG oslo_concurrency.lockutils [req-fc56234d-e2c0-47c4-9d00-fb713bf964d8 req-ad62e20a-f15a-465e-bcb3-eba138e3366e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:20 np0005603622 nova_compute[221301]: 2026-01-31 09:04:20.864 221324 DEBUG oslo_concurrency.lockutils [req-fc56234d-e2c0-47c4-9d00-fb713bf964d8 req-ad62e20a-f15a-465e-bcb3-eba138e3366e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:20 np0005603622 nova_compute[221301]: 2026-01-31 09:04:20.864 221324 DEBUG nova.compute.manager [req-fc56234d-e2c0-47c4-9d00-fb713bf964d8 req-ad62e20a-f15a-465e-bcb3-eba138e3366e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] No waiting events found dispatching network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:04:20 np0005603622 nova_compute[221301]: 2026-01-31 09:04:20.864 221324 WARNING nova.compute.manager [req-fc56234d-e2c0-47c4-9d00-fb713bf964d8 req-ad62e20a-f15a-465e-bcb3-eba138e3366e fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received unexpected event network-vif-plugged-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:04:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:21.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e396 e396: 3 total, 3 up, 3 in
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.446 221324 DEBUG nova.network.neutron [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.731 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.731 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.732 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.732 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:04:21 np0005603622 nova_compute[221301]: 2026-01-31 09:04:21.733 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:04:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/719474154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.190 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.235 221324 DEBUG nova.compute.manager [req-8bed9258-2d30-4494-8bca-af43c228bf2c req-5952559f-0826-46c6-a7d8-23a6aab5063a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Received event network-vif-deleted-3154ceb6-2afa-43f3-bb3a-ddf285ae43eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.235 221324 INFO nova.compute.manager [req-8bed9258-2d30-4494-8bca-af43c228bf2c req-5952559f-0826-46c6-a7d8-23a6aab5063a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Neutron deleted interface 3154ceb6-2afa-43f3-bb3a-ddf285ae43eb; detaching it from the instance and deleting it from the info cache#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.235 221324 DEBUG nova.network.neutron [req-8bed9258-2d30-4494-8bca-af43c228bf2c req-5952559f-0826-46c6-a7d8-23a6aab5063a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.254 221324 INFO nova.compute.manager [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Took 3.44 seconds to deallocate network for instance.#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.276 221324 DEBUG nova.compute.manager [req-8bed9258-2d30-4494-8bca-af43c228bf2c req-5952559f-0826-46c6-a7d8-23a6aab5063a fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Detach interface failed, port_id=3154ceb6-2afa-43f3-bb3a-ddf285ae43eb, reason: Instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.338 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.339 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4289MB free_disk=20.97597885131836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.340 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.340 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.355 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.630 221324 WARNING nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.630 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:04:22 np0005603622 nova_compute[221301]: 2026-01-31 09:04:22.630 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:04:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:22.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.128 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:23.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:04:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3658128806' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.579 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.583 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.649 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.693 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.695 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.696 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.696 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.700 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.800 221324 INFO nova.scheduler.client.report [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Deleted allocations for instance 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6#033[00m
Jan 31 04:04:23 np0005603622 nova_compute[221301]: 2026-01-31 09:04:23.990 221324 DEBUG oslo_concurrency.lockutils [None req-68293d53-e61d-48c3-a8d5-d6b2fc43b925 b97882caec944ca3b26c230fcebff775 3d3e35e8f8144cd58baa994b70ef9668 - - default default] Lock "83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:24.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:25.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:25 np0005603622 nova_compute[221301]: 2026-01-31 09:04:25.507 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e397 e397: 3 total, 3 up, 3 in
Jan 31 04:04:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:26.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:27 np0005603622 podman[302186]: 2026-01-31 09:04:27.121957623 +0000 UTC m=+0.051475049 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 31 04:04:27 np0005603622 podman[302185]: 2026-01-31 09:04:27.148070963 +0000 UTC m=+0.075719399 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 31 04:04:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:04:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:27.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:04:27 np0005603622 nova_compute[221301]: 2026-01-31 09:04:27.696 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:27 np0005603622 nova_compute[221301]: 2026-01-31 09:04:27.697 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:04:28 np0005603622 nova_compute[221301]: 2026-01-31 09:04:28.619 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850253.6188948, 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:04:28 np0005603622 nova_compute[221301]: 2026-01-31 09:04:28.620 221324 INFO nova.compute.manager [-] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:04:28 np0005603622 nova_compute[221301]: 2026-01-31 09:04:28.695 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:28.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:28 np0005603622 nova_compute[221301]: 2026-01-31 09:04:28.710 221324 DEBUG nova.compute.manager [None req-3bdd0cd9-518e-41b3-841c-72bebabc0337 - - - - - -] [instance: 83cae6c9-b20c-4ec6-9f1c-5cf06dabdde6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:04:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:29.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:30 np0005603622 nova_compute[221301]: 2026-01-31 09:04:30.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:30.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:30.923 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:30.923 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:30.923 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:31.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e398 e398: 3 total, 3 up, 3 in
Jan 31 04:04:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:32.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:33.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:33 np0005603622 nova_compute[221301]: 2026-01-31 09:04:33.697 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:34.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:35.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:35 np0005603622 nova_compute[221301]: 2026-01-31 09:04:35.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:36.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:04:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:37.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:04:38 np0005603622 nova_compute[221301]: 2026-01-31 09:04:38.700 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:38.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:38 np0005603622 nova_compute[221301]: 2026-01-31 09:04:38.835 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:39.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:39 np0005603622 nova_compute[221301]: 2026-01-31 09:04:39.981 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:39 np0005603622 nova_compute[221301]: 2026-01-31 09:04:39.982 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.013 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.120 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.121 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.132 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.133 221324 INFO nova.compute.claims [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.310 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.515 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:40.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:04:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3074120452' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.781 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.786 221324 DEBUG nova.compute.provider_tree [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.811 221324 DEBUG nova.scheduler.client.report [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.860 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.862 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.941 221324 INFO nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.946 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.947 221324 DEBUG nova.network.neutron [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:04:40 np0005603622 nova_compute[221301]: 2026-01-31 09:04:40.990 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:04:41 np0005603622 nova_compute[221301]: 2026-01-31 09:04:41.128 221324 INFO nova.virt.block_device [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Booting with volume snapshot a5bb09a5-5ee5-473c-a7c9-bc49c24b7332 at /dev/vda#033[00m
Jan 31 04:04:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:41.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:41 np0005603622 nova_compute[221301]: 2026-01-31 09:04:41.449 221324 DEBUG nova.policy [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc42b92a5dd34d32b6b184bdc7acb092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76ce367a834b49dfb5b436848118b860', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:04:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:42.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:43 np0005603622 nova_compute[221301]: 2026-01-31 09:04:43.018 221324 DEBUG nova.network.neutron [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Successfully created port: 39773c6d-4081-4edd-80c4-97fa7fa170bf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:04:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:43.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:43 np0005603622 nova_compute[221301]: 2026-01-31 09:04:43.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:44.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.200 221324 DEBUG nova.network.neutron [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Successfully updated port: 39773c6d-4081-4edd-80c4-97fa7fa170bf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.248 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.248 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquired lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.249 221324 DEBUG nova.network.neutron [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:04:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:45.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.473 221324 DEBUG nova.compute.manager [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-changed-39773c6d-4081-4edd-80c4-97fa7fa170bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.474 221324 DEBUG nova.compute.manager [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Refreshing instance network info cache due to event network-changed-39773c6d-4081-4edd-80c4-97fa7fa170bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.474 221324 DEBUG oslo_concurrency.lockutils [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.516 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:45 np0005603622 nova_compute[221301]: 2026-01-31 09:04:45.626 221324 DEBUG nova.network.neutron [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:04:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.708 221324 DEBUG os_brick.utils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.711 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.726 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.727 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[88f242ef-2c7b-4c70-9a25-3ac616a3773a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.729 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:46.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.738 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.739 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[263cebc6-d9d4-414c-a360-b5a6c4bc6824]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.741 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.749 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.750 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[cac8d289-be13-4ed9-b8a9-48921852ac45]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.752 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[cddc4129-fbb4-47e0-a007-d670c4e0e777]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.752 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.778 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.780 221324 DEBUG os_brick.initiator.connectors.lightos [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.781 221324 DEBUG os_brick.initiator.connectors.lightos [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.781 221324 DEBUG os_brick.initiator.connectors.lightos [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.781 221324 DEBUG os_brick.utils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] <== get_connector_properties: return (71ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:04:46 np0005603622 nova_compute[221301]: 2026-01-31 09:04:46.781 221324 DEBUG nova.virt.block_device [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updating existing volume attachment record: bb9add01-1136-49a5-ade8-ee21559c0532 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:04:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:47.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:47 np0005603622 nova_compute[221301]: 2026-01-31 09:04:47.377 221324 DEBUG nova.network.neutron [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updating instance_info_cache with network_info: [{"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:04:47 np0005603622 nova_compute[221301]: 2026-01-31 09:04:47.449 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Releasing lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:04:47 np0005603622 nova_compute[221301]: 2026-01-31 09:04:47.450 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Instance network_info: |[{"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:04:47 np0005603622 nova_compute[221301]: 2026-01-31 09:04:47.450 221324 DEBUG oslo_concurrency.lockutils [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:04:47 np0005603622 nova_compute[221301]: 2026-01-31 09:04:47.451 221324 DEBUG nova.network.neutron [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Refreshing network info cache for port 39773c6d-4081-4edd-80c4-97fa7fa170bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.705 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:04:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:48.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.945 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.948 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.948 221324 INFO nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Creating image(s)#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.950 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.951 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Ensure instance console log exists: /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.951 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.952 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.952 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.956 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Start _get_guest_xml network_info=[{"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='d41d8cd98f00b204e9800998ecf8427e',container_format='bare',created_at=2026-01-31T09:04:17Z,direct_url=<?>,disk_format='qcow2',id=7ac3b86b-56fa-425d-911b-e0b1bc3c9b64,min_disk=1,min_ram=0,name='tempest-TestVolumeBootPatternsnapshot-724948954',owner='76ce367a834b49dfb5b436848118b860',properties=ImageMetaProps,protected=<?>,size=0,status='active',tags=<?>,updated_at=2026-01-31T09:04:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'boot_index': 0, 'attachment_id': 'bb9add01-1136-49a5-ade8-ee21559c0532', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-5fdc5138-2006-48ee-b966-42f286bea7e1', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '5fdc5138-2006-48ee-b966-42f286bea7e1', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '69fccdf1-17df-4c1f-8408-5f8b451e95e9', 'attached_at': '', 'detached_at': '', 'volume_id': '5fdc5138-2006-48ee-b966-42f286bea7e1', 'serial': '5fdc5138-2006-48ee-b966-42f286bea7e1'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.962 221324 WARNING nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.967 221324 DEBUG nova.virt.libvirt.host [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.968 221324 DEBUG nova.virt.libvirt.host [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.972 221324 DEBUG nova.virt.libvirt.host [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.972 221324 DEBUG nova.virt.libvirt.host [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.974 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.974 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='d41d8cd98f00b204e9800998ecf8427e',container_format='bare',created_at=2026-01-31T09:04:17Z,direct_url=<?>,disk_format='qcow2',id=7ac3b86b-56fa-425d-911b-e0b1bc3c9b64,min_disk=1,min_ram=0,name='tempest-TestVolumeBootPatternsnapshot-724948954',owner='76ce367a834b49dfb5b436848118b860',properties=ImageMetaProps,protected=<?>,size=0,status='active',tags=<?>,updated_at=2026-01-31T09:04:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.975 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.975 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.975 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.976 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.976 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.976 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.976 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.977 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.977 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:04:48 np0005603622 nova_compute[221301]: 2026-01-31 09:04:48.977 221324 DEBUG nova.virt.hardware [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.006 221324 DEBUG nova.storage.rbd_utils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 69fccdf1-17df-4c1f-8408-5f8b451e95e9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.010 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:49.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:04:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1930839327' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.425 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.467 221324 DEBUG nova.virt.libvirt.vif [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-image-snapshot-server-1233723673',display_name='tempest-TestVolumeBootPattern-image-snapshot-server-1233723673',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-image-snapshot-server-1233723673',id=201,image_ref='7ac3b86b-56fa-425d-911b-e0b1bc3c9b64',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPCtb7nQ9OPuSr7GQ0CfT7KPntrVfSKZWei3LEpyV1VbDhH3ugPVM/fSAF99q9l2asb44AlWBS0SwKuq2lOIit92sRzMVHs32u6ZuHLCjjWueCcvohT/FuGDaFWnQyQ1mw==',key_name='tempest-keypair-1588807957',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-kf8beh9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_bdm_v2='True',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_project_name='tempest-TestVolumeBootPattern-1392945362',image_owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member',image_root_device_name='/dev/vda',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:04:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=69fccdf1-17df-4c1f-8408-5f8b451e95e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building'
) vif={"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.468 221324 DEBUG nova.network.os_vif_util [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.470 221324 DEBUG nova.network.os_vif_util [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:64:42,bridge_name='br-int',has_traffic_filtering=True,id=39773c6d-4081-4edd-80c4-97fa7fa170bf,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39773c6d-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.473 221324 DEBUG nova.objects.instance [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'pci_devices' on Instance uuid 69fccdf1-17df-4c1f-8408-5f8b451e95e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.506 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <uuid>69fccdf1-17df-4c1f-8408-5f8b451e95e9</uuid>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <name>instance-000000c9</name>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestVolumeBootPattern-image-snapshot-server-1233723673</nova:name>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:04:48</nova:creationTime>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:user uuid="dc42b92a5dd34d32b6b184bdc7acb092">tempest-TestVolumeBootPattern-1392945362-project-member</nova:user>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:project uuid="76ce367a834b49dfb5b436848118b860">tempest-TestVolumeBootPattern-1392945362</nova:project>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="7ac3b86b-56fa-425d-911b-e0b1bc3c9b64"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <nova:port uuid="39773c6d-4081-4edd-80c4-97fa7fa170bf">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <entry name="serial">69fccdf1-17df-4c1f-8408-5f8b451e95e9</entry>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <entry name="uuid">69fccdf1-17df-4c1f-8408-5f8b451e95e9</entry>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/69fccdf1-17df-4c1f-8408-5f8b451e95e9_disk.config">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-5fdc5138-2006-48ee-b966-42f286bea7e1">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <serial>5fdc5138-2006-48ee-b966-42f286bea7e1</serial>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:72:64:42"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <target dev="tap39773c6d-40"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/console.log" append="off"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:04:49 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:04:49 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:04:49 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:04:49 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.508 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Preparing to wait for external event network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.508 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.509 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.509 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.510 221324 DEBUG nova.virt.libvirt.vif [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-image-snapshot-server-1233723673',display_name='tempest-TestVolumeBootPattern-image-snapshot-server-1233723673',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-image-snapshot-server-1233723673',id=201,image_ref='7ac3b86b-56fa-425d-911b-e0b1bc3c9b64',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPCtb7nQ9OPuSr7GQ0CfT7KPntrVfSKZWei3LEpyV1VbDhH3ugPVM/fSAF99q9l2asb44AlWBS0SwKuq2lOIit92sRzMVHs32u6ZuHLCjjWueCcvohT/FuGDaFWnQyQ1mw==',key_name='tempest-keypair-1588807957',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-kf8beh9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_bdm_v2='True',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_project_name='tempest-TestVolumeBootPattern-1392945362',image_owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member',image_root_device_name='/dev/vda',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:04:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=69fccdf1-17df-4c1f-8408-5f8b451e95e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.510 221324 DEBUG nova.network.os_vif_util [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.511 221324 DEBUG nova.network.os_vif_util [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:64:42,bridge_name='br-int',has_traffic_filtering=True,id=39773c6d-4081-4edd-80c4-97fa7fa170bf,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39773c6d-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.511 221324 DEBUG os_vif [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:64:42,bridge_name='br-int',has_traffic_filtering=True,id=39773c6d-4081-4edd-80c4-97fa7fa170bf,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39773c6d-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.512 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.512 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.516 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.516 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39773c6d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.517 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap39773c6d-40, col_values=(('external_ids', {'iface-id': '39773c6d-4081-4edd-80c4-97fa7fa170bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:64:42', 'vm-uuid': '69fccdf1-17df-4c1f-8408-5f8b451e95e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:49 np0005603622 NetworkManager[49080]: <info>  [1769850289.5209] manager: (tap39773c6d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/421)
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.521 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.525 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.526 221324 INFO os_vif [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:64:42,bridge_name='br-int',has_traffic_filtering=True,id=39773c6d-4081-4edd-80c4-97fa7fa170bf,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39773c6d-40')#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.603 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.603 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.604 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No VIF found with MAC fa:16:3e:72:64:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.604 221324 INFO nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Using config drive#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.636 221324 DEBUG nova.storage.rbd_utils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 69fccdf1-17df-4c1f-8408-5f8b451e95e9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.642 221324 DEBUG nova.network.neutron [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updated VIF entry in instance network info cache for port 39773c6d-4081-4edd-80c4-97fa7fa170bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.643 221324 DEBUG nova.network.neutron [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updating instance_info_cache with network_info: [{"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:04:49 np0005603622 nova_compute[221301]: 2026-01-31 09:04:49.678 221324 DEBUG oslo_concurrency.lockutils [req-a534cfd8-f09c-4280-b236-d76be1f07b7a req-cd438a71-1c08-479e-9558-2085e82a31ba fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.268 221324 INFO nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Creating config drive at /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/disk.config#033[00m
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.276 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy4qoxl_w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.406 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy4qoxl_w" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.443 221324 DEBUG nova.storage.rbd_utils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 69fccdf1-17df-4c1f-8408-5f8b451e95e9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.446 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/disk.config 69fccdf1-17df-4c1f-8408-5f8b451e95e9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.519 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:50.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.919 221324 DEBUG oslo_concurrency.processutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/disk.config 69fccdf1-17df-4c1f-8408-5f8b451e95e9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.920 221324 INFO nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Deleting local config drive /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9/disk.config because it was imported into RBD.#033[00m
Jan 31 04:04:50 np0005603622 kernel: tap39773c6d-40: entered promiscuous mode
Jan 31 04:04:50 np0005603622 NetworkManager[49080]: <info>  [1769850290.9699] manager: (tap39773c6d-40): new Tun device (/org/freedesktop/NetworkManager/Devices/422)
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.969 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:50 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:50Z|00965|binding|INFO|Claiming lport 39773c6d-4081-4edd-80c4-97fa7fa170bf for this chassis.
Jan 31 04:04:50 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:50Z|00966|binding|INFO|39773c6d-4081-4edd-80c4-97fa7fa170bf: Claiming fa:16:3e:72:64:42 10.100.0.14
Jan 31 04:04:50 np0005603622 nova_compute[221301]: 2026-01-31 09:04:50.982 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:50 np0005603622 systemd-machined[190406]: New machine qemu-100-instance-000000c9.
Jan 31 04:04:50 np0005603622 systemd-udevd[302374]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:04:51 np0005603622 systemd[1]: Started Virtual Machine qemu-100-instance-000000c9.
Jan 31 04:04:51 np0005603622 NetworkManager[49080]: <info>  [1769850291.0077] device (tap39773c6d-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:04:51 np0005603622 NetworkManager[49080]: <info>  [1769850291.0104] device (tap39773c6d-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.011 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 NetworkManager[49080]: <info>  [1769850291.0146] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/423)
Jan 31 04:04:51 np0005603622 NetworkManager[49080]: <info>  [1769850291.0177] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.016 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:64:42 10.100.0.14'], port_security=['fa:16:3e:72:64:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '69fccdf1-17df-4c1f-8408-5f8b451e95e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e67b7140-9bc8-49a3-8c13-31644608dd01', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=39773c6d-4081-4edd-80c4-97fa7fa170bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.018 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 39773c6d-4081-4edd-80c4-97fa7fa170bf in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 bound to our chassis#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.020 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 650eb345-8346-4e8f-8e83-eeb0117654f6#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.032 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d1dd6c-56a2-4fe5-8deb-9720be557d28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.033 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap650eb345-81 in ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.035 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap650eb345-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.036 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4d044beb-b21a-41d6-af8b-81156deabcf0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.036 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[07917d2c-b67a-4999-8b8f-7782ff689d65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.051 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[449473ff-4a9c-4f8b-b22d-79e3ac33c3c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.075 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1eab3d76-622e-4ba7-be24-49de89edc9d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.089 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.112 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[37baccb6-58e8-481b-b02e-1e753a6c91a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 systemd-udevd[302376]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.117 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6c182cf3-a09a-44b1-9867-ecc0db146d73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 NetworkManager[49080]: <info>  [1769850291.1202] manager: (tap650eb345-80): new Veth device (/org/freedesktop/NetworkManager/Devices/425)
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.121 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:51Z|00967|binding|INFO|Setting lport 39773c6d-4081-4edd-80c4-97fa7fa170bf ovn-installed in OVS
Jan 31 04:04:51 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:51Z|00968|binding|INFO|Setting lport 39773c6d-4081-4edd-80c4-97fa7fa170bf up in Southbound
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.126 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.156 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[4501d808-8639-416f-b4a6-559e89b197c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.160 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[491171b5-6341-4ef5-89ae-9ede244d710d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 NetworkManager[49080]: <info>  [1769850291.1820] device (tap650eb345-80): carrier: link connected
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.188 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc0ef63-4958-4243-ba70-2c0f8ac0174d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.205 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2a714724-bf66-4bd2-a581-5b114a88182c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 290], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 954252, 'reachable_time': 18696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302407, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.217 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9580a13d-eae1-4141-9353-754e05992604]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:27ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 954252, 'tstamp': 954252}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302408, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.230 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e6e3df-6c90-4d61-a8e5-3856164cd900]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 290], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 954252, 'reachable_time': 18696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302409, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.256 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2272afbd-39db-4994-99fb-42a750218477]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.296 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.298 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.311 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b2ac50-4bc3-4fd4-a328-795ccd596817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.312 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.312 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.313 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650eb345-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.315 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 kernel: tap650eb345-80: entered promiscuous mode
Jan 31 04:04:51 np0005603622 NetworkManager[49080]: <info>  [1769850291.3195] manager: (tap650eb345-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.324 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap650eb345-80, col_values=(('external_ids', {'iface-id': '74bde109-0188-4ce3-87c3-02a3eb853dc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.326 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.326 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.327 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcb48ad-6aee-4f7c-9c11-8121c772e47b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.328 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy
Jan 31 04:04:51 np0005603622 ovn_controller[130109]: 2026-01-31T09:04:51Z|00969|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.333 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'env', 'PROCESS_TAG=haproxy-650eb345-8346-4e8f-8e83-eeb0117654f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/650eb345-8346-4e8f-8e83-eeb0117654f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.334 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:51.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:51 np0005603622 podman[302475]: 2026-01-31 09:04:51.698043621 +0000 UTC m=+0.048578672 container create 4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:04:51 np0005603622 systemd[1]: Started libpod-conmon-4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3.scope.
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.733 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850291.7330616, 69fccdf1-17df-4c1f-8408-5f8b451e95e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.734 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] VM Started (Lifecycle Event)#033[00m
Jan 31 04:04:51 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:04:51 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b6b4d44e3b6395bd982cc442bb74f18f7482fae8564753f1dc8e1466ef0c28e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:04:51 np0005603622 podman[302475]: 2026-01-31 09:04:51.761683755 +0000 UTC m=+0.112218836 container init 4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 04:04:51 np0005603622 podman[302475]: 2026-01-31 09:04:51.667805942 +0000 UTC m=+0.018341003 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:04:51 np0005603622 podman[302475]: 2026-01-31 09:04:51.766047312 +0000 UTC m=+0.116582353 container start 4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.766 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.770 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850291.7333407, 69fccdf1-17df-4c1f-8408-5f8b451e95e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.770 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:04:51 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[302496]: [NOTICE]   (302500) : New worker (302502) forked
Jan 31 04:04:51 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[302496]: [NOTICE]   (302500) : Loading success.
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.801 221324 DEBUG nova.compute.manager [req-dc858037-af7b-4dff-9984-ba07cc04870a req-0093c14c-1fc0-4258-8cbc-f2898cc6d22f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.801 221324 DEBUG oslo_concurrency.lockutils [req-dc858037-af7b-4dff-9984-ba07cc04870a req-0093c14c-1fc0-4258-8cbc-f2898cc6d22f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.802 221324 DEBUG oslo_concurrency.lockutils [req-dc858037-af7b-4dff-9984-ba07cc04870a req-0093c14c-1fc0-4258-8cbc-f2898cc6d22f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.802 221324 DEBUG oslo_concurrency.lockutils [req-dc858037-af7b-4dff-9984-ba07cc04870a req-0093c14c-1fc0-4258-8cbc-f2898cc6d22f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.802 221324 DEBUG nova.compute.manager [req-dc858037-af7b-4dff-9984-ba07cc04870a req-0093c14c-1fc0-4258-8cbc-f2898cc6d22f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Processing event network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.802 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.805 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.806 221324 DEBUG nova.virt.libvirt.driver [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.808 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850291.8052297, 69fccdf1-17df-4c1f-8408-5f8b451e95e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.809 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.810 221324 INFO nova.virt.libvirt.driver [-] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Instance spawned successfully.#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.810 221324 INFO nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Took 2.86 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.811 221324 DEBUG nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:04:51 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:51.817 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.841 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.843 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.881 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.909 221324 INFO nova.compute.manager [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Took 11.83 seconds to build instance.#033[00m
Jan 31 04:04:51 np0005603622 nova_compute[221301]: 2026-01-31 09:04:51.940 221324 DEBUG oslo_concurrency.lockutils [None req-7717018a-68bb-437c-8c17-b24158e4f9dd dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:52.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:52 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:04:52.819 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:04:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:53.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:54 np0005603622 nova_compute[221301]: 2026-01-31 09:04:54.099 221324 DEBUG nova.compute.manager [req-75c88e2d-8790-4969-96d8-cacd8977d58b req-ec6cbda8-e220-47ad-a233-2b7951d47f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:54 np0005603622 nova_compute[221301]: 2026-01-31 09:04:54.100 221324 DEBUG oslo_concurrency.lockutils [req-75c88e2d-8790-4969-96d8-cacd8977d58b req-ec6cbda8-e220-47ad-a233-2b7951d47f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:04:54 np0005603622 nova_compute[221301]: 2026-01-31 09:04:54.100 221324 DEBUG oslo_concurrency.lockutils [req-75c88e2d-8790-4969-96d8-cacd8977d58b req-ec6cbda8-e220-47ad-a233-2b7951d47f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:04:54 np0005603622 nova_compute[221301]: 2026-01-31 09:04:54.101 221324 DEBUG oslo_concurrency.lockutils [req-75c88e2d-8790-4969-96d8-cacd8977d58b req-ec6cbda8-e220-47ad-a233-2b7951d47f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:04:54 np0005603622 nova_compute[221301]: 2026-01-31 09:04:54.101 221324 DEBUG nova.compute.manager [req-75c88e2d-8790-4969-96d8-cacd8977d58b req-ec6cbda8-e220-47ad-a233-2b7951d47f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] No waiting events found dispatching network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:04:54 np0005603622 nova_compute[221301]: 2026-01-31 09:04:54.102 221324 WARNING nova.compute.manager [req-75c88e2d-8790-4969-96d8-cacd8977d58b req-ec6cbda8-e220-47ad-a233-2b7951d47f7c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received unexpected event network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf for instance with vm_state active and task_state None.#033[00m
Jan 31 04:04:54 np0005603622 nova_compute[221301]: 2026-01-31 09:04:54.521 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:54.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:04:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:55.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:04:55 np0005603622 nova_compute[221301]: 2026-01-31 09:04:55.520 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:04:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:04:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:56.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:57.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:58 np0005603622 nova_compute[221301]: 2026-01-31 09:04:58.125 221324 DEBUG nova.compute.manager [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-changed-39773c6d-4081-4edd-80c4-97fa7fa170bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:04:58 np0005603622 nova_compute[221301]: 2026-01-31 09:04:58.125 221324 DEBUG nova.compute.manager [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Refreshing instance network info cache due to event network-changed-39773c6d-4081-4edd-80c4-97fa7fa170bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:04:58 np0005603622 nova_compute[221301]: 2026-01-31 09:04:58.126 221324 DEBUG oslo_concurrency.lockutils [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:04:58 np0005603622 nova_compute[221301]: 2026-01-31 09:04:58.126 221324 DEBUG oslo_concurrency.lockutils [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:04:58 np0005603622 nova_compute[221301]: 2026-01-31 09:04:58.126 221324 DEBUG nova.network.neutron [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Refreshing network info cache for port 39773c6d-4081-4edd-80c4-97fa7fa170bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:04:58 np0005603622 podman[302512]: 2026-01-31 09:04:58.129741311 +0000 UTC m=+0.054013247 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 04:04:58 np0005603622 podman[302511]: 2026-01-31 09:04:58.141143947 +0000 UTC m=+0.072891163 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 31 04:04:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:04:58.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:04:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:04:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:04:59.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:04:59 np0005603622 nova_compute[221301]: 2026-01-31 09:04:59.524 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:00 np0005603622 nova_compute[221301]: 2026-01-31 09:05:00.523 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:00.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:00 np0005603622 nova_compute[221301]: 2026-01-31 09:05:00.791 221324 DEBUG nova.network.neutron [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updated VIF entry in instance network info cache for port 39773c6d-4081-4edd-80c4-97fa7fa170bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:05:00 np0005603622 nova_compute[221301]: 2026-01-31 09:05:00.791 221324 DEBUG nova.network.neutron [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updating instance_info_cache with network_info: [{"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:00 np0005603622 nova_compute[221301]: 2026-01-31 09:05:00.838 221324 DEBUG oslo_concurrency.lockutils [req-bff7c512-b938-483d-8577-27dcd940f148 req-0835fa3d-0bbe-496c-ab06-9a2b8c3381bc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:01.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:02.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:03.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e399 e399: 3 total, 3 up, 3 in
Jan 31 04:05:03 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:03Z|00115|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.13 does not match offer 10.100.0.14
Jan 31 04:05:03 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:03Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:72:64:42 10.100.0.14
Jan 31 04:05:04 np0005603622 nova_compute[221301]: 2026-01-31 09:05:04.526 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e400 e400: 3 total, 3 up, 3 in
Jan 31 04:05:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:04.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:05.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:05 np0005603622 nova_compute[221301]: 2026-01-31 09:05:05.525 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e401 e401: 3 total, 3 up, 3 in
Jan 31 04:05:06 np0005603622 nova_compute[221301]: 2026-01-31 09:05:06.025 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:06.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e402 e402: 3 total, 3 up, 3 in
Jan 31 04:05:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:07.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:08 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:08Z|00117|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.13 does not match offer 10.100.0.14
Jan 31 04:05:08 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:08Z|00118|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:72:64:42 10.100.0.14
Jan 31 04:05:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:05:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:08.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:05:08 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:08Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:72:64:42 10.100.0.14
Jan 31 04:05:08 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:08Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:72:64:42 10.100.0.14
Jan 31 04:05:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:09.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:09 np0005603622 nova_compute[221301]: 2026-01-31 09:05:09.529 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:05:10 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3674756384' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:05:10 np0005603622 nova_compute[221301]: 2026-01-31 09:05:10.528 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:10.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:05:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:05:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:05:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:11.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:12.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e403 e403: 3 total, 3 up, 3 in
Jan 31 04:05:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:13.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:14 np0005603622 nova_compute[221301]: 2026-01-31 09:05:14.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:14 np0005603622 nova_compute[221301]: 2026-01-31 09:05:14.531 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:14.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:15.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:15 np0005603622 nova_compute[221301]: 2026-01-31 09:05:15.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:15 np0005603622 nova_compute[221301]: 2026-01-31 09:05:15.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:05:15 np0005603622 nova_compute[221301]: 2026-01-31 09:05:15.529 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:16 np0005603622 nova_compute[221301]: 2026-01-31 09:05:16.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:16 np0005603622 nova_compute[221301]: 2026-01-31 09:05:16.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:05:16 np0005603622 nova_compute[221301]: 2026-01-31 09:05:16.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:05:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:16.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:16 np0005603622 nova_compute[221301]: 2026-01-31 09:05:16.982 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:05:16 np0005603622 nova_compute[221301]: 2026-01-31 09:05:16.983 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:05:16 np0005603622 nova_compute[221301]: 2026-01-31 09:05:16.984 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:05:16 np0005603622 nova_compute[221301]: 2026-01-31 09:05:16.984 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 69fccdf1-17df-4c1f-8408-5f8b451e95e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:05:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:17.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:05:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:05:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:18.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:19.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:19 np0005603622 nova_compute[221301]: 2026-01-31 09:05:19.534 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:19 np0005603622 nova_compute[221301]: 2026-01-31 09:05:19.671 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updating instance_info_cache with network_info: [{"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:19 np0005603622 nova_compute[221301]: 2026-01-31 09:05:19.707 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-69fccdf1-17df-4c1f-8408-5f8b451e95e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:19 np0005603622 nova_compute[221301]: 2026-01-31 09:05:19.707 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:05:19 np0005603622 nova_compute[221301]: 2026-01-31 09:05:19.708 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:19 np0005603622 nova_compute[221301]: 2026-01-31 09:05:19.708 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:20 np0005603622 nova_compute[221301]: 2026-01-31 09:05:20.532 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:20.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:21.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.446 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.446 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.446 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4054261649' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:21 np0005603622 nova_compute[221301]: 2026-01-31 09:05:21.898 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.066 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.067 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.245 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.246 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4097MB free_disk=20.98794174194336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.246 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.247 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.387 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 69fccdf1-17df-4c1f-8408-5f8b451e95e9 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.388 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.388 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.452 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:22.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1185832184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.900 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.908 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:05:22 np0005603622 nova_compute[221301]: 2026-01-31 09:05:22.983 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:05:23 np0005603622 nova_compute[221301]: 2026-01-31 09:05:23.030 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:05:23 np0005603622 nova_compute[221301]: 2026-01-31 09:05:23.030 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:23.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:24 np0005603622 nova_compute[221301]: 2026-01-31 09:05:24.539 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:24.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:25.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:25 np0005603622 nova_compute[221301]: 2026-01-31 09:05:25.535 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:26.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:27.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:05:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2879014838' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:05:28 np0005603622 nova_compute[221301]: 2026-01-31 09:05:28.030 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:28 np0005603622 nova_compute[221301]: 2026-01-31 09:05:28.031 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:28 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:28Z|00970|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:05:28 np0005603622 nova_compute[221301]: 2026-01-31 09:05:28.610 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:28.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:29 np0005603622 podman[302780]: 2026-01-31 09:05:29.142110589 +0000 UTC m=+0.059087103 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:05:29 np0005603622 podman[302779]: 2026-01-31 09:05:29.1705397 +0000 UTC m=+0.089629471 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:05:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:29.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:29 np0005603622 nova_compute[221301]: 2026-01-31 09:05:29.541 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:30 np0005603622 nova_compute[221301]: 2026-01-31 09:05:30.538 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:30.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:30.924 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:30.925 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:30.926 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:05:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:31.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:05:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.615 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.616 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.617 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.618 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.618 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.621 221324 INFO nova.compute.manager [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Terminating instance#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.624 221324 DEBUG nova.compute.manager [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:05:32 np0005603622 kernel: tap39773c6d-40 (unregistering): left promiscuous mode
Jan 31 04:05:32 np0005603622 NetworkManager[49080]: <info>  [1769850332.6953] device (tap39773c6d-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.705 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:32Z|00971|binding|INFO|Releasing lport 39773c6d-4081-4edd-80c4-97fa7fa170bf from this chassis (sb_readonly=0)
Jan 31 04:05:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:32Z|00972|binding|INFO|Setting lport 39773c6d-4081-4edd-80c4-97fa7fa170bf down in Southbound
Jan 31 04:05:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:05:32Z|00973|binding|INFO|Removing iface tap39773c6d-40 ovn-installed in OVS
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.715 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:64:42 10.100.0.14'], port_security=['fa:16:3e:72:64:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '69fccdf1-17df-4c1f-8408-5f8b451e95e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e67b7140-9bc8-49a3-8c13-31644608dd01', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.211'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=39773c6d-4081-4edd-80c4-97fa7fa170bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.716 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 39773c6d-4081-4edd-80c4-97fa7fa170bf in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 unbound from our chassis#033[00m
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.718 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 650eb345-8346-4e8f-8e83-eeb0117654f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.720 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.720 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[33745dd2-c933-46d0-9bba-c7470fadeeca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.721 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace which is not needed anymore#033[00m
Jan 31 04:05:32 np0005603622 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000c9.scope: Deactivated successfully.
Jan 31 04:05:32 np0005603622 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000c9.scope: Consumed 14.525s CPU time.
Jan 31 04:05:32 np0005603622 systemd-machined[190406]: Machine qemu-100-instance-000000c9 terminated.
Jan 31 04:05:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:32.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:32 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[302496]: [NOTICE]   (302500) : haproxy version is 2.8.14-c23fe91
Jan 31 04:05:32 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[302496]: [NOTICE]   (302500) : path to executable is /usr/sbin/haproxy
Jan 31 04:05:32 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[302496]: [WARNING]  (302500) : Exiting Master process...
Jan 31 04:05:32 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[302496]: [ALERT]    (302500) : Current worker (302502) exited with code 143 (Terminated)
Jan 31 04:05:32 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[302496]: [WARNING]  (302500) : All workers exited. Exiting... (0)
Jan 31 04:05:32 np0005603622 systemd[1]: libpod-4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3.scope: Deactivated successfully.
Jan 31 04:05:32 np0005603622 podman[302851]: 2026-01-31 09:05:32.859225556 +0000 UTC m=+0.052838456 container died 4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.862 221324 INFO nova.virt.libvirt.driver [-] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Instance destroyed successfully.#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.863 221324 DEBUG nova.objects.instance [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'resources' on Instance uuid 69fccdf1-17df-4c1f-8408-5f8b451e95e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:05:32 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3-userdata-shm.mount: Deactivated successfully.
Jan 31 04:05:32 np0005603622 systemd[1]: var-lib-containers-storage-overlay-5b6b4d44e3b6395bd982cc442bb74f18f7482fae8564753f1dc8e1466ef0c28e-merged.mount: Deactivated successfully.
Jan 31 04:05:32 np0005603622 podman[302851]: 2026-01-31 09:05:32.904084767 +0000 UTC m=+0.097697677 container cleanup 4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.905 221324 DEBUG nova.virt.libvirt.vif [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:04:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-image-snapshot-server-1233723673',display_name='tempest-TestVolumeBootPattern-image-snapshot-server-1233723673',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-image-snapshot-server-1233723673',id=201,image_ref='7ac3b86b-56fa-425d-911b-e0b1bc3c9b64',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPCtb7nQ9OPuSr7GQ0CfT7KPntrVfSKZWei3LEpyV1VbDhH3ugPVM/fSAF99q9l2asb44AlWBS0SwKuq2lOIit92sRzMVHs32u6ZuHLCjjWueCcvohT/FuGDaFWnQyQ1mw==',key_name='tempest-keypair-1588807957',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:04:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-kf8beh9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_bdm_v2='True',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_project_name='tempest-TestVolumeBootPattern-1392945362',image_owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member',image_root_device_name='/dev/vda',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:04:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=69fccdf1-17df-4c1f-8408-5f8b451e95e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.906 221324 DEBUG nova.network.os_vif_util [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "address": "fa:16:3e:72:64:42", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39773c6d-40", "ovs_interfaceid": "39773c6d-4081-4edd-80c4-97fa7fa170bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.909 221324 DEBUG nova.network.os_vif_util [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:72:64:42,bridge_name='br-int',has_traffic_filtering=True,id=39773c6d-4081-4edd-80c4-97fa7fa170bf,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39773c6d-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.910 221324 DEBUG os_vif [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:64:42,bridge_name='br-int',has_traffic_filtering=True,id=39773c6d-4081-4edd-80c4-97fa7fa170bf,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39773c6d-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:05:32 np0005603622 systemd[1]: libpod-conmon-4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3.scope: Deactivated successfully.
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.915 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.916 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39773c6d-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.918 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.921 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.924 221324 INFO os_vif [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:64:42,bridge_name='br-int',has_traffic_filtering=True,id=39773c6d-4081-4edd-80c4-97fa7fa170bf,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39773c6d-40')#033[00m
Jan 31 04:05:32 np0005603622 podman[302893]: 2026-01-31 09:05:32.987933691 +0000 UTC m=+0.063126541 container remove 4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.992 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[02286abc-f6e3-4036-938b-30b66933db96]: (4, ('Sat Jan 31 09:05:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3)\n4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3\nSat Jan 31 09:05:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3)\n4b747906b755adc87663b8a0efbdcb5d89c011568b8b90823ff5003d17fbcad3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.993 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8de26c-709d-403e-a27e-ddaa8cd7ac60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:32.994 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.996 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:32 np0005603622 kernel: tap650eb345-80: left promiscuous mode
Jan 31 04:05:32 np0005603622 nova_compute[221301]: 2026-01-31 09:05:32.998 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:33 np0005603622 nova_compute[221301]: 2026-01-31 09:05:33.002 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:33.003 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6cce8c23-3aee-48ef-9a3e-96c0d363f9cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:33.018 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[58910e98-8ddb-4c87-9451-e51b234c3ec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:33.019 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d6038673-5d42-4fdf-9983-677bf47c0232]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:33.030 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5996f2b4-d73b-4dad-806a-8a826e11c322]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 954244, 'reachable_time': 19350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302927, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:33 np0005603622 systemd[1]: run-netns-ovnmeta\x2d650eb345\x2d8346\x2d4e8f\x2d8e83\x2deeb0117654f6.mount: Deactivated successfully.
Jan 31 04:05:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:33.034 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:05:33 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:33.034 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[ef3f9588-fb72-433c-9a91-d207a7778ba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:05:33 np0005603622 nova_compute[221301]: 2026-01-31 09:05:33.178 221324 INFO nova.virt.libvirt.driver [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Deleting instance files /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9_del#033[00m
Jan 31 04:05:33 np0005603622 nova_compute[221301]: 2026-01-31 09:05:33.180 221324 INFO nova.virt.libvirt.driver [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Deletion of /var/lib/nova/instances/69fccdf1-17df-4c1f-8408-5f8b451e95e9_del complete#033[00m
Jan 31 04:05:33 np0005603622 nova_compute[221301]: 2026-01-31 09:05:33.254 221324 INFO nova.compute.manager [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Took 0.63 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:05:33 np0005603622 nova_compute[221301]: 2026-01-31 09:05:33.256 221324 DEBUG oslo.service.loopingcall [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:05:33 np0005603622 nova_compute[221301]: 2026-01-31 09:05:33.258 221324 DEBUG nova.compute.manager [-] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:05:33 np0005603622 nova_compute[221301]: 2026-01-31 09:05:33.258 221324 DEBUG nova.network.neutron [-] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:05:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:33.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:34 np0005603622 nova_compute[221301]: 2026-01-31 09:05:34.121 221324 DEBUG nova.compute.manager [req-1179de4d-b683-45b8-98d3-bfe4df68ebdd req-2a0cf53c-823a-4f43-a77e-c5972e328fb2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-vif-unplugged-39773c6d-4081-4edd-80c4-97fa7fa170bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:34 np0005603622 nova_compute[221301]: 2026-01-31 09:05:34.122 221324 DEBUG oslo_concurrency.lockutils [req-1179de4d-b683-45b8-98d3-bfe4df68ebdd req-2a0cf53c-823a-4f43-a77e-c5972e328fb2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:34 np0005603622 nova_compute[221301]: 2026-01-31 09:05:34.122 221324 DEBUG oslo_concurrency.lockutils [req-1179de4d-b683-45b8-98d3-bfe4df68ebdd req-2a0cf53c-823a-4f43-a77e-c5972e328fb2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:34 np0005603622 nova_compute[221301]: 2026-01-31 09:05:34.122 221324 DEBUG oslo_concurrency.lockutils [req-1179de4d-b683-45b8-98d3-bfe4df68ebdd req-2a0cf53c-823a-4f43-a77e-c5972e328fb2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:34 np0005603622 nova_compute[221301]: 2026-01-31 09:05:34.122 221324 DEBUG nova.compute.manager [req-1179de4d-b683-45b8-98d3-bfe4df68ebdd req-2a0cf53c-823a-4f43-a77e-c5972e328fb2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] No waiting events found dispatching network-vif-unplugged-39773c6d-4081-4edd-80c4-97fa7fa170bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:05:34 np0005603622 nova_compute[221301]: 2026-01-31 09:05:34.123 221324 DEBUG nova.compute.manager [req-1179de4d-b683-45b8-98d3-bfe4df68ebdd req-2a0cf53c-823a-4f43-a77e-c5972e328fb2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-vif-unplugged-39773c6d-4081-4edd-80c4-97fa7fa170bf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:05:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:34.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:35.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:35 np0005603622 nova_compute[221301]: 2026-01-31 09:05:35.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:35 np0005603622 nova_compute[221301]: 2026-01-31 09:05:35.702 221324 DEBUG nova.network.neutron [-] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:35 np0005603622 nova_compute[221301]: 2026-01-31 09:05:35.761 221324 INFO nova.compute.manager [-] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Took 2.50 seconds to deallocate network for instance.#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.089 221324 INFO nova.compute.manager [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Took 0.33 seconds to detach 1 volumes for instance.#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.090 221324 DEBUG nova.compute.manager [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Deleting volume: 5fdc5138-2006-48ee-b966-42f286bea7e1 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 31 04:05:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.508 221324 DEBUG nova.compute.manager [req-f974c211-b118-409e-8511-6bd733d2b818 req-3213e656-3575-4c84-80eb-5ad024913f91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.509 221324 DEBUG oslo_concurrency.lockutils [req-f974c211-b118-409e-8511-6bd733d2b818 req-3213e656-3575-4c84-80eb-5ad024913f91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.509 221324 DEBUG oslo_concurrency.lockutils [req-f974c211-b118-409e-8511-6bd733d2b818 req-3213e656-3575-4c84-80eb-5ad024913f91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.510 221324 DEBUG oslo_concurrency.lockutils [req-f974c211-b118-409e-8511-6bd733d2b818 req-3213e656-3575-4c84-80eb-5ad024913f91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.510 221324 DEBUG nova.compute.manager [req-f974c211-b118-409e-8511-6bd733d2b818 req-3213e656-3575-4c84-80eb-5ad024913f91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] No waiting events found dispatching network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.510 221324 WARNING nova.compute.manager [req-f974c211-b118-409e-8511-6bd733d2b818 req-3213e656-3575-4c84-80eb-5ad024913f91 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received unexpected event network-vif-plugged-39773c6d-4081-4edd-80c4-97fa7fa170bf for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:05:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:36.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.869 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.869 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:36 np0005603622 nova_compute[221301]: 2026-01-31 09:05:36.977 221324 DEBUG oslo_concurrency.processutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:37.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3495840186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:37 np0005603622 nova_compute[221301]: 2026-01-31 09:05:37.432 221324 DEBUG oslo_concurrency.processutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:37 np0005603622 nova_compute[221301]: 2026-01-31 09:05:37.438 221324 DEBUG nova.compute.provider_tree [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:05:37 np0005603622 nova_compute[221301]: 2026-01-31 09:05:37.474 221324 DEBUG nova.scheduler.client.report [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:05:37 np0005603622 nova_compute[221301]: 2026-01-31 09:05:37.507 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:37 np0005603622 nova_compute[221301]: 2026-01-31 09:05:37.571 221324 INFO nova.scheduler.client.report [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Deleted allocations for instance 69fccdf1-17df-4c1f-8408-5f8b451e95e9#033[00m
Jan 31 04:05:37 np0005603622 nova_compute[221301]: 2026-01-31 09:05:37.666 221324 DEBUG oslo_concurrency.lockutils [None req-2e882c11-f064-4933-95be-7bfaaba062d2 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "69fccdf1-17df-4c1f-8408-5f8b451e95e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:37 np0005603622 nova_compute[221301]: 2026-01-31 09:05:37.919 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:38 np0005603622 nova_compute[221301]: 2026-01-31 09:05:38.201 221324 DEBUG nova.compute.manager [req-7b1eb216-5c7f-4140-a8bc-f995fffffc78 req-2be26a4d-4f59-4375-87ae-887adb26b921 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Received event network-vif-deleted-39773c6d-4081-4edd-80c4-97fa7fa170bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:38.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:39.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e404 e404: 3 total, 3 up, 3 in
Jan 31 04:05:40 np0005603622 nova_compute[221301]: 2026-01-31 09:05:40.542 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:40.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:41.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:05:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:42.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:05:42 np0005603622 nova_compute[221301]: 2026-01-31 09:05:42.924 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:43.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:44.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:44 np0005603622 nova_compute[221301]: 2026-01-31 09:05:44.889 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:45 np0005603622 nova_compute[221301]: 2026-01-31 09:05:45.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:05:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:45.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:45 np0005603622 nova_compute[221301]: 2026-01-31 09:05:45.543 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:46.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:47.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:47 np0005603622 nova_compute[221301]: 2026-01-31 09:05:47.860 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850332.8585494, 69fccdf1-17df-4c1f-8408-5f8b451e95e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:05:47 np0005603622 nova_compute[221301]: 2026-01-31 09:05:47.860 221324 INFO nova.compute.manager [-] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:05:47 np0005603622 nova_compute[221301]: 2026-01-31 09:05:47.888 221324 DEBUG nova.compute.manager [None req-4ab8f848-feed-4543-8795-91bcadf4ce7b - - - - - -] [instance: 69fccdf1-17df-4c1f-8408-5f8b451e95e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:05:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e405 e405: 3 total, 3 up, 3 in
Jan 31 04:05:47 np0005603622 nova_compute[221301]: 2026-01-31 09:05:47.929 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:48.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e406 e406: 3 total, 3 up, 3 in
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #172. Immutable memtables: 0.
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.017669) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 109] Flushing memtable with next log file: 172
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349017746, "job": 109, "event": "flush_started", "num_memtables": 1, "num_entries": 1943, "num_deletes": 254, "total_data_size": 4362985, "memory_usage": 4421544, "flush_reason": "Manual Compaction"}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 109] Level-0 flush table #173: started
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349038010, "cf_name": "default", "job": 109, "event": "table_file_creation", "file_number": 173, "file_size": 2864427, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 83012, "largest_seqno": 84950, "table_properties": {"data_size": 2856398, "index_size": 4842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 17462, "raw_average_key_size": 20, "raw_value_size": 2840038, "raw_average_value_size": 3368, "num_data_blocks": 212, "num_entries": 843, "num_filter_entries": 843, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850192, "oldest_key_time": 1769850192, "file_creation_time": 1769850349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 173, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 109] Flush lasted 20382 microseconds, and 5785 cpu microseconds.
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.038059) [db/flush_job.cc:967] [default] [JOB 109] Level-0 flush table #173: 2864427 bytes OK
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.038080) [db/memtable_list.cc:519] [default] Level-0 commit table #173 started
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.045313) [db/memtable_list.cc:722] [default] Level-0 commit table #173: memtable #1 done
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.045343) EVENT_LOG_v1 {"time_micros": 1769850349045335, "job": 109, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.045366) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 109] Try to delete WAL files size 4354254, prev total WAL file size 4354254, number of live WAL files 2.
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000169.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.046049) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037323739' seq:72057594037927935, type:22 .. '7061786F730037353331' seq:0, type:0; will stop at (end)
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 110] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 109 Base level 0, inputs: [173(2797KB)], [171(10MB)]
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349046087, "job": 110, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [173], "files_L6": [171], "score": -1, "input_data_size": 13587177, "oldest_snapshot_seqno": -1}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 110] Generated table #174: 10464 keys, 11763559 bytes, temperature: kUnknown
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349130421, "cf_name": "default", "job": 110, "event": "table_file_creation", "file_number": 174, "file_size": 11763559, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11698340, "index_size": 37970, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26181, "raw_key_size": 275978, "raw_average_key_size": 26, "raw_value_size": 11517907, "raw_average_value_size": 1100, "num_data_blocks": 1436, "num_entries": 10464, "num_filter_entries": 10464, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 174, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.130752) [db/compaction/compaction_job.cc:1663] [default] [JOB 110] Compacted 1@0 + 1@6 files to L6 => 11763559 bytes
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.133185) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.9 rd, 139.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 10.2 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(8.9) write-amplify(4.1) OK, records in: 10988, records dropped: 524 output_compression: NoCompression
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.133227) EVENT_LOG_v1 {"time_micros": 1769850349133207, "job": 110, "event": "compaction_finished", "compaction_time_micros": 84430, "compaction_time_cpu_micros": 30346, "output_level": 6, "num_output_files": 1, "total_output_size": 11763559, "num_input_records": 10988, "num_output_records": 10464, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000173.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349133914, "job": 110, "event": "table_file_deletion", "file_number": 173}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000171.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850349135979, "job": 110, "event": "table_file_deletion", "file_number": 171}
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.045964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.136079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.136085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.136088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.136091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:05:49.136093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:05:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:49.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:50 np0005603622 nova_compute[221301]: 2026-01-31 09:05:50.545 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:50.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:51.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:52.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:52 np0005603622 nova_compute[221301]: 2026-01-31 09:05:52.933 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:52 np0005603622 nova_compute[221301]: 2026-01-31 09:05:52.994 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:52 np0005603622 nova_compute[221301]: 2026-01-31 09:05:52.995 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.025 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.128 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:53.128 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:05:53 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:53.129 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.143 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.144 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.151 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.152 221324 INFO nova.compute.claims [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.285 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:53.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:05:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4240616942' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.729 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.737 221324 DEBUG nova.compute.provider_tree [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.772 221324 DEBUG nova.scheduler.client.report [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.815 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.816 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.909 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.909 221324 DEBUG nova.network.neutron [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.936 221324 INFO nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:05:53 np0005603622 nova_compute[221301]: 2026-01-31 09:05:53.955 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.078 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.080 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.080 221324 INFO nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Creating image(s)#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.203 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.246 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.281 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.286 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.312 221324 DEBUG nova.policy [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd442c7ba12ed444ca6d4dcc5cfd36150', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.344 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.345 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.346 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.346 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.374 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.377 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 933259c3-a1cb-4a08-9206-6db46a39bd19_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.745 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 933259c3-a1cb-4a08-9206-6db46a39bd19_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.368s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:54.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:54 np0005603622 nova_compute[221301]: 2026-01-31 09:05:54.859 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] resizing rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.024 221324 DEBUG nova.objects.instance [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'migration_context' on Instance uuid 933259c3-a1cb-4a08-9206-6db46a39bd19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.058 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.059 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Ensure instance console log exists: /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.059 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.060 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.060 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:05:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:05:55.132 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:05:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:55.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.550 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:55 np0005603622 nova_compute[221301]: 2026-01-31 09:05:55.905 221324 DEBUG nova.network.neutron [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Successfully created port: 18a0e279-b8f4-42b8-9c2e-c82c8409abab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:05:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:05:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:56.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.166 221324 DEBUG nova.network.neutron [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Successfully updated port: 18a0e279-b8f4-42b8-9c2e-c82c8409abab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.219 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.219 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquired lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.220 221324 DEBUG nova.network.neutron [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:05:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:57.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.570 221324 DEBUG nova.network.neutron [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.768 221324 DEBUG nova.compute.manager [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-changed-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.768 221324 DEBUG nova.compute.manager [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Refreshing instance network info cache due to event network-changed-18a0e279-b8f4-42b8-9c2e-c82c8409abab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.769 221324 DEBUG oslo_concurrency.lockutils [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:05:57 np0005603622 nova_compute[221301]: 2026-01-31 09:05:57.936 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:05:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e407 e407: 3 total, 3 up, 3 in
Jan 31 04:05:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:05:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:05:58.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.292 221324 DEBUG nova.network.neutron [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updating instance_info_cache with network_info: [{"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.330 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Releasing lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.330 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Instance network_info: |[{"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.331 221324 DEBUG oslo_concurrency.lockutils [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.331 221324 DEBUG nova.network.neutron [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Refreshing network info cache for port 18a0e279-b8f4-42b8-9c2e-c82c8409abab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.337 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Start _get_guest_xml network_info=[{"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.343 221324 WARNING nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.349 221324 DEBUG nova.virt.libvirt.host [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.350 221324 DEBUG nova.virt.libvirt.host [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.357 221324 DEBUG nova.virt.libvirt.host [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.358 221324 DEBUG nova.virt.libvirt.host [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.360 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.361 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.362 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.362 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.363 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.363 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.364 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.364 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.365 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.365 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.366 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.366 221324 DEBUG nova.virt.hardware [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.372 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:05:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:05:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:05:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:05:59.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:05:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:05:59 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1656396257' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.799 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.828 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:05:59 np0005603622 nova_compute[221301]: 2026-01-31 09:05:59.832 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:00 np0005603622 podman[303201]: 2026-01-31 09:06:00.136316178 +0000 UTC m=+0.059338380 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 04:06:00 np0005603622 podman[303200]: 2026-01-31 09:06:00.166234279 +0000 UTC m=+0.088043318 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 04:06:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:06:00 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/828196942' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.277 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.280 221324 DEBUG nova.virt.libvirt.vif [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:05:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1952330094',display_name='tempest-TestNetworkBasicOps-server-1952330094',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1952330094',id=205,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCN4LKckxbk0fJFpgEsW7Ahr/av0VRwSTVlL/dEnSvPTw7QpJqoVODLyzFAIa3ET+Fky9JC46hRIgJ8C/nVvN0bptl2M6nVdsdxoqFlwbG5+E7zKzAU0FLzToaYbzWbViA==',key_name='tempest-TestNetworkBasicOps-245937908',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-jt2l8w7c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:05:53Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=933259c3-a1cb-4a08-9206-6db46a39bd19,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.280 221324 DEBUG nova.network.os_vif_util [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.282 221324 DEBUG nova.network.os_vif_util [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:ff:20,bridge_name='br-int',has_traffic_filtering=True,id=18a0e279-b8f4-42b8-9c2e-c82c8409abab,network=Network(818a5d89-9573-4880-a88d-1838e69ca745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18a0e279-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.283 221324 DEBUG nova.objects.instance [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'pci_devices' on Instance uuid 933259c3-a1cb-4a08-9206-6db46a39bd19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.432 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <uuid>933259c3-a1cb-4a08-9206-6db46a39bd19</uuid>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <name>instance-000000cd</name>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestNetworkBasicOps-server-1952330094</nova:name>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:05:59</nova:creationTime>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:user uuid="d442c7ba12ed444ca6d4dcc5cfd36150">tempest-TestNetworkBasicOps-104417095-project-member</nova:user>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:project uuid="abf9393aa2b646feb00a3d887a9dee14">tempest-TestNetworkBasicOps-104417095</nova:project>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <nova:port uuid="18a0e279-b8f4-42b8-9c2e-c82c8409abab">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <entry name="serial">933259c3-a1cb-4a08-9206-6db46a39bd19</entry>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <entry name="uuid">933259c3-a1cb-4a08-9206-6db46a39bd19</entry>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/933259c3-a1cb-4a08-9206-6db46a39bd19_disk">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/933259c3-a1cb-4a08-9206-6db46a39bd19_disk.config">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:9f:ff:20"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <target dev="tap18a0e279-b8"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/console.log" append="off"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:06:00 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:06:00 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:06:00 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:06:00 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.433 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Preparing to wait for external event network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.434 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.435 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.435 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.436 221324 DEBUG nova.virt.libvirt.vif [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:05:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1952330094',display_name='tempest-TestNetworkBasicOps-server-1952330094',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1952330094',id=205,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCN4LKckxbk0fJFpgEsW7Ahr/av0VRwSTVlL/dEnSvPTw7QpJqoVODLyzFAIa3ET+Fky9JC46hRIgJ8C/nVvN0bptl2M6nVdsdxoqFlwbG5+E7zKzAU0FLzToaYbzWbViA==',key_name='tempest-TestNetworkBasicOps-245937908',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-jt2l8w7c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:05:53Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=933259c3-a1cb-4a08-9206-6db46a39bd19,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.437 221324 DEBUG nova.network.os_vif_util [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.438 221324 DEBUG nova.network.os_vif_util [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:ff:20,bridge_name='br-int',has_traffic_filtering=True,id=18a0e279-b8f4-42b8-9c2e-c82c8409abab,network=Network(818a5d89-9573-4880-a88d-1838e69ca745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18a0e279-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.439 221324 DEBUG os_vif [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:ff:20,bridge_name='br-int',has_traffic_filtering=True,id=18a0e279-b8f4-42b8-9c2e-c82c8409abab,network=Network(818a5d89-9573-4880-a88d-1838e69ca745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18a0e279-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.440 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.441 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.446 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18a0e279-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.448 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap18a0e279-b8, col_values=(('external_ids', {'iface-id': '18a0e279-b8f4-42b8-9c2e-c82c8409abab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:ff:20', 'vm-uuid': '933259c3-a1cb-4a08-9206-6db46a39bd19'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:00 np0005603622 NetworkManager[49080]: <info>  [1769850360.4519] manager: (tap18a0e279-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/427)
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.453 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.456 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.458 221324 INFO os_vif [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:ff:20,bridge_name='br-int',has_traffic_filtering=True,id=18a0e279-b8f4-42b8-9c2e-c82c8409abab,network=Network(818a5d89-9573-4880-a88d-1838e69ca745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18a0e279-b8')#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.534 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.534 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.534 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] No VIF found with MAC fa:16:3e:9f:ff:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.535 221324 INFO nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Using config drive#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.561 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:00 np0005603622 nova_compute[221301]: 2026-01-31 09:06:00.566 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:00.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.013 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.239 221324 INFO nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Creating config drive at /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/disk.config#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.246 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxd5at09l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.371 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxd5at09l" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.414 221324 DEBUG nova.storage.rbd_utils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] rbd image 933259c3-a1cb-4a08-9206-6db46a39bd19_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.420 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/disk.config 933259c3-a1cb-4a08-9206-6db46a39bd19_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:01.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.625 221324 DEBUG oslo_concurrency.processutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/disk.config 933259c3-a1cb-4a08-9206-6db46a39bd19_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.627 221324 INFO nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Deleting local config drive /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19/disk.config because it was imported into RBD.#033[00m
Jan 31 04:06:01 np0005603622 kernel: tap18a0e279-b8: entered promiscuous mode
Jan 31 04:06:01 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:01Z|00974|binding|INFO|Claiming lport 18a0e279-b8f4-42b8-9c2e-c82c8409abab for this chassis.
Jan 31 04:06:01 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:01Z|00975|binding|INFO|18a0e279-b8f4-42b8-9c2e-c82c8409abab: Claiming fa:16:3e:9f:ff:20 10.100.0.3
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.688 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:01 np0005603622 NetworkManager[49080]: <info>  [1769850361.6902] manager: (tap18a0e279-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/428)
Jan 31 04:06:01 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:01Z|00976|binding|INFO|Setting lport 18a0e279-b8f4-42b8-9c2e-c82c8409abab ovn-installed in OVS
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.695 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.698 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:01 np0005603622 nova_compute[221301]: 2026-01-31 09:06:01.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:01 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:01Z|00977|binding|INFO|Setting lport 18a0e279-b8f4-42b8-9c2e-c82c8409abab up in Southbound
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.706 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:ff:20 10.100.0.3'], port_security=['fa:16:3e:9f:ff:20 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '933259c3-a1cb-4a08-9206-6db46a39bd19', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-818a5d89-9573-4880-a88d-1838e69ca745', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4f953884-6af6-4cb5-82b1-05b5f9c3185d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1517643-63ff-454d-bacd-f8f229c2a3eb, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=18a0e279-b8f4-42b8-9c2e-c82c8409abab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.709 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 18a0e279-b8f4-42b8-9c2e-c82c8409abab in datapath 818a5d89-9573-4880-a88d-1838e69ca745 bound to our chassis#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.712 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 818a5d89-9573-4880-a88d-1838e69ca745#033[00m
Jan 31 04:06:01 np0005603622 systemd-udevd[303320]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:06:01 np0005603622 systemd-machined[190406]: New machine qemu-101-instance-000000cd.
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.723 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a56677f3-4371-40d3-bdc2-4af73b1af9f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.724 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap818a5d89-91 in ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.726 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap818a5d89-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.726 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[72efdff4-5310-4776-901d-239ebbd8f1ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 NetworkManager[49080]: <info>  [1769850361.7279] device (tap18a0e279-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.727 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2c97f477-36f9-439c-a37a-beb3406c96c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 NetworkManager[49080]: <info>  [1769850361.7286] device (tap18a0e279-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:06:01 np0005603622 systemd[1]: Started Virtual Machine qemu-101-instance-000000cd.
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.742 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9b920be0-d264-47a1-ac49-8a77bfa8deeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.755 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[046b1f93-4827-44a2-bf1e-3e23c8c4e5b1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.790 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[22617b39-07a8-4d83-bfe6-1b1afc52f9a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 systemd-udevd[303325]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.797 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[6021f320-1ab6-4706-b37a-9e806fff9d1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 NetworkManager[49080]: <info>  [1769850361.7982] manager: (tap818a5d89-90): new Veth device (/org/freedesktop/NetworkManager/Devices/429)
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.836 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[adc79d70-b58f-4648-8d3c-fa0d4b4f17c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.840 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3159d4-d678-4157-b00c-59a97ff030ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 NetworkManager[49080]: <info>  [1769850361.8706] device (tap818a5d89-90): carrier: link connected
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.877 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[3de94a75-fdeb-4aa9-ab93-8b09391e7812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.900 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7757690e-fcfb-4029-97d7-42421097180e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap818a5d89-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:70:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 293], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 961321, 'reachable_time': 41269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303355, 'error': None, 'target': 'ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.920 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[31345ed6-cdbe-45c5-b31b-8d8295b6cbf6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:70ea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 961321, 'tstamp': 961321}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303356, 'error': None, 'target': 'ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.943 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f34dee24-097d-4cb2-ae29-4c9885a252cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap818a5d89-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:70:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 293], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 961321, 'reachable_time': 41269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 303357, 'error': None, 'target': 'ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:01.979 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e229ab94-b099-4e86-a6aa-0f92b25dd3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.029 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eeec663f-0aaa-400d-8b12-6c6cc1201716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.031 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap818a5d89-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.032 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.032 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap818a5d89-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.035 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:02 np0005603622 NetworkManager[49080]: <info>  [1769850362.0363] manager: (tap818a5d89-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Jan 31 04:06:02 np0005603622 kernel: tap818a5d89-90: entered promiscuous mode
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.039 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.039 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap818a5d89-90, col_values=(('external_ids', {'iface-id': '52425cac-5c2e-4922-bfe7-80e01f5f8bf7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:02 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:02Z|00978|binding|INFO|Releasing lport 52425cac-5c2e-4922-bfe7-80e01f5f8bf7 from this chassis (sb_readonly=0)
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.046 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.046 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/818a5d89-9573-4880-a88d-1838e69ca745.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/818a5d89-9573-4880-a88d-1838e69ca745.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.047 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9a114e7e-0f02-4eb7-829b-13b3af275402]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.049 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-818a5d89-9573-4880-a88d-1838e69ca745
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/818a5d89-9573-4880-a88d-1838e69ca745.pid.haproxy
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 818a5d89-9573-4880-a88d-1838e69ca745
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:06:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:02.050 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745', 'env', 'PROCESS_TAG=haproxy-818a5d89-9573-4880-a88d-1838e69ca745', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/818a5d89-9573-4880-a88d-1838e69ca745.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.313 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850362.3129714, 933259c3-a1cb-4a08-9206-6db46a39bd19 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.314 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] VM Started (Lifecycle Event)#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.351 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.353 221324 DEBUG nova.network.neutron [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updated VIF entry in instance network info cache for port 18a0e279-b8f4-42b8-9c2e-c82c8409abab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.354 221324 DEBUG nova.network.neutron [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updating instance_info_cache with network_info: [{"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.372 221324 DEBUG oslo_concurrency.lockutils [req-6773cce9-0218-44bc-9e13-80b234b6e21d req-7666ece4-44f0-43f3-8ac0-223b4450aac1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:02 np0005603622 podman[303429]: 2026-01-31 09:06:02.334977506 +0000 UTC m=+0.022094495 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.464 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850362.313703, 933259c3-a1cb-4a08-9206-6db46a39bd19 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.465 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.497 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.502 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.533 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.592 221324 DEBUG nova.compute.manager [req-a8c62e5a-359b-4ed5-a920-04739624241b req-3c839ae4-a179-412c-8d54-6f44e6b95fad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.594 221324 DEBUG oslo_concurrency.lockutils [req-a8c62e5a-359b-4ed5-a920-04739624241b req-3c839ae4-a179-412c-8d54-6f44e6b95fad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.595 221324 DEBUG oslo_concurrency.lockutils [req-a8c62e5a-359b-4ed5-a920-04739624241b req-3c839ae4-a179-412c-8d54-6f44e6b95fad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.595 221324 DEBUG oslo_concurrency.lockutils [req-a8c62e5a-359b-4ed5-a920-04739624241b req-3c839ae4-a179-412c-8d54-6f44e6b95fad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.596 221324 DEBUG nova.compute.manager [req-a8c62e5a-359b-4ed5-a920-04739624241b req-3c839ae4-a179-412c-8d54-6f44e6b95fad fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Processing event network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.597 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:06:02 np0005603622 podman[303429]: 2026-01-31 09:06:02.598035725 +0000 UTC m=+0.285152734 container create 43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.601 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850362.6011138, 933259c3-a1cb-4a08-9206-6db46a39bd19 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.601 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.604 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.609 221324 INFO nova.virt.libvirt.driver [-] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Instance spawned successfully.#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.609 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.641 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.646 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.655 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.656 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.656 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.657 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.658 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.659 221324 DEBUG nova.virt.libvirt.driver [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:06:02 np0005603622 systemd[1]: Started libpod-conmon-43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8.scope.
Jan 31 04:06:02 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.694 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:06:02 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48c9f85dcf07076cb444c77d69d9d98dbd2dd9ac7316a792694872ed47140ce2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:06:02 np0005603622 podman[303429]: 2026-01-31 09:06:02.718653115 +0000 UTC m=+0.405770134 container init 43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Jan 31 04:06:02 np0005603622 podman[303429]: 2026-01-31 09:06:02.722199939 +0000 UTC m=+0.409316928 container start 43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Jan 31 04:06:02 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [NOTICE]   (303448) : New worker (303450) forked
Jan 31 04:06:02 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [NOTICE]   (303448) : Loading success.
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.754 221324 INFO nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Took 8.68 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.754 221324 DEBUG nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:02.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.864 221324 INFO nova.compute.manager [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Took 9.77 seconds to build instance.#033[00m
Jan 31 04:06:02 np0005603622 nova_compute[221301]: 2026-01-31 09:06:02.887 221324 DEBUG oslo_concurrency.lockutils [None req-ce086369-ee1b-415f-93da-e3a8dcb0d15c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e408 e408: 3 total, 3 up, 3 in
Jan 31 04:06:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:03.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e409 e409: 3 total, 3 up, 3 in
Jan 31 04:06:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:06:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3775686426' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:06:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:06:04 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3775686426' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:06:04 np0005603622 nova_compute[221301]: 2026-01-31 09:06:04.793 221324 DEBUG nova.compute.manager [req-2b4727f0-7288-4dce-b891-bb5e3a906cae req-028f703e-e1d0-4376-b3a2-0e263b9d3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:04 np0005603622 nova_compute[221301]: 2026-01-31 09:06:04.793 221324 DEBUG oslo_concurrency.lockutils [req-2b4727f0-7288-4dce-b891-bb5e3a906cae req-028f703e-e1d0-4376-b3a2-0e263b9d3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:04 np0005603622 nova_compute[221301]: 2026-01-31 09:06:04.794 221324 DEBUG oslo_concurrency.lockutils [req-2b4727f0-7288-4dce-b891-bb5e3a906cae req-028f703e-e1d0-4376-b3a2-0e263b9d3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:04 np0005603622 nova_compute[221301]: 2026-01-31 09:06:04.794 221324 DEBUG oslo_concurrency.lockutils [req-2b4727f0-7288-4dce-b891-bb5e3a906cae req-028f703e-e1d0-4376-b3a2-0e263b9d3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:04 np0005603622 nova_compute[221301]: 2026-01-31 09:06:04.794 221324 DEBUG nova.compute.manager [req-2b4727f0-7288-4dce-b891-bb5e3a906cae req-028f703e-e1d0-4376-b3a2-0e263b9d3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] No waiting events found dispatching network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:06:04 np0005603622 nova_compute[221301]: 2026-01-31 09:06:04.794 221324 WARNING nova.compute.manager [req-2b4727f0-7288-4dce-b891-bb5e3a906cae req-028f703e-e1d0-4376-b3a2-0e263b9d3bf9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received unexpected event network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab for instance with vm_state active and task_state None.#033[00m
Jan 31 04:06:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:04.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:05.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:05 np0005603622 nova_compute[221301]: 2026-01-31 09:06:05.452 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:05 np0005603622 nova_compute[221301]: 2026-01-31 09:06:05.554 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:06.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:07.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:07 np0005603622 nova_compute[221301]: 2026-01-31 09:06:07.525 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:08.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:09.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:09 np0005603622 nova_compute[221301]: 2026-01-31 09:06:09.447 221324 DEBUG nova.compute.manager [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-changed-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:09 np0005603622 nova_compute[221301]: 2026-01-31 09:06:09.447 221324 DEBUG nova.compute.manager [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Refreshing instance network info cache due to event network-changed-18a0e279-b8f4-42b8-9c2e-c82c8409abab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:06:09 np0005603622 nova_compute[221301]: 2026-01-31 09:06:09.447 221324 DEBUG oslo_concurrency.lockutils [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:09 np0005603622 nova_compute[221301]: 2026-01-31 09:06:09.448 221324 DEBUG oslo_concurrency.lockutils [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:09 np0005603622 nova_compute[221301]: 2026-01-31 09:06:09.448 221324 DEBUG nova.network.neutron [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Refreshing network info cache for port 18a0e279-b8f4-42b8-9c2e-c82c8409abab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:06:10 np0005603622 nova_compute[221301]: 2026-01-31 09:06:10.455 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:10 np0005603622 nova_compute[221301]: 2026-01-31 09:06:10.556 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:10.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:11.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:11 np0005603622 nova_compute[221301]: 2026-01-31 09:06:11.808 221324 DEBUG nova.network.neutron [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updated VIF entry in instance network info cache for port 18a0e279-b8f4-42b8-9c2e-c82c8409abab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:06:11 np0005603622 nova_compute[221301]: 2026-01-31 09:06:11.809 221324 DEBUG nova.network.neutron [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updating instance_info_cache with network_info: [{"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:11 np0005603622 nova_compute[221301]: 2026-01-31 09:06:11.853 221324 DEBUG oslo_concurrency.lockutils [req-03d54ef9-fce8-4eff-967d-4421326e60d7 req-741c0812-c13c-46b6-b88d-778955adb3fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:12 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:12Z|00979|binding|INFO|Releasing lport 52425cac-5c2e-4922-bfe7-80e01f5f8bf7 from this chassis (sb_readonly=0)
Jan 31 04:06:12 np0005603622 nova_compute[221301]: 2026-01-31 09:06:12.164 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:12.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 e410: 3 total, 3 up, 3 in
Jan 31 04:06:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:13.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:14.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:15 np0005603622 nova_compute[221301]: 2026-01-31 09:06:15.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:15 np0005603622 nova_compute[221301]: 2026-01-31 09:06:15.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:06:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:15.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:15 np0005603622 nova_compute[221301]: 2026-01-31 09:06:15.459 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:15 np0005603622 nova_compute[221301]: 2026-01-31 09:06:15.559 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:15 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:15Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:ff:20 10.100.0.3
Jan 31 04:06:15 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:15Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:ff:20 10.100.0.3
Jan 31 04:06:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:16.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:17 np0005603622 nova_compute[221301]: 2026-01-31 09:06:17.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:17 np0005603622 nova_compute[221301]: 2026-01-31 09:06:17.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:06:17 np0005603622 nova_compute[221301]: 2026-01-31 09:06:17.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:06:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:17.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:17 np0005603622 nova_compute[221301]: 2026-01-31 09:06:17.688 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:17 np0005603622 nova_compute[221301]: 2026-01-31 09:06:17.689 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:17 np0005603622 nova_compute[221301]: 2026-01-31 09:06:17.689 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:06:17 np0005603622 nova_compute[221301]: 2026-01-31 09:06:17.690 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 933259c3-a1cb-4a08-9206-6db46a39bd19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:18.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:06:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:19.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:19 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:19Z|00980|binding|INFO|Releasing lport 52425cac-5c2e-4922-bfe7-80e01f5f8bf7 from this chassis (sb_readonly=0)
Jan 31 04:06:19 np0005603622 nova_compute[221301]: 2026-01-31 09:06:19.843 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:20 np0005603622 nova_compute[221301]: 2026-01-31 09:06:20.211 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updating instance_info_cache with network_info: [{"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:20 np0005603622 nova_compute[221301]: 2026-01-31 09:06:20.248 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:20 np0005603622 nova_compute[221301]: 2026-01-31 09:06:20.248 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:06:20 np0005603622 nova_compute[221301]: 2026-01-31 09:06:20.249 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:20 np0005603622 nova_compute[221301]: 2026-01-31 09:06:20.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:20 np0005603622 nova_compute[221301]: 2026-01-31 09:06:20.562 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:20.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.428 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.429 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.429 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.429 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.430 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:21.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1217188053' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.903 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.983 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000cd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:06:21 np0005603622 nova_compute[221301]: 2026-01-31 09:06:21.986 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000cd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.160 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.161 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4031MB free_disk=20.876556396484375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.162 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.162 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.267 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 933259c3-a1cb-4a08-9206-6db46a39bd19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.267 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.268 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.319 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.460 221324 INFO nova.compute.manager [None req-0a7cf54b-000a-4e56-84cc-1e1d42e39b8c d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Get console output#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.467 259905 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 31 04:06:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4035361874' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.722 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.727 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.745 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.801 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:06:22 np0005603622 nova_compute[221301]: 2026-01-31 09:06:22.801 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:22.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:23.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:24 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:24Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:ff:20 10.100.0.3
Jan 31 04:06:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:06:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:24.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:25.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:25 np0005603622 nova_compute[221301]: 2026-01-31 09:06:25.466 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:25 np0005603622 nova_compute[221301]: 2026-01-31 09:06:25.564 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:26 np0005603622 nova_compute[221301]: 2026-01-31 09:06:26.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:26.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:27 np0005603622 nova_compute[221301]: 2026-01-31 09:06:27.414 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:27 np0005603622 nova_compute[221301]: 2026-01-31 09:06:27.415 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:27.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:28 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:28Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:ff:20 10.100.0.3
Jan 31 04:06:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:28.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:29.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:30 np0005603622 nova_compute[221301]: 2026-01-31 09:06:30.471 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:30 np0005603622 nova_compute[221301]: 2026-01-31 09:06:30.568 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:30.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:30.925 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:30.926 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:30.927 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:31 np0005603622 podman[303809]: 2026-01-31 09:06:31.146071383 +0000 UTC m=+0.058148769 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127)
Jan 31 04:06:31 np0005603622 podman[303808]: 2026-01-31 09:06:31.167265623 +0000 UTC m=+0.079396031 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:06:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:31Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:ff:20 10.100.0.3
Jan 31 04:06:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:31.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.734 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.734 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.734 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.734 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.735 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.736 221324 INFO nova.compute.manager [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Terminating instance#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.737 221324 DEBUG nova.compute.manager [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:06:31 np0005603622 kernel: tap18a0e279-b8 (unregistering): left promiscuous mode
Jan 31 04:06:31 np0005603622 NetworkManager[49080]: <info>  [1769850391.8337] device (tap18a0e279-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.844 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:31Z|00981|binding|INFO|Releasing lport 18a0e279-b8f4-42b8-9c2e-c82c8409abab from this chassis (sb_readonly=0)
Jan 31 04:06:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:31Z|00982|binding|INFO|Setting lport 18a0e279-b8f4-42b8-9c2e-c82c8409abab down in Southbound
Jan 31 04:06:31 np0005603622 ovn_controller[130109]: 2026-01-31T09:06:31Z|00983|binding|INFO|Removing iface tap18a0e279-b8 ovn-installed in OVS
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.847 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.854 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:31.865 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:ff:20 10.100.0.3'], port_security=['fa:16:3e:9f:ff:20 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '933259c3-a1cb-4a08-9206-6db46a39bd19', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-818a5d89-9573-4880-a88d-1838e69ca745', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abf9393aa2b646feb00a3d887a9dee14', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4f953884-6af6-4cb5-82b1-05b5f9c3185d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1517643-63ff-454d-bacd-f8f229c2a3eb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=18a0e279-b8f4-42b8-9c2e-c82c8409abab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:31.867 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 18a0e279-b8f4-42b8-9c2e-c82c8409abab in datapath 818a5d89-9573-4880-a88d-1838e69ca745 unbound from our chassis#033[00m
Jan 31 04:06:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:31.870 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 818a5d89-9573-4880-a88d-1838e69ca745, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:06:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:31.872 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[50239033-5df1-4c0d-a2cc-9d59e6aafef9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:31.873 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745 namespace which is not needed anymore#033[00m
Jan 31 04:06:31 np0005603622 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000cd.scope: Deactivated successfully.
Jan 31 04:06:31 np0005603622 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000cd.scope: Consumed 14.100s CPU time.
Jan 31 04:06:31 np0005603622 systemd-machined[190406]: Machine qemu-101-instance-000000cd terminated.
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.955 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.959 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.971 221324 INFO nova.virt.libvirt.driver [-] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Instance destroyed successfully.#033[00m
Jan 31 04:06:31 np0005603622 nova_compute[221301]: 2026-01-31 09:06:31.972 221324 DEBUG nova.objects.instance [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lazy-loading 'resources' on Instance uuid 933259c3-a1cb-4a08-9206-6db46a39bd19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.011 221324 DEBUG nova.virt.libvirt.vif [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:05:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1952330094',display_name='tempest-TestNetworkBasicOps-server-1952330094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1952330094',id=205,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCN4LKckxbk0fJFpgEsW7Ahr/av0VRwSTVlL/dEnSvPTw7QpJqoVODLyzFAIa3ET+Fky9JC46hRIgJ8C/nVvN0bptl2M6nVdsdxoqFlwbG5+E7zKzAU0FLzToaYbzWbViA==',key_name='tempest-TestNetworkBasicOps-245937908',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:06:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='abf9393aa2b646feb00a3d887a9dee14',ramdisk_id='',reservation_id='r-jt2l8w7c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-104417095',owner_user_name='tempest-TestNetworkBasicOps-104417095-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:06:02Z,user_data=None,user_id='d442c7ba12ed444ca6d4dcc5cfd36150',uuid=933259c3-a1cb-4a08-9206-6db46a39bd19,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.011 221324 DEBUG nova.network.os_vif_util [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converting VIF {"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:06:32 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [NOTICE]   (303448) : haproxy version is 2.8.14-c23fe91
Jan 31 04:06:32 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [NOTICE]   (303448) : path to executable is /usr/sbin/haproxy
Jan 31 04:06:32 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [WARNING]  (303448) : Exiting Master process...
Jan 31 04:06:32 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [WARNING]  (303448) : Exiting Master process...
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.013 221324 DEBUG nova.network.os_vif_util [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:ff:20,bridge_name='br-int',has_traffic_filtering=True,id=18a0e279-b8f4-42b8-9c2e-c82c8409abab,network=Network(818a5d89-9573-4880-a88d-1838e69ca745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18a0e279-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:06:32 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [ALERT]    (303448) : Current worker (303450) exited with code 143 (Terminated)
Jan 31 04:06:32 np0005603622 neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745[303444]: [WARNING]  (303448) : All workers exited. Exiting... (0)
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.015 221324 DEBUG os_vif [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:ff:20,bridge_name='br-int',has_traffic_filtering=True,id=18a0e279-b8f4-42b8-9c2e-c82c8409abab,network=Network(818a5d89-9573-4880-a88d-1838e69ca745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18a0e279-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:06:32 np0005603622 systemd[1]: libpod-43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8.scope: Deactivated successfully.
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.018 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.019 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18a0e279-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.021 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.023 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:32 np0005603622 podman[303880]: 2026-01-31 09:06:32.025139996 +0000 UTC m=+0.054802900 container died 43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.027 221324 INFO os_vif [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:ff:20,bridge_name='br-int',has_traffic_filtering=True,id=18a0e279-b8f4-42b8-9c2e-c82c8409abab,network=Network(818a5d89-9573-4880-a88d-1838e69ca745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18a0e279-b8')#033[00m
Jan 31 04:06:32 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8-userdata-shm.mount: Deactivated successfully.
Jan 31 04:06:32 np0005603622 systemd[1]: var-lib-containers-storage-overlay-48c9f85dcf07076cb444c77d69d9d98dbd2dd9ac7316a792694872ed47140ce2-merged.mount: Deactivated successfully.
Jan 31 04:06:32 np0005603622 podman[303880]: 2026-01-31 09:06:32.074009349 +0000 UTC m=+0.103672253 container cleanup 43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 04:06:32 np0005603622 systemd[1]: libpod-conmon-43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8.scope: Deactivated successfully.
Jan 31 04:06:32 np0005603622 podman[303937]: 2026-01-31 09:06:32.138169476 +0000 UTC m=+0.046302506 container remove 43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.145 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[82f39685-5d0c-4c22-ba4d-b9c7fdaf959f]: (4, ('Sat Jan 31 09:06:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745 (43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8)\n43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8\nSat Jan 31 09:06:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745 (43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8)\n43cf71bd01d8057702faae24e3df1c864f1b0d45fcf103976e7cf5179bf6eaa8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.147 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9a6b5c0f-ea63-43e2-b2fc-bc2ec4a018be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.149 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap818a5d89-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.151 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:32 np0005603622 kernel: tap818a5d89-90: left promiscuous mode
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.160 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.163 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2ea19e-9d09-404f-90cb-c4edd361f0d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.186 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[14d4af26-6448-4239-9fff-729faaa9a343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.187 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3e054598-84c7-4406-b26d-b68fee3bcf54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.199 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5de666-8979-4044-99c3-847cc6c8b47d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 961312, 'reachable_time': 23241, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303953, 'error': None, 'target': 'ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:32 np0005603622 systemd[1]: run-netns-ovnmeta\x2d818a5d89\x2d9573\x2d4880\x2da88d\x2d1838e69ca745.mount: Deactivated successfully.
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.203 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-818a5d89-9573-4880-a88d-1838e69ca745 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:06:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:32.203 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb3baf6-9e68-4cea-88d1-c4fea3df9f44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.526 221324 DEBUG nova.compute.manager [req-dc72d3e7-eb1c-4005-86f9-ac3cae685cd3 req-89fda57b-0c85-4a49-ba1e-146f88777286 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-vif-unplugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.526 221324 DEBUG oslo_concurrency.lockutils [req-dc72d3e7-eb1c-4005-86f9-ac3cae685cd3 req-89fda57b-0c85-4a49-ba1e-146f88777286 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.527 221324 DEBUG oslo_concurrency.lockutils [req-dc72d3e7-eb1c-4005-86f9-ac3cae685cd3 req-89fda57b-0c85-4a49-ba1e-146f88777286 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.528 221324 DEBUG oslo_concurrency.lockutils [req-dc72d3e7-eb1c-4005-86f9-ac3cae685cd3 req-89fda57b-0c85-4a49-ba1e-146f88777286 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.528 221324 DEBUG nova.compute.manager [req-dc72d3e7-eb1c-4005-86f9-ac3cae685cd3 req-89fda57b-0c85-4a49-ba1e-146f88777286 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] No waiting events found dispatching network-vif-unplugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.529 221324 DEBUG nova.compute.manager [req-dc72d3e7-eb1c-4005-86f9-ac3cae685cd3 req-89fda57b-0c85-4a49-ba1e-146f88777286 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-vif-unplugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.772 221324 INFO nova.virt.libvirt.driver [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Deleting instance files /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19_del#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.773 221324 INFO nova.virt.libvirt.driver [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Deletion of /var/lib/nova/instances/933259c3-a1cb-4a08-9206-6db46a39bd19_del complete#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.878 221324 INFO nova.compute.manager [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Took 1.14 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.879 221324 DEBUG oslo.service.loopingcall [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.880 221324 DEBUG nova.compute.manager [-] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:06:32 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.880 221324 DEBUG nova.network.neutron [-] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:06:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:32.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #175. Immutable memtables: 0.
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.941550) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 111] Flushing memtable with next log file: 175
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850392941669, "job": 111, "event": "flush_started", "num_memtables": 1, "num_entries": 850, "num_deletes": 259, "total_data_size": 1515430, "memory_usage": 1532080, "flush_reason": "Manual Compaction"}
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 111] Level-0 flush table #176: started
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850392954398, "cf_name": "default", "job": 111, "event": "table_file_creation", "file_number": 176, "file_size": 987887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 84956, "largest_seqno": 85800, "table_properties": {"data_size": 983844, "index_size": 1758, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9779, "raw_average_key_size": 20, "raw_value_size": 975365, "raw_average_value_size": 2011, "num_data_blocks": 75, "num_entries": 485, "num_filter_entries": 485, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850349, "oldest_key_time": 1769850349, "file_creation_time": 1769850392, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 176, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 111] Flush lasted 12865 microseconds, and 4256 cpu microseconds.
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.954449) [db/flush_job.cc:967] [default] [JOB 111] Level-0 flush table #176: 987887 bytes OK
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.954471) [db/memtable_list.cc:519] [default] Level-0 commit table #176 started
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.966212) [db/memtable_list.cc:722] [default] Level-0 commit table #176: memtable #1 done
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.966242) EVENT_LOG_v1 {"time_micros": 1769850392966233, "job": 111, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.966266) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 111] Try to delete WAL files size 1510961, prev total WAL file size 1510961, number of live WAL files 2.
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000172.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.966946) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033323634' seq:72057594037927935, type:22 .. '6C6F676D0033353135' seq:0, type:0; will stop at (end)
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 112] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 111 Base level 0, inputs: [176(964KB)], [174(11MB)]
Jan 31 04:06:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850392967012, "job": 112, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [176], "files_L6": [174], "score": -1, "input_data_size": 12751446, "oldest_snapshot_seqno": -1}
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.969 221324 DEBUG nova.compute.manager [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-changed-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.970 221324 DEBUG nova.compute.manager [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Refreshing instance network info cache due to event network-changed-18a0e279-b8f4-42b8-9c2e-c82c8409abab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.970 221324 DEBUG oslo_concurrency.lockutils [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.970 221324 DEBUG oslo_concurrency.lockutils [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:06:32 np0005603622 nova_compute[221301]: 2026-01-31 09:06:32.971 221324 DEBUG nova.network.neutron [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Refreshing network info cache for port 18a0e279-b8f4-42b8-9c2e-c82c8409abab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 112] Generated table #177: 10411 keys, 12594219 bytes, temperature: kUnknown
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850393070018, "cf_name": "default", "job": 112, "event": "table_file_creation", "file_number": 177, "file_size": 12594219, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12528081, "index_size": 39017, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 275886, "raw_average_key_size": 26, "raw_value_size": 12347440, "raw_average_value_size": 1185, "num_data_blocks": 1477, "num_entries": 10411, "num_filter_entries": 10411, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850392, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 177, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.070308) [db/compaction/compaction_job.cc:1663] [default] [JOB 112] Compacted 1@0 + 1@6 files to L6 => 12594219 bytes
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.076132) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 123.7 rd, 122.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.2 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(25.7) write-amplify(12.7) OK, records in: 10949, records dropped: 538 output_compression: NoCompression
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.076173) EVENT_LOG_v1 {"time_micros": 1769850393076156, "job": 112, "event": "compaction_finished", "compaction_time_micros": 103080, "compaction_time_cpu_micros": 24222, "output_level": 6, "num_output_files": 1, "total_output_size": 12594219, "num_input_records": 10949, "num_output_records": 10411, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000176.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850393076410, "job": 112, "event": "table_file_deletion", "file_number": 176}
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000174.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850393077179, "job": 112, "event": "table_file_deletion", "file_number": 174}
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:32.966850) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.077266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.077295) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.077299) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.077302) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:06:33.077305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:06:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:33.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:34 np0005603622 nova_compute[221301]: 2026-01-31 09:06:34.837 221324 DEBUG nova.compute.manager [req-3c03c700-11a1-4e87-97ff-f18af9074519 req-963c8d98-d77a-4cf5-9016-6925b91a1f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:34 np0005603622 nova_compute[221301]: 2026-01-31 09:06:34.837 221324 DEBUG oslo_concurrency.lockutils [req-3c03c700-11a1-4e87-97ff-f18af9074519 req-963c8d98-d77a-4cf5-9016-6925b91a1f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:34 np0005603622 nova_compute[221301]: 2026-01-31 09:06:34.837 221324 DEBUG oslo_concurrency.lockutils [req-3c03c700-11a1-4e87-97ff-f18af9074519 req-963c8d98-d77a-4cf5-9016-6925b91a1f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:34 np0005603622 nova_compute[221301]: 2026-01-31 09:06:34.838 221324 DEBUG oslo_concurrency.lockutils [req-3c03c700-11a1-4e87-97ff-f18af9074519 req-963c8d98-d77a-4cf5-9016-6925b91a1f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:34 np0005603622 nova_compute[221301]: 2026-01-31 09:06:34.838 221324 DEBUG nova.compute.manager [req-3c03c700-11a1-4e87-97ff-f18af9074519 req-963c8d98-d77a-4cf5-9016-6925b91a1f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] No waiting events found dispatching network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:06:34 np0005603622 nova_compute[221301]: 2026-01-31 09:06:34.838 221324 WARNING nova.compute.manager [req-3c03c700-11a1-4e87-97ff-f18af9074519 req-963c8d98-d77a-4cf5-9016-6925b91a1f30 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received unexpected event network-vif-plugged-18a0e279-b8f4-42b8-9c2e-c82c8409abab for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:06:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:34.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.004 221324 DEBUG nova.network.neutron [-] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.032 221324 INFO nova.compute.manager [-] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Took 2.15 seconds to deallocate network for instance.#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.092 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.093 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.174 221324 DEBUG oslo_concurrency.processutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:06:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:35.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.570 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:06:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/364859204' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.606 221324 DEBUG oslo_concurrency.processutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.615 221324 DEBUG nova.compute.provider_tree [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.639 221324 DEBUG nova.scheduler.client.report [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.664 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.728 221324 INFO nova.scheduler.client.report [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Deleted allocations for instance 933259c3-a1cb-4a08-9206-6db46a39bd19#033[00m
Jan 31 04:06:35 np0005603622 nova_compute[221301]: 2026-01-31 09:06:35.887 221324 DEBUG oslo_concurrency.lockutils [None req-c752c267-373c-461b-9fae-4d33427b4bb8 d442c7ba12ed444ca6d4dcc5cfd36150 abf9393aa2b646feb00a3d887a9dee14 - - default default] Lock "933259c3-a1cb-4a08-9206-6db46a39bd19" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:06:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:36 np0005603622 nova_compute[221301]: 2026-01-31 09:06:36.774 221324 DEBUG nova.network.neutron [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updated VIF entry in instance network info cache for port 18a0e279-b8f4-42b8-9c2e-c82c8409abab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:06:36 np0005603622 nova_compute[221301]: 2026-01-31 09:06:36.775 221324 DEBUG nova.network.neutron [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Updating instance_info_cache with network_info: [{"id": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "address": "fa:16:3e:9f:ff:20", "network": {"id": "818a5d89-9573-4880-a88d-1838e69ca745", "bridge": "br-int", "label": "tempest-network-smoke--599157447", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abf9393aa2b646feb00a3d887a9dee14", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18a0e279-b8", "ovs_interfaceid": "18a0e279-b8f4-42b8-9c2e-c82c8409abab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:06:36 np0005603622 nova_compute[221301]: 2026-01-31 09:06:36.821 221324 DEBUG oslo_concurrency.lockutils [req-8cf3e236-c0bb-425e-90cf-7c69153f4a6b req-f5a8e882-69e3-4f07-9255-af32d59dabaf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-933259c3-a1cb-4a08-9206-6db46a39bd19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:06:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:36.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:37 np0005603622 nova_compute[221301]: 2026-01-31 09:06:37.005 221324 DEBUG nova.compute.manager [req-d8e3c29e-5376-4170-9dff-b82a98bfc4dd req-56770042-6a2a-4788-843f-1d4a9d9a3471 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Received event network-vif-deleted-18a0e279-b8f4-42b8-9c2e-c82c8409abab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:06:37 np0005603622 nova_compute[221301]: 2026-01-31 09:06:37.022 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:37.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:38.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:39.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:40 np0005603622 nova_compute[221301]: 2026-01-31 09:06:40.573 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:40.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:41.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:42 np0005603622 nova_compute[221301]: 2026-01-31 09:06:42.025 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:42 np0005603622 nova_compute[221301]: 2026-01-31 09:06:42.204 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:42 np0005603622 nova_compute[221301]: 2026-01-31 09:06:42.291 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:42.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:43.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:44.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:45.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:45 np0005603622 nova_compute[221301]: 2026-01-31 09:06:45.574 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:46.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:46 np0005603622 nova_compute[221301]: 2026-01-31 09:06:46.969 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850391.9685466, 933259c3-a1cb-4a08-9206-6db46a39bd19 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:06:46 np0005603622 nova_compute[221301]: 2026-01-31 09:06:46.970 221324 INFO nova.compute.manager [-] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:06:47 np0005603622 nova_compute[221301]: 2026-01-31 09:06:47.004 221324 DEBUG nova.compute.manager [None req-d1eff47f-a713-49e0-bd1b-390afa0613f4 - - - - - -] [instance: 933259c3-a1cb-4a08-9206-6db46a39bd19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:06:47 np0005603622 nova_compute[221301]: 2026-01-31 09:06:47.028 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:47.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:49.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:50 np0005603622 nova_compute[221301]: 2026-01-31 09:06:50.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:06:50 np0005603622 nova_compute[221301]: 2026-01-31 09:06:50.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:06:50 np0005603622 nova_compute[221301]: 2026-01-31 09:06:50.577 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:50.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:51.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:52 np0005603622 nova_compute[221301]: 2026-01-31 09:06:52.030 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:52.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:53.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:06:54 np0005603622 nova_compute[221301]: 2026-01-31 09:06:54.139 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:54.141 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:06:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:54.142 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:06:54 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:06:54.142 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:06:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:54.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:55.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:55 np0005603622 nova_compute[221301]: 2026-01-31 09:06:55.579 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:06:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:56.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:57 np0005603622 nova_compute[221301]: 2026-01-31 09:06:57.033 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:06:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:06:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:57.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:06:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:06:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:06:58.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:06:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:06:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:06:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:06:59.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:00 np0005603622 nova_compute[221301]: 2026-01-31 09:07:00.581 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:00.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:01.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:02 np0005603622 nova_compute[221301]: 2026-01-31 09:07:02.035 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:02 np0005603622 podman[303982]: 2026-01-31 09:07:02.206510254 +0000 UTC m=+0.093947906 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:07:02 np0005603622 podman[303981]: 2026-01-31 09:07:02.21918551 +0000 UTC m=+0.115738103 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:07:02 np0005603622 nova_compute[221301]: 2026-01-31 09:07:02.623 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:02 np0005603622 nova_compute[221301]: 2026-01-31 09:07:02.624 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:02.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:02 np0005603622 nova_compute[221301]: 2026-01-31 09:07:02.966 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.086 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.087 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.094 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.094 221324 INFO nova.compute.claims [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.224 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:03.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:03 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3992958546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.652 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.659 221324 DEBUG nova.compute.provider_tree [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.679 221324 DEBUG nova.scheduler.client.report [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.706 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.706 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.764 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.765 221324 DEBUG nova.network.neutron [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.797 221324 INFO nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.854 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:07:03 np0005603622 nova_compute[221301]: 2026-01-31 09:07:03.966 221324 INFO nova.virt.block_device [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Booting with volume b457f046-29d9-4842-87e2-ad26874ea748 at /dev/vda#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.170 221324 DEBUG nova.policy [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc42b92a5dd34d32b6b184bdc7acb092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76ce367a834b49dfb5b436848118b860', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.262 221324 DEBUG os_brick.utils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.265 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.279 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.279 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[3af6e330-d6f4-471c-932b-d5c107ae4529]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.281 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.290 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.290 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[26d4154e-335e-4ed9-bc54-c86affdbdad2]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.293 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.301 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.301 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[cec41150-d8ce-4be7-a57a-9d0ff03f8b8e]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.303 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[94b6677b-5d47-450d-b28e-9938c58ad7af]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.304 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.329 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.333 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.333 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.334 221324 DEBUG os_brick.initiator.connectors.lightos [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.334 221324 DEBUG os_brick.utils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] <== get_connector_properties: return (71ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:07:04 np0005603622 nova_compute[221301]: 2026-01-31 09:07:04.335 221324 DEBUG nova.virt.block_device [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating existing volume attachment record: e215381c-08ee-4176-a743-9d9c114514bb _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:07:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:04.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:07:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4033757799' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:07:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:05.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:05 np0005603622 nova_compute[221301]: 2026-01-31 09:07:05.583 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.080 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.083 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.084 221324 INFO nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Creating image(s)#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.084 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.084 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Ensure instance console log exists: /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.085 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.085 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.085 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:06 np0005603622 nova_compute[221301]: 2026-01-31 09:07:06.722 221324 DEBUG nova.network.neutron [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Successfully created port: 8c752671-f32b-4648-8978-c2ba2b140e68 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:07:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:06.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:07 np0005603622 nova_compute[221301]: 2026-01-31 09:07:07.039 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:07 np0005603622 nova_compute[221301]: 2026-01-31 09:07:07.427 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:07 np0005603622 nova_compute[221301]: 2026-01-31 09:07:07.427 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:07:07 np0005603622 nova_compute[221301]: 2026-01-31 09:07:07.449 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:07:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:07.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:08 np0005603622 nova_compute[221301]: 2026-01-31 09:07:08.873 221324 DEBUG nova.network.neutron [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Successfully updated port: 8c752671-f32b-4648-8978-c2ba2b140e68 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:07:08 np0005603622 nova_compute[221301]: 2026-01-31 09:07:08.910 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:08 np0005603622 nova_compute[221301]: 2026-01-31 09:07:08.911 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquired lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:08 np0005603622 nova_compute[221301]: 2026-01-31 09:07:08.911 221324 DEBUG nova.network.neutron [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:07:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:08.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:09 np0005603622 nova_compute[221301]: 2026-01-31 09:07:09.097 221324 DEBUG nova.compute.manager [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-changed-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:09 np0005603622 nova_compute[221301]: 2026-01-31 09:07:09.098 221324 DEBUG nova.compute.manager [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Refreshing instance network info cache due to event network-changed-8c752671-f32b-4648-8978-c2ba2b140e68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:07:09 np0005603622 nova_compute[221301]: 2026-01-31 09:07:09.098 221324 DEBUG oslo_concurrency.lockutils [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:09 np0005603622 nova_compute[221301]: 2026-01-31 09:07:09.203 221324 DEBUG nova.network.neutron [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:07:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:09.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:10 np0005603622 nova_compute[221301]: 2026-01-31 09:07:10.585 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:10.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.258 221324 DEBUG nova.network.neutron [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating instance_info_cache with network_info: [{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.282 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Releasing lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.283 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Instance network_info: |[{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.284 221324 DEBUG oslo_concurrency.lockutils [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.284 221324 DEBUG nova.network.neutron [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Refreshing network info cache for port 8c752671-f32b-4648-8978-c2ba2b140e68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.290 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Start _get_guest_xml network_info=[{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'boot_index': 0, 'attachment_id': 'e215381c-08ee-4176-a743-9d9c114514bb', 'mount_device': '/dev/vda', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-b457f046-29d9-4842-87e2-ad26874ea748', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'b457f046-29d9-4842-87e2-ad26874ea748', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '84bda6d6-4f33-40a6-9f4c-478d073878e1', 'attached_at': '', 'detached_at': '', 'volume_id': 'b457f046-29d9-4842-87e2-ad26874ea748', 'serial': 'b457f046-29d9-4842-87e2-ad26874ea748'}, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.297 221324 WARNING nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.303 221324 DEBUG nova.virt.libvirt.host [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.304 221324 DEBUG nova.virt.libvirt.host [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.312 221324 DEBUG nova.virt.libvirt.host [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.313 221324 DEBUG nova.virt.libvirt.host [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.315 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.315 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.316 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.317 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.317 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.317 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.318 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.318 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.319 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.319 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.320 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.320 221324 DEBUG nova.virt.hardware [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.365 221324 DEBUG nova.storage.rbd_utils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 84bda6d6-4f33-40a6-9f4c-478d073878e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.370 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:11.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:07:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/163341203' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.817 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.852 221324 DEBUG nova.virt.libvirt.vif [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1036819813',display_name='tempest-TestVolumeBootPattern-server-1036819813',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1036819813',id=208,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNWvTxk1zh2OCmPH3tumEbxR7y880uhj4vJDAspX9r3EATf0w5oe5DG3NVBcNRbWTPcgVwlnXcyaRQZseLc7edDTe4kwfjogsRoplvkAsMWW9sCSaJlX0XBkMxl/Ghv8Fw==',key_name='tempest-TestVolumeBootPattern-11482540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-f7cqfwfi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:07:03Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=84bda6d6-4f33-40a6-9f4c-478d073878e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.853 221324 DEBUG nova.network.os_vif_util [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.854 221324 DEBUG nova.network.os_vif_util [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:4d:e0,bridge_name='br-int',has_traffic_filtering=True,id=8c752671-f32b-4648-8978-c2ba2b140e68,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c752671-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.855 221324 DEBUG nova.objects.instance [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'pci_devices' on Instance uuid 84bda6d6-4f33-40a6-9f4c-478d073878e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.871 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <uuid>84bda6d6-4f33-40a6-9f4c-478d073878e1</uuid>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <name>instance-000000d0</name>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestVolumeBootPattern-server-1036819813</nova:name>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:07:11</nova:creationTime>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:user uuid="dc42b92a5dd34d32b6b184bdc7acb092">tempest-TestVolumeBootPattern-1392945362-project-member</nova:user>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:project uuid="76ce367a834b49dfb5b436848118b860">tempest-TestVolumeBootPattern-1392945362</nova:project>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <nova:port uuid="8c752671-f32b-4648-8978-c2ba2b140e68">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <entry name="serial">84bda6d6-4f33-40a6-9f4c-478d073878e1</entry>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <entry name="uuid">84bda6d6-4f33-40a6-9f4c-478d073878e1</entry>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/84bda6d6-4f33-40a6-9f4c-478d073878e1_disk.config">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="volumes/volume-b457f046-29d9-4842-87e2-ad26874ea748">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <serial>b457f046-29d9-4842-87e2-ad26874ea748</serial>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:cb:4d:e0"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <target dev="tap8c752671-f3"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/console.log" append="off"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:07:11 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:07:11 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:07:11 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:07:11 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.872 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Preparing to wait for external event network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.872 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.872 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.872 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.873 221324 DEBUG nova.virt.libvirt.vif [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1036819813',display_name='tempest-TestVolumeBootPattern-server-1036819813',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1036819813',id=208,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNWvTxk1zh2OCmPH3tumEbxR7y880uhj4vJDAspX9r3EATf0w5oe5DG3NVBcNRbWTPcgVwlnXcyaRQZseLc7edDTe4kwfjogsRoplvkAsMWW9sCSaJlX0XBkMxl/Ghv8Fw==',key_name='tempest-TestVolumeBootPattern-11482540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-f7cqfwfi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:07:03Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=84bda6d6-4f33-40a6-9f4c-478d073878e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.873 221324 DEBUG nova.network.os_vif_util [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.874 221324 DEBUG nova.network.os_vif_util [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:4d:e0,bridge_name='br-int',has_traffic_filtering=True,id=8c752671-f32b-4648-8978-c2ba2b140e68,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c752671-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.874 221324 DEBUG os_vif [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:4d:e0,bridge_name='br-int',has_traffic_filtering=True,id=8c752671-f32b-4648-8978-c2ba2b140e68,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c752671-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.874 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.875 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.875 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.878 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c752671-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.878 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c752671-f3, col_values=(('external_ids', {'iface-id': '8c752671-f32b-4648-8978-c2ba2b140e68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:4d:e0', 'vm-uuid': '84bda6d6-4f33-40a6-9f4c-478d073878e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.880 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:11 np0005603622 NetworkManager[49080]: <info>  [1769850431.8815] manager: (tap8c752671-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/431)
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.882 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.884 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.885 221324 INFO os_vif [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:4d:e0,bridge_name='br-int',has_traffic_filtering=True,id=8c752671-f32b-4648-8978-c2ba2b140e68,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c752671-f3')#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.967 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.968 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.969 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] No VIF found with MAC fa:16:3e:cb:4d:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:07:11 np0005603622 nova_compute[221301]: 2026-01-31 09:07:11.970 221324 INFO nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Using config drive#033[00m
Jan 31 04:07:12 np0005603622 nova_compute[221301]: 2026-01-31 09:07:12.004 221324 DEBUG nova.storage.rbd_utils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 84bda6d6-4f33-40a6-9f4c-478d073878e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:12 np0005603622 nova_compute[221301]: 2026-01-31 09:07:12.888 221324 INFO nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Creating config drive at /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/disk.config#033[00m
Jan 31 04:07:12 np0005603622 nova_compute[221301]: 2026-01-31 09:07:12.896 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphtzqtktv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:12.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.031 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphtzqtktv" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.073 221324 DEBUG nova.storage.rbd_utils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] rbd image 84bda6d6-4f33-40a6-9f4c-478d073878e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.078 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/disk.config 84bda6d6-4f33-40a6-9f4c-478d073878e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.261 221324 DEBUG oslo_concurrency.processutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/disk.config 84bda6d6-4f33-40a6-9f4c-478d073878e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.262 221324 INFO nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Deleting local config drive /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1/disk.config because it was imported into RBD.#033[00m
Jan 31 04:07:13 np0005603622 kernel: tap8c752671-f3: entered promiscuous mode
Jan 31 04:07:13 np0005603622 NetworkManager[49080]: <info>  [1769850433.3051] manager: (tap8c752671-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/432)
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.307 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:13Z|00984|binding|INFO|Claiming lport 8c752671-f32b-4648-8978-c2ba2b140e68 for this chassis.
Jan 31 04:07:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:13Z|00985|binding|INFO|8c752671-f32b-4648-8978-c2ba2b140e68: Claiming fa:16:3e:cb:4d:e0 10.100.0.4
Jan 31 04:07:13 np0005603622 systemd-udevd[304164]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.350 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:13 np0005603622 NetworkManager[49080]: <info>  [1769850433.3546] device (tap8c752671-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:07:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:13Z|00986|binding|INFO|Setting lport 8c752671-f32b-4648-8978-c2ba2b140e68 ovn-installed in OVS
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.355 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:13 np0005603622 NetworkManager[49080]: <info>  [1769850433.3563] device (tap8c752671-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:07:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:13Z|00987|binding|INFO|Setting lport 8c752671-f32b-4648-8978-c2ba2b140e68 up in Southbound
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.369 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:4d:e0 10.100.0.4'], port_security=['fa:16:3e:cb:4d:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '84bda6d6-4f33-40a6-9f4c-478d073878e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cea15428-ed6f-44a7-98e5-24c0fab7b796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=8c752671-f32b-4648-8978-c2ba2b140e68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.371 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 8c752671-f32b-4648-8978-c2ba2b140e68 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 bound to our chassis#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.372 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 650eb345-8346-4e8f-8e83-eeb0117654f6#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.380 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[fda0ca44-3799-4478-a562-161e3e2be3a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.381 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap650eb345-81 in ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.384 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap650eb345-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.384 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a57446-6170-49c1-aac9-db994b9f50bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.385 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ed908448-97d2-42b6-bead-f2de98e6bc7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 systemd-machined[190406]: New machine qemu-102-instance-000000d0.
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.396 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[dae8dc3f-8c54-42ed-9bdf-07b8eb820b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 systemd[1]: Started Virtual Machine qemu-102-instance-000000d0.
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.417 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[804b44d4-bd29-4f08-bcb0-864a9dcd15ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.447 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a2855a59-7427-487b-a53a-08118be6e2c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 NetworkManager[49080]: <info>  [1769850433.4562] manager: (tap650eb345-80): new Veth device (/org/freedesktop/NetworkManager/Devices/433)
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.455 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2ef4df-78a4-4a04-8659-3e5b9dd9d4d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.494 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[e48fff35-dcf1-4455-ae10-922aabfb5284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.501 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f413ace4-53be-4083-91c2-127c8408eefe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:13.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:13 np0005603622 NetworkManager[49080]: <info>  [1769850433.5216] device (tap650eb345-80): carrier: link connected
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.528 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[692d224b-b696-47d4-a366-d000770cb374]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.540 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[adaf3bb3-7744-419f-b8fb-2fc28cec52ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 296], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 968486, 'reachable_time': 22384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304200, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.553 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5587ca-0b4e-4db3-a67c-a69ccffe0f50]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:27ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 968486, 'tstamp': 968486}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304201, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.569 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7e84963f-6aa1-4052-83cb-3764c7e1b9c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap650eb345-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:27:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 296], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 968486, 'reachable_time': 22384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304202, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.596 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[882002bf-0144-4051-b6e7-d91f31aebb6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.674 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[436f3c1d-d445-4e1c-8e9b-f15bdda5fd42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.676 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.677 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.678 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650eb345-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:13 np0005603622 NetworkManager[49080]: <info>  [1769850433.6814] manager: (tap650eb345-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/434)
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.680 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:13 np0005603622 kernel: tap650eb345-80: entered promiscuous mode
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.684 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.686 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap650eb345-80, col_values=(('external_ids', {'iface-id': '74bde109-0188-4ce3-87c3-02a3eb853dc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.687 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:13 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:13Z|00988|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.697 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.698 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7bf2f7-653c-42bd-9c19-a68de01f4b52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.700 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/650eb345-8346-4e8f-8e83-eeb0117654f6.pid.haproxy
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 650eb345-8346-4e8f-8e83-eeb0117654f6
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:07:13 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:13.701 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'env', 'PROCESS_TAG=haproxy-650eb345-8346-4e8f-8e83-eeb0117654f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/650eb345-8346-4e8f-8e83-eeb0117654f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.889 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850433.8884897, 84bda6d6-4f33-40a6-9f4c-478d073878e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.889 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] VM Started (Lifecycle Event)#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.970 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.978 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850433.8886395, 84bda6d6-4f33-40a6-9f4c-478d073878e1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:13 np0005603622 nova_compute[221301]: 2026-01-31 09:07:13.979 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.021 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.026 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.077 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:07:14 np0005603622 podman[304276]: 2026-01-31 09:07:14.085980901 +0000 UTC m=+0.068204386 container create f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:07:14 np0005603622 podman[304276]: 2026-01-31 09:07:14.049918817 +0000 UTC m=+0.032142302 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:07:14 np0005603622 systemd[1]: Started libpod-conmon-f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5.scope.
Jan 31 04:07:14 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:07:14 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c76444c68add129a9728d8b2bb824def762e9efd1a5e998826d92e207f973aff/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:07:14 np0005603622 podman[304276]: 2026-01-31 09:07:14.211759208 +0000 UTC m=+0.193982663 container init f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 31 04:07:14 np0005603622 podman[304276]: 2026-01-31 09:07:14.218092825 +0000 UTC m=+0.200316270 container start f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:07:14 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [NOTICE]   (304295) : New worker (304297) forked
Jan 31 04:07:14 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [NOTICE]   (304295) : Loading success.
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.474 221324 DEBUG nova.compute.manager [req-0c413ca6-e194-4dbb-aaef-da4651d3882d req-efae7973-ea4d-4119-8367-d75a8b1716a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.474 221324 DEBUG oslo_concurrency.lockutils [req-0c413ca6-e194-4dbb-aaef-da4651d3882d req-efae7973-ea4d-4119-8367-d75a8b1716a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.474 221324 DEBUG oslo_concurrency.lockutils [req-0c413ca6-e194-4dbb-aaef-da4651d3882d req-efae7973-ea4d-4119-8367-d75a8b1716a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.474 221324 DEBUG oslo_concurrency.lockutils [req-0c413ca6-e194-4dbb-aaef-da4651d3882d req-efae7973-ea4d-4119-8367-d75a8b1716a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.475 221324 DEBUG nova.compute.manager [req-0c413ca6-e194-4dbb-aaef-da4651d3882d req-efae7973-ea4d-4119-8367-d75a8b1716a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Processing event network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.475 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.480 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850434.4803646, 84bda6d6-4f33-40a6-9f4c-478d073878e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.480 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.482 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.485 221324 INFO nova.virt.libvirt.driver [-] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Instance spawned successfully.#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.486 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.569 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.569 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.570 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.570 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.570 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.571 221324 DEBUG nova.virt.libvirt.driver [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.589 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.594 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.742 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.821 221324 INFO nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Took 8.74 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.821 221324 DEBUG nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:14.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.992 221324 DEBUG nova.network.neutron [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updated VIF entry in instance network info cache for port 8c752671-f32b-4648-8978-c2ba2b140e68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.993 221324 DEBUG nova.network.neutron [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating instance_info_cache with network_info: [{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:14 np0005603622 nova_compute[221301]: 2026-01-31 09:07:14.997 221324 INFO nova.compute.manager [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Took 11.95 seconds to build instance.#033[00m
Jan 31 04:07:15 np0005603622 nova_compute[221301]: 2026-01-31 09:07:15.024 221324 DEBUG oslo_concurrency.lockutils [req-310bef07-5cb7-4d94-a657-8835ca2bbfff req-cf306546-845a-4fbc-8e09-6d650e22104f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:15 np0005603622 nova_compute[221301]: 2026-01-31 09:07:15.035 221324 DEBUG oslo_concurrency.lockutils [None req-a5fc0e36-4a27-414a-b6fa-9b939bc9cf49 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:15.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:15 np0005603622 nova_compute[221301]: 2026-01-31 09:07:15.588 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:16 np0005603622 nova_compute[221301]: 2026-01-31 09:07:16.782 221324 DEBUG nova.compute.manager [req-f743b711-2953-4037-b0b2-c46d3cb2df0c req-3743fcca-ebbd-42dc-b0e8-509be5cd2614 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:16 np0005603622 nova_compute[221301]: 2026-01-31 09:07:16.782 221324 DEBUG oslo_concurrency.lockutils [req-f743b711-2953-4037-b0b2-c46d3cb2df0c req-3743fcca-ebbd-42dc-b0e8-509be5cd2614 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:16 np0005603622 nova_compute[221301]: 2026-01-31 09:07:16.782 221324 DEBUG oslo_concurrency.lockutils [req-f743b711-2953-4037-b0b2-c46d3cb2df0c req-3743fcca-ebbd-42dc-b0e8-509be5cd2614 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:16 np0005603622 nova_compute[221301]: 2026-01-31 09:07:16.782 221324 DEBUG oslo_concurrency.lockutils [req-f743b711-2953-4037-b0b2-c46d3cb2df0c req-3743fcca-ebbd-42dc-b0e8-509be5cd2614 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:16 np0005603622 nova_compute[221301]: 2026-01-31 09:07:16.783 221324 DEBUG nova.compute.manager [req-f743b711-2953-4037-b0b2-c46d3cb2df0c req-3743fcca-ebbd-42dc-b0e8-509be5cd2614 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] No waiting events found dispatching network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:07:16 np0005603622 nova_compute[221301]: 2026-01-31 09:07:16.783 221324 WARNING nova.compute.manager [req-f743b711-2953-4037-b0b2-c46d3cb2df0c req-3743fcca-ebbd-42dc-b0e8-509be5cd2614 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received unexpected event network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:07:16 np0005603622 nova_compute[221301]: 2026-01-31 09:07:16.881 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:16.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:17 np0005603622 nova_compute[221301]: 2026-01-31 09:07:17.418 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:17 np0005603622 nova_compute[221301]: 2026-01-31 09:07:17.419 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:07:17 np0005603622 nova_compute[221301]: 2026-01-31 09:07:17.419 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:07:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:17 np0005603622 nova_compute[221301]: 2026-01-31 09:07:17.739 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:17 np0005603622 nova_compute[221301]: 2026-01-31 09:07:17.740 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:17 np0005603622 nova_compute[221301]: 2026-01-31 09:07:17.740 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:07:17 np0005603622 nova_compute[221301]: 2026-01-31 09:07:17.740 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 84bda6d6-4f33-40a6-9f4c-478d073878e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:07:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:18.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:19.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:20 np0005603622 nova_compute[221301]: 2026-01-31 09:07:20.590 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:20.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:21.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.884 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.935 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating instance_info_cache with network_info: [{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.978 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.979 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.980 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.980 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.980 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:21 np0005603622 nova_compute[221301]: 2026-01-31 09:07:21.980 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.433 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.433 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.433 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1331362542' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:22 np0005603622 nova_compute[221301]: 2026-01-31 09:07:22.893 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:22.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.026 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.026 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.165 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.166 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4113MB free_disk=20.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.166 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.167 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.317 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 84bda6d6-4f33-40a6-9f4c-478d073878e1 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.318 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.318 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.364 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.450 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.450 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.477 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:07:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:23.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.549 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:07:23 np0005603622 nova_compute[221301]: 2026-01-31 09:07:23.717 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1274268116' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.200 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.206 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.231 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.272 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.273 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.392 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603622 NetworkManager[49080]: <info>  [1769850444.3939] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/435)
Jan 31 04:07:24 np0005603622 NetworkManager[49080]: <info>  [1769850444.3962] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/436)
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:24Z|00989|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:07:24 np0005603622 nova_compute[221301]: 2026-01-31 09:07:24.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:07:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:07:24 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:07:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:24.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:25 np0005603622 nova_compute[221301]: 2026-01-31 09:07:25.484 221324 DEBUG nova.compute.manager [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-changed-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:25 np0005603622 nova_compute[221301]: 2026-01-31 09:07:25.485 221324 DEBUG nova.compute.manager [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Refreshing instance network info cache due to event network-changed-8c752671-f32b-4648-8978-c2ba2b140e68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:07:25 np0005603622 nova_compute[221301]: 2026-01-31 09:07:25.485 221324 DEBUG oslo_concurrency.lockutils [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:25 np0005603622 nova_compute[221301]: 2026-01-31 09:07:25.485 221324 DEBUG oslo_concurrency.lockutils [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:25 np0005603622 nova_compute[221301]: 2026-01-31 09:07:25.486 221324 DEBUG nova.network.neutron [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Refreshing network info cache for port 8c752671-f32b-4648-8978-c2ba2b140e68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:07:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:25.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:25 np0005603622 nova_compute[221301]: 2026-01-31 09:07:25.593 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:26 np0005603622 nova_compute[221301]: 2026-01-31 09:07:26.887 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:26.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:27.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:28 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:28Z|00126|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.4
Jan 31 04:07:28 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:28Z|00127|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:cb:4d:e0 10.100.0.4
Jan 31 04:07:28 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:28Z|00990|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:07:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:28.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:29 np0005603622 nova_compute[221301]: 2026-01-31 09:07:29.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:29 np0005603622 nova_compute[221301]: 2026-01-31 09:07:29.272 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:29 np0005603622 nova_compute[221301]: 2026-01-31 09:07:29.296 221324 DEBUG nova.network.neutron [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updated VIF entry in instance network info cache for port 8c752671-f32b-4648-8978-c2ba2b140e68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:07:29 np0005603622 nova_compute[221301]: 2026-01-31 09:07:29.297 221324 DEBUG nova.network.neutron [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating instance_info_cache with network_info: [{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:29 np0005603622 nova_compute[221301]: 2026-01-31 09:07:29.351 221324 DEBUG oslo_concurrency.lockutils [req-ba36596d-e763-4971-9b00-15837236e4b0 req-94717ea4-8e0f-4bde-a885-88f25c603dc1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:29 np0005603622 nova_compute[221301]: 2026-01-31 09:07:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:29.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:07:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:07:30 np0005603622 nova_compute[221301]: 2026-01-31 09:07:30.596 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:30.926 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:30.927 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:30.928 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:30.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:31.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:31 np0005603622 nova_compute[221301]: 2026-01-31 09:07:31.771 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:31.771 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:07:31 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:31.773 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:07:31 np0005603622 nova_compute[221301]: 2026-01-31 09:07:31.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:32.775 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:32Z|00128|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.4
Jan 31 04:07:32 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:32Z|00129|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:cb:4d:e0 10.100.0.4
Jan 31 04:07:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:32.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:33 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:33Z|00130|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:4d:e0 10.100.0.4
Jan 31 04:07:33 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:33Z|00131|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:4d:e0 10.100.0.4
Jan 31 04:07:33 np0005603622 podman[304534]: 2026-01-31 09:07:33.136947594 +0000 UTC m=+0.055283633 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 04:07:33 np0005603622 podman[304533]: 2026-01-31 09:07:33.161815432 +0000 UTC m=+0.078765294 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 04:07:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:33.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:34 np0005603622 nova_compute[221301]: 2026-01-31 09:07:34.812 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:34.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:35.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:35 np0005603622 nova_compute[221301]: 2026-01-31 09:07:35.597 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:36 np0005603622 nova_compute[221301]: 2026-01-31 09:07:36.891 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:36.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:37.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:39.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:39.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:40 np0005603622 nova_compute[221301]: 2026-01-31 09:07:40.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:41.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:41.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:41 np0005603622 nova_compute[221301]: 2026-01-31 09:07:41.798 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:41 np0005603622 nova_compute[221301]: 2026-01-31 09:07:41.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 e411: 3 total, 3 up, 3 in
Jan 31 04:07:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:43.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:43.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:45.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:45.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:45 np0005603622 nova_compute[221301]: 2026-01-31 09:07:45.603 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:46 np0005603622 nova_compute[221301]: 2026-01-31 09:07:46.896 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:47.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:47.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:48 np0005603622 nova_compute[221301]: 2026-01-31 09:07:48.945 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:48 np0005603622 nova_compute[221301]: 2026-01-31 09:07:48.945 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:48 np0005603622 nova_compute[221301]: 2026-01-31 09:07:48.964 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:07:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:49.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.093 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.093 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.102 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.103 221324 INFO nova.compute.claims [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.272 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:07:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:49.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:07:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/625046874' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.759 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.765 221324 DEBUG nova.compute.provider_tree [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.803 221324 DEBUG nova.scheduler.client.report [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.850 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.851 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.931 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.931 221324 DEBUG nova.network.neutron [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.964 221324 INFO nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:07:49 np0005603622 nova_compute[221301]: 2026-01-31 09:07:49.988 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.159 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.161 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.162 221324 INFO nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Creating image(s)#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.202 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.252 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.276 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.279 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.301 221324 DEBUG nova.policy [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ebd43008d7a64b8bbf97a2304b1f78b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.333 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.333 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.334 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.334 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.356 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.359 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:50 np0005603622 nova_compute[221301]: 2026-01-31 09:07:50.608 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 04:07:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:51.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 04:07:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:51.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:51 np0005603622 nova_compute[221301]: 2026-01-31 09:07:51.821 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:51 np0005603622 nova_compute[221301]: 2026-01-31 09:07:51.903 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:51 np0005603622 nova_compute[221301]: 2026-01-31 09:07:51.909 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] resizing rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:07:52 np0005603622 nova_compute[221301]: 2026-01-31 09:07:52.063 221324 DEBUG nova.objects.instance [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'migration_context' on Instance uuid 35d6cbea-df30-4560-8421-c97ad1fb0a36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:07:52 np0005603622 nova_compute[221301]: 2026-01-31 09:07:52.084 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:07:52 np0005603622 nova_compute[221301]: 2026-01-31 09:07:52.084 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Ensure instance console log exists: /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:07:52 np0005603622 nova_compute[221301]: 2026-01-31 09:07:52.085 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:52 np0005603622 nova_compute[221301]: 2026-01-31 09:07:52.085 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:52 np0005603622 nova_compute[221301]: 2026-01-31 09:07:52.085 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:52 np0005603622 nova_compute[221301]: 2026-01-31 09:07:52.224 221324 DEBUG nova.network.neutron [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Successfully created port: 1ad58e77-d333-473c-bf1c-2936cfc4473f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:07:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:53.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.524 221324 DEBUG nova.network.neutron [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Successfully updated port: 1ad58e77-d333-473c-bf1c-2936cfc4473f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.545 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.546 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquired lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.546 221324 DEBUG nova.network.neutron [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:07:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:53.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.707 221324 DEBUG nova.compute.manager [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-changed-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.708 221324 DEBUG nova.compute.manager [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Refreshing instance network info cache due to event network-changed-1ad58e77-d333-473c-bf1c-2936cfc4473f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.709 221324 DEBUG oslo_concurrency.lockutils [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:07:53 np0005603622 nova_compute[221301]: 2026-01-31 09:07:53.806 221324 DEBUG nova.network.neutron [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:07:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:55.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.042 221324 DEBUG nova.network.neutron [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updating instance_info_cache with network_info: [{"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.070 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Releasing lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.070 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Instance network_info: |[{"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.071 221324 DEBUG oslo_concurrency.lockutils [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.071 221324 DEBUG nova.network.neutron [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Refreshing network info cache for port 1ad58e77-d333-473c-bf1c-2936cfc4473f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.077 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Start _get_guest_xml network_info=[{"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.085 221324 WARNING nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.092 221324 DEBUG nova.virt.libvirt.host [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.093 221324 DEBUG nova.virt.libvirt.host [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.102 221324 DEBUG nova.virt.libvirt.host [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.103 221324 DEBUG nova.virt.libvirt.host [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.105 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.106 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.106 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.106 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.106 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.106 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.106 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.107 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.107 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.107 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.107 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.107 221324 DEBUG nova.virt.hardware [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.109 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:07:55 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/319915025' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.547 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:55.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.588 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.593 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:55 np0005603622 nova_compute[221301]: 2026-01-31 09:07:55.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:07:56 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1650928665' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.033 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.036 221324 DEBUG nova.virt.libvirt.vif [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1583068118',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1583068118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=211,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOzLIomxM/9LD6pUmebgfRrGKD5wh7paXInFcqd2L0Y2NS24kEHuU2xVFiRKZyesRVQNhc+hG031RV8i0F2/Jgjlcab+v9GRiy/RiVhAhjy1pjZJBvE/J+e5Z7tpOEKxw==',key_name='tempest-TestSecurityGroupsBasicOps-542781411',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-s0p9jxt0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:07:50Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=35d6cbea-df30-4560-8421-c97ad1fb0a36,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.036 221324 DEBUG nova.network.os_vif_util [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.038 221324 DEBUG nova.network.os_vif_util [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ad:cb,bridge_name='br-int',has_traffic_filtering=True,id=1ad58e77-d333-473c-bf1c-2936cfc4473f,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ad58e77-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.040 221324 DEBUG nova.objects.instance [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'pci_devices' on Instance uuid 35d6cbea-df30-4560-8421-c97ad1fb0a36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.068 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <uuid>35d6cbea-df30-4560-8421-c97ad1fb0a36</uuid>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <name>instance-000000d3</name>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1583068118</nova:name>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:07:55</nova:creationTime>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:user uuid="ebd43008d7a64b8bbf97a2304b1f78b6">tempest-TestSecurityGroupsBasicOps-1802479850-project-member</nova:user>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:project uuid="0c7930b92fc3471f87d9fe78ee56e71e">tempest-TestSecurityGroupsBasicOps-1802479850</nova:project>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <nova:port uuid="1ad58e77-d333-473c-bf1c-2936cfc4473f">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <entry name="serial">35d6cbea-df30-4560-8421-c97ad1fb0a36</entry>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <entry name="uuid">35d6cbea-df30-4560-8421-c97ad1fb0a36</entry>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/35d6cbea-df30-4560-8421-c97ad1fb0a36_disk">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/35d6cbea-df30-4560-8421-c97ad1fb0a36_disk.config">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:0b:ad:cb"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <target dev="tap1ad58e77-d3"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/console.log" append="off"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:07:56 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:07:56 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:07:56 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:07:56 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.068 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Preparing to wait for external event network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.068 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.068 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.069 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.069 221324 DEBUG nova.virt.libvirt.vif [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1583068118',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1583068118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=211,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOzLIomxM/9LD6pUmebgfRrGKD5wh7paXInFcqd2L0Y2NS24kEHuU2xVFiRKZyesRVQNhc+hG031RV8i0F2/Jgjlcab+v9GRiy/RiVhAhjy1pjZJBvE/J+e5Z7tpOEKxw==',key_name='tempest-TestSecurityGroupsBasicOps-542781411',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-s0p9jxt0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:07:50Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=35d6cbea-df30-4560-8421-c97ad1fb0a36,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.070 221324 DEBUG nova.network.os_vif_util [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.070 221324 DEBUG nova.network.os_vif_util [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ad:cb,bridge_name='br-int',has_traffic_filtering=True,id=1ad58e77-d333-473c-bf1c-2936cfc4473f,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ad58e77-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.071 221324 DEBUG os_vif [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ad:cb,bridge_name='br-int',has_traffic_filtering=True,id=1ad58e77-d333-473c-bf1c-2936cfc4473f,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ad58e77-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.072 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.072 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.078 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.078 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ad58e77-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.078 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ad58e77-d3, col_values=(('external_ids', {'iface-id': '1ad58e77-d333-473c-bf1c-2936cfc4473f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:ad:cb', 'vm-uuid': '35d6cbea-df30-4560-8421-c97ad1fb0a36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.080 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:56 np0005603622 NetworkManager[49080]: <info>  [1769850476.0831] manager: (tap1ad58e77-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/437)
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.083 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.090 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.091 221324 INFO os_vif [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ad:cb,bridge_name='br-int',has_traffic_filtering=True,id=1ad58e77-d333-473c-bf1c-2936cfc4473f,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ad58e77-d3')#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.153 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.154 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.155 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No VIF found with MAC fa:16:3e:0b:ad:cb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.155 221324 INFO nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Using config drive#033[00m
Jan 31 04:07:56 np0005603622 nova_compute[221301]: 2026-01-31 09:07:56.188 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:07:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:57.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.153 221324 INFO nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Creating config drive at /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/disk.config#033[00m
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.159 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxfu_h5bh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.283 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxfu_h5bh" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.323 221324 DEBUG nova.storage.rbd_utils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.328 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/disk.config 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.500 221324 DEBUG oslo_concurrency.processutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/disk.config 35d6cbea-df30-4560-8421-c97ad1fb0a36_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.501 221324 INFO nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Deleting local config drive /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36/disk.config because it was imported into RBD.#033[00m
Jan 31 04:07:57 np0005603622 kernel: tap1ad58e77-d3: entered promiscuous mode
Jan 31 04:07:57 np0005603622 NetworkManager[49080]: <info>  [1769850477.5441] manager: (tap1ad58e77-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/438)
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.546 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:57Z|00991|binding|INFO|Claiming lport 1ad58e77-d333-473c-bf1c-2936cfc4473f for this chassis.
Jan 31 04:07:57 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:57Z|00992|binding|INFO|1ad58e77-d333-473c-bf1c-2936cfc4473f: Claiming fa:16:3e:0b:ad:cb 10.100.0.8
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.553 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:57Z|00993|binding|INFO|Setting lport 1ad58e77-d333-473c-bf1c-2936cfc4473f ovn-installed in OVS
Jan 31 04:07:57 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:57Z|00994|binding|INFO|Setting lport 1ad58e77-d333-473c-bf1c-2936cfc4473f up in Southbound
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.555 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.555 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:ad:cb 10.100.0.8'], port_security=['fa:16:3e:0b:ad:cb 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '35d6cbea-df30-4560-8421-c97ad1fb0a36', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '72f918ed-3c23-4718-b95e-e97f44b1ff67 f00beb61-464a-48b1-b739-7a38012ca3e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ea5e5c3-41ce-42a3-ba60-7739a7d8baec, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=1ad58e77-d333-473c-bf1c-2936cfc4473f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.558 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 1ad58e77-d333-473c-bf1c-2936cfc4473f in datapath a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 bound to our chassis#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.559 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.567 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[13fd674c-d06e-45de-8f42-51bc3e52ea59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.568 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa43e5cb4-61 in ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.571 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa43e5cb4-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.572 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[38d266e5-3a04-4da2-8f5a-cbd72cd9100e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 systemd-udevd[304902]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.573 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a40b4b52-3c6f-4d94-9a89-e3a163bbbafd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:07:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:57.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.582 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[984b39d4-b29e-457f-9f90-216cb42b5b16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 systemd-machined[190406]: New machine qemu-103-instance-000000d3.
Jan 31 04:07:57 np0005603622 NetworkManager[49080]: <info>  [1769850477.5924] device (tap1ad58e77-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:07:57 np0005603622 NetworkManager[49080]: <info>  [1769850477.5931] device (tap1ad58e77-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.595 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[245aa9da-fc4f-4f61-bc4b-acbb351744b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 systemd[1]: Started Virtual Machine qemu-103-instance-000000d3.
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.631 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[161e6c69-93ea-419d-9c02-e73d0ded6ec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 systemd-udevd[304907]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.635 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e09e16e7-985a-4951-a8de-aeab6f08d589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 NetworkManager[49080]: <info>  [1769850477.6369] manager: (tapa43e5cb4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/439)
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.661 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[6f20bcc8-2855-430c-babe-a401d18b2b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.665 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2389c761-14ae-4812-bb78-fd34da3af68e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 NetworkManager[49080]: <info>  [1769850477.6846] device (tapa43e5cb4-60): carrier: link connected
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.689 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[a371dc97-a2c3-4d34-a48f-4fedbfc1cc32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.703 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[45918b2c-7f19-4e79-80a8-d4e44dd53c8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa43e5cb4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:22:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 298], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 972902, 'reachable_time': 17364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304935, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.718 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[75432b4b-a9bf-4ce8-9e6a-049de0182993]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:227e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 972902, 'tstamp': 972902}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304936, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.735 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[8285f074-dad7-4f87-a806-5d1ee9e72cf7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa43e5cb4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:22:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 298], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 972902, 'reachable_time': 17364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304937, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.762 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[06975987-b37f-4c1c-9e85-99cdf220b8d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.825 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9083547c-532c-403f-8ad4-3e625bd7c707]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.827 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa43e5cb4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.827 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.828 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa43e5cb4-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:57 np0005603622 kernel: tapa43e5cb4-60: entered promiscuous mode
Jan 31 04:07:57 np0005603622 NetworkManager[49080]: <info>  [1769850477.8314] manager: (tapa43e5cb4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/440)
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.830 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.837 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa43e5cb4-60, col_values=(('external_ids', {'iface-id': 'b8c8a7b7-27e3-462c-bcf8-67bad4f07d3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:07:57 np0005603622 ovn_controller[130109]: 2026-01-31T09:07:57Z|00995|binding|INFO|Releasing lport b8c8a7b7-27e3-462c-bcf8-67bad4f07d3a from this chassis (sb_readonly=0)
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.839 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.843 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:07:57 np0005603622 nova_compute[221301]: 2026-01-31 09:07:57.845 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.844 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc934ce-37f6-4725-8c8e-9fb3bef1806e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.847 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.pid.haproxy
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:07:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:07:57.848 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'env', 'PROCESS_TAG=haproxy-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.048 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850478.0478163, 35d6cbea-df30-4560-8421-c97ad1fb0a36 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.049 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] VM Started (Lifecycle Event)#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.095 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.104 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850478.0539217, 35d6cbea-df30-4560-8421-c97ad1fb0a36 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.104 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.131 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.135 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.160 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:07:58 np0005603622 podman[305010]: 2026-01-31 09:07:58.212559978 +0000 UTC m=+0.050844556 container create b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:07:58 np0005603622 systemd[1]: Started libpod-conmon-b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981.scope.
Jan 31 04:07:58 np0005603622 podman[305010]: 2026-01-31 09:07:58.183857068 +0000 UTC m=+0.022141676 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:07:58 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:07:58 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fa4c37d563ac5885b5120d1d1db94870f55b8c5b30dce41a4445dfac365eb31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:07:58 np0005603622 podman[305010]: 2026-01-31 09:07:58.310207341 +0000 UTC m=+0.148491939 container init b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Jan 31 04:07:58 np0005603622 podman[305010]: 2026-01-31 09:07:58.316210929 +0000 UTC m=+0.154495497 container start b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:07:58 np0005603622 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[305025]: [NOTICE]   (305029) : New worker (305031) forked
Jan 31 04:07:58 np0005603622 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[305025]: [NOTICE]   (305029) : Loading success.
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.346 221324 DEBUG nova.compute.manager [req-26030837-0e6e-4a6a-a8a0-8df55a9d1b00 req-a52c19e3-4d46-4dd8-988f-0f49114388d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.346 221324 DEBUG oslo_concurrency.lockutils [req-26030837-0e6e-4a6a-a8a0-8df55a9d1b00 req-a52c19e3-4d46-4dd8-988f-0f49114388d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.347 221324 DEBUG oslo_concurrency.lockutils [req-26030837-0e6e-4a6a-a8a0-8df55a9d1b00 req-a52c19e3-4d46-4dd8-988f-0f49114388d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.348 221324 DEBUG oslo_concurrency.lockutils [req-26030837-0e6e-4a6a-a8a0-8df55a9d1b00 req-a52c19e3-4d46-4dd8-988f-0f49114388d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.348 221324 DEBUG nova.compute.manager [req-26030837-0e6e-4a6a-a8a0-8df55a9d1b00 req-a52c19e3-4d46-4dd8-988f-0f49114388d2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Processing event network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.349 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.356 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850478.3556404, 35d6cbea-df30-4560-8421-c97ad1fb0a36 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.356 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.360 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.365 221324 INFO nova.virt.libvirt.driver [-] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Instance spawned successfully.#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.365 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.390 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.397 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.403 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.404 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.405 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.406 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.406 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.407 221324 DEBUG nova.virt.libvirt.driver [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.421 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.471 221324 INFO nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Took 8.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.472 221324 DEBUG nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.555 221324 INFO nova.compute.manager [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Took 9.51 seconds to build instance.#033[00m
Jan 31 04:07:58 np0005603622 nova_compute[221301]: 2026-01-31 09:07:58.574 221324 DEBUG oslo_concurrency.lockutils [None req-f5736e74-a70a-4885-91f6-f379f1f097da ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:07:59 np0005603622 nova_compute[221301]: 2026-01-31 09:07:59.013 221324 DEBUG nova.network.neutron [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updated VIF entry in instance network info cache for port 1ad58e77-d333-473c-bf1c-2936cfc4473f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:07:59 np0005603622 nova_compute[221301]: 2026-01-31 09:07:59.014 221324 DEBUG nova.network.neutron [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updating instance_info_cache with network_info: [{"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:07:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:07:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:07:59.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:07:59 np0005603622 nova_compute[221301]: 2026-01-31 09:07:59.059 221324 DEBUG oslo_concurrency.lockutils [req-c75e281f-3a37-4727-919b-8c0e554ec7f4 req-a9adf0a1-631b-440d-914a-c5fdc111fa2d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:07:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:07:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:07:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:07:59.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:07:59 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 04:08:00 np0005603622 nova_compute[221301]: 2026-01-31 09:08:00.480 221324 DEBUG nova.compute.manager [req-82476d2e-f90f-45ab-b27d-75e73b65113a req-de98d209-2770-4f72-b8dc-ff59d7cb09b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:00 np0005603622 nova_compute[221301]: 2026-01-31 09:08:00.481 221324 DEBUG oslo_concurrency.lockutils [req-82476d2e-f90f-45ab-b27d-75e73b65113a req-de98d209-2770-4f72-b8dc-ff59d7cb09b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:00 np0005603622 nova_compute[221301]: 2026-01-31 09:08:00.481 221324 DEBUG oslo_concurrency.lockutils [req-82476d2e-f90f-45ab-b27d-75e73b65113a req-de98d209-2770-4f72-b8dc-ff59d7cb09b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:00 np0005603622 nova_compute[221301]: 2026-01-31 09:08:00.482 221324 DEBUG oslo_concurrency.lockutils [req-82476d2e-f90f-45ab-b27d-75e73b65113a req-de98d209-2770-4f72-b8dc-ff59d7cb09b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:00 np0005603622 nova_compute[221301]: 2026-01-31 09:08:00.482 221324 DEBUG nova.compute.manager [req-82476d2e-f90f-45ab-b27d-75e73b65113a req-de98d209-2770-4f72-b8dc-ff59d7cb09b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] No waiting events found dispatching network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:00 np0005603622 nova_compute[221301]: 2026-01-31 09:08:00.482 221324 WARNING nova.compute.manager [req-82476d2e-f90f-45ab-b27d-75e73b65113a req-de98d209-2770-4f72-b8dc-ff59d7cb09b3 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received unexpected event network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f for instance with vm_state active and task_state None.#033[00m
Jan 31 04:08:00 np0005603622 nova_compute[221301]: 2026-01-31 09:08:00.612 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 04:08:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:01.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 04:08:01 np0005603622 nova_compute[221301]: 2026-01-31 09:08:01.080 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:01.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:03.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:03.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:03 np0005603622 nova_compute[221301]: 2026-01-31 09:08:03.723 221324 DEBUG nova.compute.manager [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-changed-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:03 np0005603622 nova_compute[221301]: 2026-01-31 09:08:03.724 221324 DEBUG nova.compute.manager [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Refreshing instance network info cache due to event network-changed-1ad58e77-d333-473c-bf1c-2936cfc4473f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:08:03 np0005603622 nova_compute[221301]: 2026-01-31 09:08:03.724 221324 DEBUG oslo_concurrency.lockutils [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:03 np0005603622 nova_compute[221301]: 2026-01-31 09:08:03.724 221324 DEBUG oslo_concurrency.lockutils [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:03 np0005603622 nova_compute[221301]: 2026-01-31 09:08:03.725 221324 DEBUG nova.network.neutron [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Refreshing network info cache for port 1ad58e77-d333-473c-bf1c-2936cfc4473f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:08:04 np0005603622 podman[305041]: 2026-01-31 09:08:04.172543233 +0000 UTC m=+0.092686103 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 04:08:04 np0005603622 podman[305040]: 2026-01-31 09:08:04.182085455 +0000 UTC m=+0.104598558 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:08:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:05.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:05.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:05 np0005603622 nova_compute[221301]: 2026-01-31 09:08:05.616 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:06 np0005603622 nova_compute[221301]: 2026-01-31 09:08:06.082 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:06 np0005603622 nova_compute[221301]: 2026-01-31 09:08:06.323 221324 DEBUG nova.network.neutron [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updated VIF entry in instance network info cache for port 1ad58e77-d333-473c-bf1c-2936cfc4473f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:08:06 np0005603622 nova_compute[221301]: 2026-01-31 09:08:06.324 221324 DEBUG nova.network.neutron [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updating instance_info_cache with network_info: [{"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:06 np0005603622 nova_compute[221301]: 2026-01-31 09:08:06.349 221324 DEBUG oslo_concurrency.lockutils [req-9942a3b5-7de0-473e-942a-4883842c0c2a req-169b0517-3fff-4eba-af28-28e0cf7b1946 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:07.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:07.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:09.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:09.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:10 np0005603622 nova_compute[221301]: 2026-01-31 09:08:10.618 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:11.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:11 np0005603622 nova_compute[221301]: 2026-01-31 09:08:11.084 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:11.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:11 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:11Z|00132|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:ad:cb 10.100.0.8
Jan 31 04:08:11 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:11Z|00133|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:ad:cb 10.100.0.8
Jan 31 04:08:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:13.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:13.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:15.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:15.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:15 np0005603622 nova_compute[221301]: 2026-01-31 09:08:15.620 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:16 np0005603622 nova_compute[221301]: 2026-01-31 09:08:16.087 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:17.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:17.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:18 np0005603622 nova_compute[221301]: 2026-01-31 09:08:18.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:18 np0005603622 nova_compute[221301]: 2026-01-31 09:08:18.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:08:18 np0005603622 nova_compute[221301]: 2026-01-31 09:08:18.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:08:18 np0005603622 nova_compute[221301]: 2026-01-31 09:08:18.941 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:18 np0005603622 nova_compute[221301]: 2026-01-31 09:08:18.942 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:18 np0005603622 nova_compute[221301]: 2026-01-31 09:08:18.942 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:08:18 np0005603622 nova_compute[221301]: 2026-01-31 09:08:18.942 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 84bda6d6-4f33-40a6-9f4c-478d073878e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:08:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:19.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:19.181 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:08:19 np0005603622 nova_compute[221301]: 2026-01-31 09:08:19.182 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:19 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:19.183 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:08:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:19.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:20 np0005603622 nova_compute[221301]: 2026-01-31 09:08:20.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:20 np0005603622 nova_compute[221301]: 2026-01-31 09:08:20.981 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating instance_info_cache with network_info: [{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:21 np0005603622 nova_compute[221301]: 2026-01-31 09:08:21.003 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:21 np0005603622 nova_compute[221301]: 2026-01-31 09:08:21.003 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:08:21 np0005603622 nova_compute[221301]: 2026-01-31 09:08:21.004 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:21 np0005603622 nova_compute[221301]: 2026-01-31 09:08:21.004 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:21 np0005603622 nova_compute[221301]: 2026-01-31 09:08:21.005 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:08:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:21.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:21 np0005603622 nova_compute[221301]: 2026-01-31 09:08:21.088 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:21 np0005603622 nova_compute[221301]: 2026-01-31 09:08:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:21.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:22 np0005603622 nova_compute[221301]: 2026-01-31 09:08:22.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:22 np0005603622 nova_compute[221301]: 2026-01-31 09:08:22.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:23.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:23.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:24 np0005603622 nova_compute[221301]: 2026-01-31 09:08:24.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:24 np0005603622 nova_compute[221301]: 2026-01-31 09:08:24.430 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:24 np0005603622 nova_compute[221301]: 2026-01-31 09:08:24.430 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:24 np0005603622 nova_compute[221301]: 2026-01-31 09:08:24.431 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:24 np0005603622 nova_compute[221301]: 2026-01-31 09:08:24.431 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:08:24 np0005603622 nova_compute[221301]: 2026-01-31 09:08:24.432 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1948681057' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:24 np0005603622 nova_compute[221301]: 2026-01-31 09:08:24.897 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.004 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.005 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.010 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.011 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:08:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:25.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.236 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.238 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3927MB free_disk=20.851383209228516GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.238 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.239 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.325 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 84bda6d6-4f33-40a6-9f4c-478d073878e1 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.326 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 35d6cbea-df30-4560-8421-c97ad1fb0a36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.327 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.327 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.403 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:25.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:25 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/369341869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.846 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.853 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.878 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.903 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:08:25 np0005603622 nova_compute[221301]: 2026-01-31 09:08:25.904 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:26 np0005603622 nova_compute[221301]: 2026-01-31 09:08:26.090 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:26.185 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:27.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:27.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:29.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:29.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:30 np0005603622 nova_compute[221301]: 2026-01-31 09:08:30.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:30 np0005603622 nova_compute[221301]: 2026-01-31 09:08:30.904 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:30 np0005603622 nova_compute[221301]: 2026-01-31 09:08:30.905 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:08:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:30.926 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:30.927 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:30.927 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:31.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:31 np0005603622 nova_compute[221301]: 2026-01-31 09:08:31.093 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:08:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:08:31 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:08:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:31.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:33.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:33.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:35.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:35 np0005603622 podman[305257]: 2026-01-31 09:08:35.174557416 +0000 UTC m=+0.085988605 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:08:35 np0005603622 podman[305256]: 2026-01-31 09:08:35.208177156 +0000 UTC m=+0.119242906 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Jan 31 04:08:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:35.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:35 np0005603622 nova_compute[221301]: 2026-01-31 09:08:35.632 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:36 np0005603622 nova_compute[221301]: 2026-01-31 09:08:36.095 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.004000106s ======
Jan 31 04:08:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:37.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000106s
Jan 31 04:08:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:08:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:37.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:38 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:08:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:39.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:39.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:40 np0005603622 nova_compute[221301]: 2026-01-31 09:08:40.635 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:40Z|00996|binding|INFO|Releasing lport 74bde109-0188-4ce3-87c3-02a3eb853dc2 from this chassis (sb_readonly=0)
Jan 31 04:08:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:40Z|00997|binding|INFO|Releasing lport b8c8a7b7-27e3-462c-bcf8-67bad4f07d3a from this chassis (sb_readonly=0)
Jan 31 04:08:40 np0005603622 nova_compute[221301]: 2026-01-31 09:08:40.910 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:41.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:41 np0005603622 nova_compute[221301]: 2026-01-31 09:08:41.097 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:41.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:43.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:43.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e412 e412: 3 total, 3 up, 3 in
Jan 31 04:08:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:45.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:45 np0005603622 nova_compute[221301]: 2026-01-31 09:08:45.638 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:45.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:46 np0005603622 nova_compute[221301]: 2026-01-31 09:08:46.100 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:46 np0005603622 nova_compute[221301]: 2026-01-31 09:08:46.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:46 np0005603622 nova_compute[221301]: 2026-01-31 09:08:46.889 221324 DEBUG nova.compute.manager [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-changed-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:46 np0005603622 nova_compute[221301]: 2026-01-31 09:08:46.890 221324 DEBUG nova.compute.manager [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Refreshing instance network info cache due to event network-changed-8c752671-f32b-4648-8978-c2ba2b140e68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:08:46 np0005603622 nova_compute[221301]: 2026-01-31 09:08:46.890 221324 DEBUG oslo_concurrency.lockutils [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:08:46 np0005603622 nova_compute[221301]: 2026-01-31 09:08:46.891 221324 DEBUG oslo_concurrency.lockutils [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:08:46 np0005603622 nova_compute[221301]: 2026-01-31 09:08:46.891 221324 DEBUG nova.network.neutron [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Refreshing network info cache for port 8c752671-f32b-4648-8978-c2ba2b140e68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:08:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:47.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e412 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.133 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.134 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.134 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.135 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.135 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.136 221324 INFO nova.compute.manager [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Terminating instance#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.138 221324 DEBUG nova.compute.manager [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:08:47 np0005603622 kernel: tap8c752671-f3 (unregistering): left promiscuous mode
Jan 31 04:08:47 np0005603622 NetworkManager[49080]: <info>  [1769850527.2494] device (tap8c752671-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.264 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:47 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:47Z|00998|binding|INFO|Releasing lport 8c752671-f32b-4648-8978-c2ba2b140e68 from this chassis (sb_readonly=0)
Jan 31 04:08:47 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:47Z|00999|binding|INFO|Setting lport 8c752671-f32b-4648-8978-c2ba2b140e68 down in Southbound
Jan 31 04:08:47 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:47Z|01000|binding|INFO|Removing iface tap8c752671-f3 ovn-installed in OVS
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.272 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:47 np0005603622 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000d0.scope: Deactivated successfully.
Jan 31 04:08:47 np0005603622 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000d0.scope: Consumed 16.562s CPU time.
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.300 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:4d:e0 10.100.0.4'], port_security=['fa:16:3e:cb:4d:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '84bda6d6-4f33-40a6-9f4c-478d073878e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-650eb345-8346-4e8f-8e83-eeb0117654f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76ce367a834b49dfb5b436848118b860', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cea15428-ed6f-44a7-98e5-24c0fab7b796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ecdc171-9d09-4cba-9bb9-cd2f8ef8e6c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=8c752671-f32b-4648-8978-c2ba2b140e68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:08:47 np0005603622 systemd-machined[190406]: Machine qemu-102-instance-000000d0 terminated.
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.302 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 8c752671-f32b-4648-8978-c2ba2b140e68 in datapath 650eb345-8346-4e8f-8e83-eeb0117654f6 unbound from our chassis#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.304 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 650eb345-8346-4e8f-8e83-eeb0117654f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.306 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3465a20f-894c-4d90-a8f3-835886e1069e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.307 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 namespace which is not needed anymore#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.374 221324 INFO nova.virt.libvirt.driver [-] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Instance destroyed successfully.#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.375 221324 DEBUG nova.objects.instance [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lazy-loading 'resources' on Instance uuid 84bda6d6-4f33-40a6-9f4c-478d073878e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.401 221324 DEBUG nova.virt.libvirt.vif [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1036819813',display_name='tempest-TestVolumeBootPattern-server-1036819813',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1036819813',id=208,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNWvTxk1zh2OCmPH3tumEbxR7y880uhj4vJDAspX9r3EATf0w5oe5DG3NVBcNRbWTPcgVwlnXcyaRQZseLc7edDTe4kwfjogsRoplvkAsMWW9sCSaJlX0XBkMxl/Ghv8Fw==',key_name='tempest-TestVolumeBootPattern-11482540',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:07:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76ce367a834b49dfb5b436848118b860',ramdisk_id='',reservation_id='r-f7cqfwfi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1392945362',owner_user_name='tempest-TestVolumeBootPattern-1392945362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:07:14Z,user_data=None,user_id='dc42b92a5dd34d32b6b184bdc7acb092',uuid=84bda6d6-4f33-40a6-9f4c-478d073878e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.403 221324 DEBUG nova.network.os_vif_util [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converting VIF {"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.405 221324 DEBUG nova.network.os_vif_util [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:4d:e0,bridge_name='br-int',has_traffic_filtering=True,id=8c752671-f32b-4648-8978-c2ba2b140e68,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c752671-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.405 221324 DEBUG os_vif [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:4d:e0,bridge_name='br-int',has_traffic_filtering=True,id=8c752671-f32b-4648-8978-c2ba2b140e68,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c752671-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.409 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.409 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c752671-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.411 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.414 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.419 221324 INFO os_vif [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:4d:e0,bridge_name='br-int',has_traffic_filtering=True,id=8c752671-f32b-4648-8978-c2ba2b140e68,network=Network(650eb345-8346-4e8f-8e83-eeb0117654f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c752671-f3')#033[00m
Jan 31 04:08:47 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [NOTICE]   (304295) : haproxy version is 2.8.14-c23fe91
Jan 31 04:08:47 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [NOTICE]   (304295) : path to executable is /usr/sbin/haproxy
Jan 31 04:08:47 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [WARNING]  (304295) : Exiting Master process...
Jan 31 04:08:47 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [WARNING]  (304295) : Exiting Master process...
Jan 31 04:08:47 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [ALERT]    (304295) : Current worker (304297) exited with code 143 (Terminated)
Jan 31 04:08:47 np0005603622 neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6[304291]: [WARNING]  (304295) : All workers exited. Exiting... (0)
Jan 31 04:08:47 np0005603622 systemd[1]: libpod-f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5.scope: Deactivated successfully.
Jan 31 04:08:47 np0005603622 podman[305386]: 2026-01-31 09:08:47.475058731 +0000 UTC m=+0.056735541 container died f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:08:47 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5-userdata-shm.mount: Deactivated successfully.
Jan 31 04:08:47 np0005603622 systemd[1]: var-lib-containers-storage-overlay-c76444c68add129a9728d8b2bb824def762e9efd1a5e998826d92e207f973aff-merged.mount: Deactivated successfully.
Jan 31 04:08:47 np0005603622 podman[305386]: 2026-01-31 09:08:47.515000378 +0000 UTC m=+0.096677198 container cleanup f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:08:47 np0005603622 systemd[1]: libpod-conmon-f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5.scope: Deactivated successfully.
Jan 31 04:08:47 np0005603622 podman[305432]: 2026-01-31 09:08:47.575256802 +0000 UTC m=+0.044172260 container remove f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.579 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c28eeb-c654-45c1-a1d8-4c78a125a144]: (4, ('Sat Jan 31 09:08:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5)\nf63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5\nSat Jan 31 09:08:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 (f63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5)\nf63ba0dda80af8d22de26f435e3a26d8ab74739ac39031f155b1507c11cec6c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.581 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0d44822f-26b9-4a16-af42-9365b022c04d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.582 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650eb345-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.583 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:47 np0005603622 kernel: tap650eb345-80: left promiscuous mode
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.589 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.592 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[23e4ae0f-b4c7-4392-945d-9f8048c88a9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.607 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[15e42eaf-5faa-4f85-b817-3fe4a6ce0ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.609 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd70a8e-2c7f-4a24-9471-459b3ae727f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.623 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[60185a3d-f4c4-46b0-a698-7cf86babdbf6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 968478, 'reachable_time': 35348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305450, 'error': None, 'target': 'ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 systemd[1]: run-netns-ovnmeta\x2d650eb345\x2d8346\x2d4e8f\x2d8e83\x2deeb0117654f6.mount: Deactivated successfully.
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.626 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-650eb345-8346-4e8f-8e83-eeb0117654f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:08:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:08:47.627 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6d6ad5-c17a-46dd-8b0f-992754a2496b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:08:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:47.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.666 221324 INFO nova.virt.libvirt.driver [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Deleting instance files /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1_del#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.668 221324 INFO nova.virt.libvirt.driver [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Deletion of /var/lib/nova/instances/84bda6d6-4f33-40a6-9f4c-478d073878e1_del complete#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.736 221324 INFO nova.compute.manager [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.736 221324 DEBUG oslo.service.loopingcall [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.736 221324 DEBUG nova.compute.manager [-] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:08:47 np0005603622 nova_compute[221301]: 2026-01-31 09:08:47.737 221324 DEBUG nova.network.neutron [-] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:08:48 np0005603622 nova_compute[221301]: 2026-01-31 09:08:48.678 221324 DEBUG nova.network.neutron [-] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:48 np0005603622 nova_compute[221301]: 2026-01-31 09:08:48.699 221324 INFO nova.compute.manager [-] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Took 0.96 seconds to deallocate network for instance.#033[00m
Jan 31 04:08:48 np0005603622 nova_compute[221301]: 2026-01-31 09:08:48.819 221324 DEBUG nova.compute.manager [req-d9626775-32b9-4316-b8ac-36a2477f0727 req-eb338cca-d9a9-48b4-9fff-24b94e54d756 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-vif-deleted-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.038 221324 DEBUG nova.compute.manager [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-vif-unplugged-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.038 221324 DEBUG oslo_concurrency.lockutils [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.038 221324 DEBUG oslo_concurrency.lockutils [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.039 221324 DEBUG oslo_concurrency.lockutils [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.039 221324 DEBUG nova.compute.manager [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] No waiting events found dispatching network-vif-unplugged-8c752671-f32b-4648-8978-c2ba2b140e68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.040 221324 DEBUG nova.compute.manager [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-vif-unplugged-8c752671-f32b-4648-8978-c2ba2b140e68 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.040 221324 DEBUG nova.compute.manager [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received event network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.040 221324 DEBUG oslo_concurrency.lockutils [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.041 221324 DEBUG oslo_concurrency.lockutils [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.041 221324 DEBUG oslo_concurrency.lockutils [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.041 221324 DEBUG nova.compute.manager [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] No waiting events found dispatching network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.042 221324 WARNING nova.compute.manager [req-c3728a2e-e4fe-465d-9fd1-34f4ed8736a2 req-c374159f-2e17-4b23-bc87-c3c8688e47cc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Received unexpected event network-vif-plugged-8c752671-f32b-4648-8978-c2ba2b140e68 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.048 221324 INFO nova.compute.manager [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Took 0.35 seconds to detach 1 volumes for instance.#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.066 221324 DEBUG nova.network.neutron [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updated VIF entry in instance network info cache for port 8c752671-f32b-4648-8978-c2ba2b140e68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.066 221324 DEBUG nova.network.neutron [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Updating instance_info_cache with network_info: [{"id": "8c752671-f32b-4648-8978-c2ba2b140e68", "address": "fa:16:3e:cb:4d:e0", "network": {"id": "650eb345-8346-4e8f-8e83-eeb0117654f6", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1550438709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76ce367a834b49dfb5b436848118b860", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c752671-f3", "ovs_interfaceid": "8c752671-f32b-4648-8978-c2ba2b140e68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:08:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:49.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.107 221324 DEBUG oslo_concurrency.lockutils [req-51dfb3ff-f908-4e3b-a178-7fff1ac175a0 req-90a7c421-7018-4b28-8598-4aa3c329a0e4 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-84bda6d6-4f33-40a6-9f4c-478d073878e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.112 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.113 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.194 221324 DEBUG oslo_concurrency.processutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:08:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:49.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:08:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3638441440' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.675 221324 DEBUG oslo_concurrency.processutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.680 221324 DEBUG nova.compute.provider_tree [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.724 221324 DEBUG nova.scheduler.client.report [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.764 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.806 221324 INFO nova.scheduler.client.report [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Deleted allocations for instance 84bda6d6-4f33-40a6-9f4c-478d073878e1#033[00m
Jan 31 04:08:49 np0005603622 nova_compute[221301]: 2026-01-31 09:08:49.927 221324 DEBUG oslo_concurrency.lockutils [None req-ee2938e8-aa70-4c39-9f56-9c92a2427b93 dc42b92a5dd34d32b6b184bdc7acb092 76ce367a834b49dfb5b436848118b860 - - default default] Lock "84bda6d6-4f33-40a6-9f4c-478d073878e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:08:50 np0005603622 nova_compute[221301]: 2026-01-31 09:08:50.641 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:51.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:51.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:51 np0005603622 nova_compute[221301]: 2026-01-31 09:08:51.675 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e412 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:52 np0005603622 nova_compute[221301]: 2026-01-31 09:08:52.412 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 e413: 3 total, 3 up, 3 in
Jan 31 04:08:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:08:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/500623582' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:08:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:08:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/500623582' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:08:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:53.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:53.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:55.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:08:55 np0005603622 nova_compute[221301]: 2026-01-31 09:08:55.644 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:55.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:08:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:57.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:08:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:08:57 np0005603622 nova_compute[221301]: 2026-01-31 09:08:57.416 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:57.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:08:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:08:59.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:08:59 np0005603622 ovn_controller[130109]: 2026-01-31T09:08:59Z|01001|binding|INFO|Releasing lport b8c8a7b7-27e3-462c-bcf8-67bad4f07d3a from this chassis (sb_readonly=0)
Jan 31 04:08:59 np0005603622 nova_compute[221301]: 2026-01-31 09:08:59.493 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:08:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:08:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:08:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:08:59.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:00 np0005603622 nova_compute[221301]: 2026-01-31 09:09:00.647 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:01.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.402 221324 DEBUG nova.compute.manager [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-changed-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.402 221324 DEBUG nova.compute.manager [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Refreshing instance network info cache due to event network-changed-1ad58e77-d333-473c-bf1c-2936cfc4473f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.403 221324 DEBUG oslo_concurrency.lockutils [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.403 221324 DEBUG oslo_concurrency.lockutils [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.403 221324 DEBUG nova.network.neutron [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Refreshing network info cache for port 1ad58e77-d333-473c-bf1c-2936cfc4473f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.437 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.437 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.438 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.438 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.438 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.439 221324 INFO nova.compute.manager [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Terminating instance#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.440 221324 DEBUG nova.compute.manager [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:09:01 np0005603622 kernel: tap1ad58e77-d3 (unregistering): left promiscuous mode
Jan 31 04:09:01 np0005603622 NetworkManager[49080]: <info>  [1769850541.6068] device (tap1ad58e77-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.611 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:01 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:01Z|01002|binding|INFO|Releasing lport 1ad58e77-d333-473c-bf1c-2936cfc4473f from this chassis (sb_readonly=0)
Jan 31 04:09:01 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:01Z|01003|binding|INFO|Setting lport 1ad58e77-d333-473c-bf1c-2936cfc4473f down in Southbound
Jan 31 04:09:01 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:01Z|01004|binding|INFO|Removing iface tap1ad58e77-d3 ovn-installed in OVS
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.613 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:01.622 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:ad:cb 10.100.0.8'], port_security=['fa:16:3e:0b:ad:cb 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '35d6cbea-df30-4560-8421-c97ad1fb0a36', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '72f918ed-3c23-4718-b95e-e97f44b1ff67 f00beb61-464a-48b1-b739-7a38012ca3e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ea5e5c3-41ce-42a3-ba60-7739a7d8baec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=1ad58e77-d333-473c-bf1c-2936cfc4473f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:09:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:01.624 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 1ad58e77-d333-473c-bf1c-2936cfc4473f in datapath a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 unbound from our chassis#033[00m
Jan 31 04:09:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:01.626 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:09:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:01.628 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[263863a1-6d1b-4cf3-9d16-633512fc8aa9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.628 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:01.629 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 namespace which is not needed anymore#033[00m
Jan 31 04:09:01 np0005603622 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000d3.scope: Deactivated successfully.
Jan 31 04:09:01 np0005603622 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000d3.scope: Consumed 16.257s CPU time.
Jan 31 04:09:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:01.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:01 np0005603622 systemd-machined[190406]: Machine qemu-103-instance-000000d3 terminated.
Jan 31 04:09:01 np0005603622 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[305025]: [NOTICE]   (305029) : haproxy version is 2.8.14-c23fe91
Jan 31 04:09:01 np0005603622 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[305025]: [NOTICE]   (305029) : path to executable is /usr/sbin/haproxy
Jan 31 04:09:01 np0005603622 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[305025]: [WARNING]  (305029) : Exiting Master process...
Jan 31 04:09:01 np0005603622 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[305025]: [ALERT]    (305029) : Current worker (305031) exited with code 143 (Terminated)
Jan 31 04:09:01 np0005603622 neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98[305025]: [WARNING]  (305029) : All workers exited. Exiting... (0)
Jan 31 04:09:01 np0005603622 systemd[1]: libpod-b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981.scope: Deactivated successfully.
Jan 31 04:09:01 np0005603622 podman[305500]: 2026-01-31 09:09:01.818862376 +0000 UTC m=+0.059833094 container died b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:09:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981-userdata-shm.mount: Deactivated successfully.
Jan 31 04:09:01 np0005603622 systemd[1]: var-lib-containers-storage-overlay-8fa4c37d563ac5885b5120d1d1db94870f55b8c5b30dce41a4445dfac365eb31-merged.mount: Deactivated successfully.
Jan 31 04:09:01 np0005603622 podman[305500]: 2026-01-31 09:09:01.858794922 +0000 UTC m=+0.099765680 container cleanup b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:09:01 np0005603622 systemd[1]: libpod-conmon-b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981.scope: Deactivated successfully.
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.876 221324 INFO nova.virt.libvirt.driver [-] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Instance destroyed successfully.#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.877 221324 DEBUG nova.objects.instance [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'resources' on Instance uuid 35d6cbea-df30-4560-8421-c97ad1fb0a36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.916 221324 DEBUG nova.virt.libvirt.vif [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:07:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1583068118',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1583068118',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=211,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOzLIomxM/9LD6pUmebgfRrGKD5wh7paXInFcqd2L0Y2NS24kEHuU2xVFiRKZyesRVQNhc+hG031RV8i0F2/Jgjlcab+v9GRiy/RiVhAhjy1pjZJBvE/J+e5Z7tpOEKxw==',key_name='tempest-TestSecurityGroupsBasicOps-542781411',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:07:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-s0p9jxt0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:07:58Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=35d6cbea-df30-4560-8421-c97ad1fb0a36,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.918 221324 DEBUG nova.network.os_vif_util [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.920 221324 DEBUG nova.network.os_vif_util [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:ad:cb,bridge_name='br-int',has_traffic_filtering=True,id=1ad58e77-d333-473c-bf1c-2936cfc4473f,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ad58e77-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.920 221324 DEBUG os_vif [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:ad:cb,bridge_name='br-int',has_traffic_filtering=True,id=1ad58e77-d333-473c-bf1c-2936cfc4473f,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ad58e77-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.922 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.922 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ad58e77-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.924 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.927 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:01 np0005603622 nova_compute[221301]: 2026-01-31 09:09:01.929 221324 INFO os_vif [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:ad:cb,bridge_name='br-int',has_traffic_filtering=True,id=1ad58e77-d333-473c-bf1c-2936cfc4473f,network=Network(a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ad58e77-d3')#033[00m
Jan 31 04:09:01 np0005603622 podman[305539]: 2026-01-31 09:09:01.993434254 +0000 UTC m=+0.105338468 container remove b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.000 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[aa475c35-67e6-49da-a377-29c09840505f]: (4, ('Sat Jan 31 09:09:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 (b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981)\nb60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981\nSat Jan 31 09:09:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 (b60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981)\nb60268a14e0721675404352868ae9b69000e9ab75c98fe09dbe26897a193d981\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.002 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[36f90b2d-f0ea-4afd-81af-7694feb61177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.003 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa43e5cb4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:02 np0005603622 kernel: tapa43e5cb4-60: left promiscuous mode
Jan 31 04:09:02 np0005603622 nova_compute[221301]: 2026-01-31 09:09:02.005 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:02 np0005603622 nova_compute[221301]: 2026-01-31 09:09:02.015 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.018 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d6c909-4009-4125-8af6-25c060379416]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.034 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0846cf1b-7c66-4cc5-b55d-e4f3fac61be9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.036 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d28e9b27-d559-49c6-9392-b9944486173a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.050 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3150c9-70f4-45b2-8901-17d15dcaa2ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 972896, 'reachable_time': 17878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305574, 'error': None, 'target': 'ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.052 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:09:02 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:02.052 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[d053ff55-8ebd-4b0a-be4b-b2b9623e2c58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:02 np0005603622 systemd[1]: run-netns-ovnmeta\x2da43e5cb4\x2d640f\x2d4ef5\x2d9c18\x2d9b6b0fe32a98.mount: Deactivated successfully.
Jan 31 04:09:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:02 np0005603622 nova_compute[221301]: 2026-01-31 09:09:02.372 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850527.3714297, 84bda6d6-4f33-40a6-9f4c-478d073878e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:02 np0005603622 nova_compute[221301]: 2026-01-31 09:09:02.373 221324 INFO nova.compute.manager [-] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:09:02 np0005603622 nova_compute[221301]: 2026-01-31 09:09:02.395 221324 DEBUG nova.compute.manager [None req-9945cedc-89af-465a-a029-9427afa961ef - - - - - -] [instance: 84bda6d6-4f33-40a6-9f4c-478d073878e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:02 np0005603622 nova_compute[221301]: 2026-01-31 09:09:02.932 221324 INFO nova.virt.libvirt.driver [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Deleting instance files /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36_del#033[00m
Jan 31 04:09:02 np0005603622 nova_compute[221301]: 2026-01-31 09:09:02.934 221324 INFO nova.virt.libvirt.driver [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Deletion of /var/lib/nova/instances/35d6cbea-df30-4560-8421-c97ad1fb0a36_del complete#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.026 221324 INFO nova.compute.manager [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Took 1.59 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.027 221324 DEBUG oslo.service.loopingcall [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.027 221324 DEBUG nova.compute.manager [-] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.027 221324 DEBUG nova.network.neutron [-] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:09:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:03.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.547 221324 DEBUG nova.compute.manager [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-vif-unplugged-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.548 221324 DEBUG oslo_concurrency.lockutils [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.548 221324 DEBUG oslo_concurrency.lockutils [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.549 221324 DEBUG oslo_concurrency.lockutils [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.549 221324 DEBUG nova.compute.manager [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] No waiting events found dispatching network-vif-unplugged-1ad58e77-d333-473c-bf1c-2936cfc4473f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.549 221324 DEBUG nova.compute.manager [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-vif-unplugged-1ad58e77-d333-473c-bf1c-2936cfc4473f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.549 221324 DEBUG nova.compute.manager [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.550 221324 DEBUG oslo_concurrency.lockutils [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.550 221324 DEBUG oslo_concurrency.lockutils [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.550 221324 DEBUG oslo_concurrency.lockutils [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.551 221324 DEBUG nova.compute.manager [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] No waiting events found dispatching network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:09:03 np0005603622 nova_compute[221301]: 2026-01-31 09:09:03.551 221324 WARNING nova.compute.manager [req-67f0a43d-850b-4ced-a824-498e7c497c1e req-8afc9648-0f70-48e6-a2dc-44d1938bbe54 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received unexpected event network-vif-plugged-1ad58e77-d333-473c-bf1c-2936cfc4473f for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:09:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:03.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:04 np0005603622 nova_compute[221301]: 2026-01-31 09:09:04.031 221324 DEBUG nova.network.neutron [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updated VIF entry in instance network info cache for port 1ad58e77-d333-473c-bf1c-2936cfc4473f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:09:04 np0005603622 nova_compute[221301]: 2026-01-31 09:09:04.031 221324 DEBUG nova.network.neutron [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updating instance_info_cache with network_info: [{"id": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "address": "fa:16:3e:0b:ad:cb", "network": {"id": "a43e5cb4-640f-4ef5-9c18-9b6b0fe32a98", "bridge": "br-int", "label": "tempest-network-smoke--845145870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ad58e77-d3", "ovs_interfaceid": "1ad58e77-d333-473c-bf1c-2936cfc4473f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:04 np0005603622 nova_compute[221301]: 2026-01-31 09:09:04.065 221324 DEBUG oslo_concurrency.lockutils [req-aaeffd26-e972-44b3-b987-538d6966cfb6 req-b0d58155-31e6-4479-866c-1d2d44eaa3cf fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-35d6cbea-df30-4560-8421-c97ad1fb0a36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.037 221324 DEBUG nova.network.neutron [-] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.060 221324 INFO nova.compute.manager [-] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Took 2.03 seconds to deallocate network for instance.#033[00m
Jan 31 04:09:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:05.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.128 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.129 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.134 221324 DEBUG nova.compute.manager [req-8fb2f01e-0616-48a7-88d1-59c1f2d47fd1 req-d9d7e49e-3ce5-4df6-9309-3c45c2abef7b fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Received event network-vif-deleted-1ad58e77-d333-473c-bf1c-2936cfc4473f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.220 221324 DEBUG oslo_concurrency.processutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #178. Immutable memtables: 0.
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.494387) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 113] Flushing memtable with next log file: 178
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545494483, "job": 113, "event": "flush_started", "num_memtables": 1, "num_entries": 1822, "num_deletes": 252, "total_data_size": 4121539, "memory_usage": 4170376, "flush_reason": "Manual Compaction"}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 113] Level-0 flush table #179: started
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545511607, "cf_name": "default", "job": 113, "event": "table_file_creation", "file_number": 179, "file_size": 2706276, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 85805, "largest_seqno": 87622, "table_properties": {"data_size": 2698889, "index_size": 4329, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15990, "raw_average_key_size": 20, "raw_value_size": 2683866, "raw_average_value_size": 3405, "num_data_blocks": 190, "num_entries": 788, "num_filter_entries": 788, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850393, "oldest_key_time": 1769850393, "file_creation_time": 1769850545, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 179, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 113] Flush lasted 17278 microseconds, and 9000 cpu microseconds.
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.511672) [db/flush_job.cc:967] [default] [JOB 113] Level-0 flush table #179: 2706276 bytes OK
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.511697) [db/memtable_list.cc:519] [default] Level-0 commit table #179 started
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.514084) [db/memtable_list.cc:722] [default] Level-0 commit table #179: memtable #1 done
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.514106) EVENT_LOG_v1 {"time_micros": 1769850545514099, "job": 113, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.514133) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 113] Try to delete WAL files size 4113381, prev total WAL file size 4113381, number of live WAL files 2.
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000175.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.515274) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037353330' seq:72057594037927935, type:22 .. '7061786F730037373832' seq:0, type:0; will stop at (end)
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 114] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 113 Base level 0, inputs: [179(2642KB)], [177(12MB)]
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545515353, "job": 114, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [179], "files_L6": [177], "score": -1, "input_data_size": 15300495, "oldest_snapshot_seqno": -1}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 114] Generated table #180: 10678 keys, 13445863 bytes, temperature: kUnknown
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545599130, "cf_name": "default", "job": 114, "event": "table_file_creation", "file_number": 180, "file_size": 13445863, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13377280, "index_size": 40758, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26757, "raw_key_size": 282212, "raw_average_key_size": 26, "raw_value_size": 13191275, "raw_average_value_size": 1235, "num_data_blocks": 1546, "num_entries": 10678, "num_filter_entries": 10678, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850545, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 180, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.599415) [db/compaction/compaction_job.cc:1663] [default] [JOB 114] Compacted 1@0 + 1@6 files to L6 => 13445863 bytes
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.601612) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.4 rd, 160.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 12.0 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(10.6) write-amplify(5.0) OK, records in: 11199, records dropped: 521 output_compression: NoCompression
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.601685) EVENT_LOG_v1 {"time_micros": 1769850545601664, "job": 114, "event": "compaction_finished", "compaction_time_micros": 83885, "compaction_time_cpu_micros": 26073, "output_level": 6, "num_output_files": 1, "total_output_size": 13445863, "num_input_records": 11199, "num_output_records": 10678, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000179.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545602306, "job": 114, "event": "table_file_deletion", "file_number": 179}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000177.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850545604143, "job": 114, "event": "table_file_deletion", "file_number": 177}
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.515154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.604270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.604326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.604331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.604335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:09:05.604339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.623 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:05.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.706 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:09:05 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1585262327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.754 221324 DEBUG oslo_concurrency.processutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.761 221324 DEBUG nova.compute.provider_tree [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.791 221324 DEBUG nova.scheduler.client.report [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.825 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.887 221324 INFO nova.scheduler.client.report [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Deleted allocations for instance 35d6cbea-df30-4560-8421-c97ad1fb0a36#033[00m
Jan 31 04:09:05 np0005603622 nova_compute[221301]: 2026-01-31 09:09:05.997 221324 DEBUG oslo_concurrency.lockutils [None req-4f8d7b3a-9019-40f8-89f6-68978ec123b4 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "35d6cbea-df30-4560-8421-c97ad1fb0a36" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:06 np0005603622 podman[305600]: 2026-01-31 09:09:06.153524167 +0000 UTC m=+0.069776697 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 31 04:09:06 np0005603622 podman[305599]: 2026-01-31 09:09:06.190610868 +0000 UTC m=+0.107259468 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:09:06 np0005603622 nova_compute[221301]: 2026-01-31 09:09:06.925 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:07.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:07.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:09.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:09.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:10 np0005603622 nova_compute[221301]: 2026-01-31 09:09:10.702 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:11.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:11.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:11 np0005603622 nova_compute[221301]: 2026-01-31 09:09:11.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:13.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:13.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:15.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:15.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:15 np0005603622 nova_compute[221301]: 2026-01-31 09:09:15.704 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:16 np0005603622 nova_compute[221301]: 2026-01-31 09:09:16.874 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850541.8731112, 35d6cbea-df30-4560-8421-c97ad1fb0a36 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:16 np0005603622 nova_compute[221301]: 2026-01-31 09:09:16.874 221324 INFO nova.compute.manager [-] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:09:16 np0005603622 nova_compute[221301]: 2026-01-31 09:09:16.917 221324 DEBUG nova.compute.manager [None req-950606d0-91fa-4377-8f86-fa1c25b752b7 - - - - - -] [instance: 35d6cbea-df30-4560-8421-c97ad1fb0a36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:16 np0005603622 nova_compute[221301]: 2026-01-31 09:09:16.930 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:17.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:17.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:19.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:19.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:20.036 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=92, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=91) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:20.037 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.416 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.417 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.418 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.418 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:09:20 np0005603622 nova_compute[221301]: 2026-01-31 09:09:20.705 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:21.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:21.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:21 np0005603622 nova_compute[221301]: 2026-01-31 09:09:21.933 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:22 np0005603622 nova_compute[221301]: 2026-01-31 09:09:22.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:22 np0005603622 nova_compute[221301]: 2026-01-31 09:09:22.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:23.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:23.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:24 np0005603622 nova_compute[221301]: 2026-01-31 09:09:24.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:25.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:25 np0005603622 nova_compute[221301]: 2026-01-31 09:09:25.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:25 np0005603622 nova_compute[221301]: 2026-01-31 09:09:25.436 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:25 np0005603622 nova_compute[221301]: 2026-01-31 09:09:25.437 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:25 np0005603622 nova_compute[221301]: 2026-01-31 09:09:25.437 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:25 np0005603622 nova_compute[221301]: 2026-01-31 09:09:25.438 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:09:25 np0005603622 nova_compute[221301]: 2026-01-31 09:09:25.438 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:25.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:25 np0005603622 nova_compute[221301]: 2026-01-31 09:09:25.709 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.147 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.709s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.397 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.399 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4265MB free_disk=20.95781707763672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.400 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.401 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.536 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.537 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.703 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.794 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.795 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.814 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.907 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:26 np0005603622 nova_compute[221301]: 2026-01-31 09:09:26.936 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:27 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:27.039 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '92'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:27.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:09:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2731125643' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.526 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.823s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.533 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.551 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.577 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.578 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.579 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.587 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.588 221324 INFO nova.compute.claims [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:09:27 np0005603622 nova_compute[221301]: 2026-01-31 09:09:27.692 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:27.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:09:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2144012847' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.156 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.163 221324 DEBUG nova.compute.provider_tree [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.177 221324 DEBUG nova.scheduler.client.report [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.202 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.203 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.252 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.253 221324 DEBUG nova.network.neutron [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.275 221324 INFO nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.294 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.389 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.391 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.392 221324 INFO nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Creating image(s)#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.431 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.469 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.507 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.512 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.591 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.593 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.594 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.594 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.634 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:09:28 np0005603622 nova_compute[221301]: 2026-01-31 09:09:28.639 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 36e4f111-bfed-43f1-80db-08645e5f39fe_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.067 221324 DEBUG nova.policy [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ebd43008d7a64b8bbf97a2304b1f78b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.154 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 36e4f111-bfed-43f1-80db-08645e5f39fe_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:29.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.238 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] resizing rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.610 221324 DEBUG nova.objects.instance [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'migration_context' on Instance uuid 36e4f111-bfed-43f1-80db-08645e5f39fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.652 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.652 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Ensure instance console log exists: /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.653 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.654 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:29 np0005603622 nova_compute[221301]: 2026-01-31 09:09:29.654 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:29.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:30 np0005603622 nova_compute[221301]: 2026-01-31 09:09:30.147 221324 DEBUG nova.network.neutron [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Successfully created port: 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:09:30 np0005603622 nova_compute[221301]: 2026-01-31 09:09:30.710 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:30.927 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:30.928 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:30.928 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:31.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.582 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.582 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.669 221324 DEBUG nova.network.neutron [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Successfully updated port: 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.685 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.685 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquired lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.686 221324 DEBUG nova.network.neutron [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:09:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:31.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.795 221324 DEBUG nova.compute.manager [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-changed-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.796 221324 DEBUG nova.compute.manager [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Refreshing instance network info cache due to event network-changed-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.796 221324 DEBUG oslo_concurrency.lockutils [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.940 221324 DEBUG nova.network.neutron [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:09:31 np0005603622 nova_compute[221301]: 2026-01-31 09:09:31.944 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:33.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.362 221324 DEBUG nova.network.neutron [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updating instance_info_cache with network_info: [{"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.387 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Releasing lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.388 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Instance network_info: |[{"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.389 221324 DEBUG oslo_concurrency.lockutils [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.389 221324 DEBUG nova.network.neutron [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Refreshing network info cache for port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.395 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Start _get_guest_xml network_info=[{"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.401 221324 WARNING nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.407 221324 DEBUG nova.virt.libvirt.host [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.408 221324 DEBUG nova.virt.libvirt.host [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.411 221324 DEBUG nova.virt.libvirt.host [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.412 221324 DEBUG nova.virt.libvirt.host [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.414 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.415 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.416 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.416 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.417 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.417 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.418 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.418 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.419 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.419 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.420 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.420 221324 DEBUG nova.virt.hardware [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.426 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:33.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:09:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3943728407' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.862 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.894 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:09:33 np0005603622 nova_compute[221301]: 2026-01-31 09:09:33.898 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:09:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2515832843' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.442 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.445 221324 DEBUG nova.virt.libvirt.vif [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:09:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1425164509',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1425164509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=215,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5+OiMCZ+4cXkOPT4esyLqVDpyG8vvIspqelKUYVUzGYCQ9pUR5KKKQtx8Lppb6Gwro5SYBogBTwmpwO+rzWdi8Zd/6CP2YWNQbrVuIAO6nOGxS08n5uVDyxKICFANq5Q==',key_name='tempest-TestSecurityGroupsBasicOps-575953005',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-q5f2u8fx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:09:28Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=36e4f111-bfed-43f1-80db-08645e5f39fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.445 221324 DEBUG nova.network.os_vif_util [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.447 221324 DEBUG nova.network.os_vif_util [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:19:7a,bridge_name='br-int',has_traffic_filtering=True,id=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8,network=Network(e227ab25-a1ca-491f-bd95-0b4098741dc1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fdf9f5a-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.448 221324 DEBUG nova.objects.instance [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'pci_devices' on Instance uuid 36e4f111-bfed-43f1-80db-08645e5f39fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.463 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <uuid>36e4f111-bfed-43f1-80db-08645e5f39fe</uuid>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <name>instance-000000d7</name>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1425164509</nova:name>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:09:33</nova:creationTime>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:user uuid="ebd43008d7a64b8bbf97a2304b1f78b6">tempest-TestSecurityGroupsBasicOps-1802479850-project-member</nova:user>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:project uuid="0c7930b92fc3471f87d9fe78ee56e71e">tempest-TestSecurityGroupsBasicOps-1802479850</nova:project>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <nova:port uuid="1fdf9f5a-c17f-4742-86e1-8b10ccf002a8">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <entry name="serial">36e4f111-bfed-43f1-80db-08645e5f39fe</entry>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <entry name="uuid">36e4f111-bfed-43f1-80db-08645e5f39fe</entry>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/36e4f111-bfed-43f1-80db-08645e5f39fe_disk">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/36e4f111-bfed-43f1-80db-08645e5f39fe_disk.config">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:94:19:7a"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <target dev="tap1fdf9f5a-c1"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/console.log" append="off"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:09:34 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:09:34 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:09:34 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:09:34 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.465 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Preparing to wait for external event network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.466 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.467 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.467 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.469 221324 DEBUG nova.virt.libvirt.vif [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:09:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1425164509',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1425164509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=215,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5+OiMCZ+4cXkOPT4esyLqVDpyG8vvIspqelKUYVUzGYCQ9pUR5KKKQtx8Lppb6Gwro5SYBogBTwmpwO+rzWdi8Zd/6CP2YWNQbrVuIAO6nOGxS08n5uVDyxKICFANq5Q==',key_name='tempest-TestSecurityGroupsBasicOps-575953005',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-q5f2u8fx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:09:28Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=36e4f111-bfed-43f1-80db-08645e5f39fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.470 221324 DEBUG nova.network.os_vif_util [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.471 221324 DEBUG nova.network.os_vif_util [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:19:7a,bridge_name='br-int',has_traffic_filtering=True,id=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8,network=Network(e227ab25-a1ca-491f-bd95-0b4098741dc1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fdf9f5a-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.472 221324 DEBUG os_vif [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:19:7a,bridge_name='br-int',has_traffic_filtering=True,id=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8,network=Network(e227ab25-a1ca-491f-bd95-0b4098741dc1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fdf9f5a-c1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.474 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.475 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.480 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.480 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1fdf9f5a-c1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.481 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1fdf9f5a-c1, col_values=(('external_ids', {'iface-id': '1fdf9f5a-c17f-4742-86e1-8b10ccf002a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:19:7a', 'vm-uuid': '36e4f111-bfed-43f1-80db-08645e5f39fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.484 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:34 np0005603622 NetworkManager[49080]: <info>  [1769850574.4860] manager: (tap1fdf9f5a-c1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/441)
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.487 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.491 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.493 221324 INFO os_vif [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:19:7a,bridge_name='br-int',has_traffic_filtering=True,id=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8,network=Network(e227ab25-a1ca-491f-bd95-0b4098741dc1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fdf9f5a-c1')#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.603 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.604 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.604 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] No VIF found with MAC fa:16:3e:94:19:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.605 221324 INFO nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Using config drive#033[00m
Jan 31 04:09:34 np0005603622 nova_compute[221301]: 2026-01-31 09:09:34.642 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:09:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:35.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.357 221324 DEBUG nova.network.neutron [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updated VIF entry in instance network info cache for port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.358 221324 DEBUG nova.network.neutron [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updating instance_info_cache with network_info: [{"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.368 221324 INFO nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Creating config drive at /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/disk.config#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.372 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptv8gsfna execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.397 221324 DEBUG oslo_concurrency.lockutils [req-c4a0bef9-32cc-428b-90d1-1e71c1be3e08 req-c3b52d5c-348c-404d-8be7-b4a9bca40a7d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.501 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptv8gsfna" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.535 221324 DEBUG nova.storage.rbd_utils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] rbd image 36e4f111-bfed-43f1-80db-08645e5f39fe_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.541 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/disk.config 36e4f111-bfed-43f1-80db-08645e5f39fe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:09:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:35.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.712 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.866 221324 DEBUG oslo_concurrency.processutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/disk.config 36e4f111-bfed-43f1-80db-08645e5f39fe_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.324s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.867 221324 INFO nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Deleting local config drive /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe/disk.config because it was imported into RBD.#033[00m
Jan 31 04:09:35 np0005603622 kernel: tap1fdf9f5a-c1: entered promiscuous mode
Jan 31 04:09:35 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:35Z|01005|binding|INFO|Claiming lport 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 for this chassis.
Jan 31 04:09:35 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:35Z|01006|binding|INFO|1fdf9f5a-c17f-4742-86e1-8b10ccf002a8: Claiming fa:16:3e:94:19:7a 10.100.0.14
Jan 31 04:09:35 np0005603622 NetworkManager[49080]: <info>  [1769850575.9282] manager: (tap1fdf9f5a-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/442)
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.928 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.933 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.935 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:35 np0005603622 systemd-udevd[306010]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.958 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:19:7a 10.100.0.14'], port_security=['fa:16:3e:94:19:7a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '36e4f111-bfed-43f1-80db-08645e5f39fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8d960ddb-d8a7-43fb-9ab8-a2fc06feda80 9ca0ac26-46aa-41cd-a446-60fe40763c76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=52ce0987-c7ae-40a7-ba83-62ef9f6799ab, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.959 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 in datapath e227ab25-a1ca-491f-bd95-0b4098741dc1 bound to our chassis#033[00m
Jan 31 04:09:35 np0005603622 systemd-machined[190406]: New machine qemu-104-instance-000000d7.
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.960 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e227ab25-a1ca-491f-bd95-0b4098741dc1#033[00m
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.963 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:35 np0005603622 NetworkManager[49080]: <info>  [1769850575.9673] device (tap1fdf9f5a-c1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:09:35 np0005603622 systemd[1]: Started Virtual Machine qemu-104-instance-000000d7.
Jan 31 04:09:35 np0005603622 NetworkManager[49080]: <info>  [1769850575.9692] device (tap1fdf9f5a-c1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:09:35 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:35Z|01007|binding|INFO|Setting lport 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 ovn-installed in OVS
Jan 31 04:09:35 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:35Z|01008|binding|INFO|Setting lport 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 up in Southbound
Jan 31 04:09:35 np0005603622 nova_compute[221301]: 2026-01-31 09:09:35.970 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.970 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[370c46fc-6c12-4f10-b428-9f6dc7aa42b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.971 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape227ab25-a1 in ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.973 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape227ab25-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.973 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ede1e8cf-ace2-47c3-8f0d-115125dc2897]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.975 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[60c8b285-c74d-497f-8e59-9e57d32557fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.984 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[f44266d6-8378-4a79-bd43-7581ca836ca8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:35.996 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5fa0fe-60b7-4921-8317-71a73c9773e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.019 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[70e4df0c-50fd-4f7a-bc5e-8a5329feb06b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 NetworkManager[49080]: <info>  [1769850576.0265] manager: (tape227ab25-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/443)
Jan 31 04:09:36 np0005603622 systemd-udevd[306014]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.027 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c473f9-2c1a-4c67-ace9-97a816b976f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.051 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[2186deb1-d087-47ee-a31f-f08b7d5e2754]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.055 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[544a5a0d-ccbf-4faf-aa64-527faf9989ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 NetworkManager[49080]: <info>  [1769850576.0700] device (tape227ab25-a0): carrier: link connected
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.072 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec76096-b2ec-4079-b0cd-e7d3de128d09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.082 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5df5a55a-cc1a-4ff6-bbda-d61f0ce443a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape227ab25-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:37:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 302], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982741, 'reachable_time': 27078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306044, 'error': None, 'target': 'ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.093 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4c7ea5-df26-4766-a54c-c95a0c491054]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:3739'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 982741, 'tstamp': 982741}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 306045, 'error': None, 'target': 'ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.102 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7709478b-ce50-4177-b140-a1134c2623a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape227ab25-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:37:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 302], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982741, 'reachable_time': 27078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 306046, 'error': None, 'target': 'ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.127 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[867102e4-9944-4f1f-a246-955cfb83c64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.172 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0399382a-ca38-447a-a917-15a294ceedc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.173 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape227ab25-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.174 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.174 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape227ab25-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.176 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:36 np0005603622 NetworkManager[49080]: <info>  [1769850576.1765] manager: (tape227ab25-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/444)
Jan 31 04:09:36 np0005603622 kernel: tape227ab25-a0: entered promiscuous mode
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.179 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape227ab25-a0, col_values=(('external_ids', {'iface-id': 'b5f8be54-ce4e-4242-97c0-b4ccf3fa1dba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.180 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:36 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:36Z|01009|binding|INFO|Releasing lport b5f8be54-ce4e-4242-97c0-b4ccf3fa1dba from this chassis (sb_readonly=0)
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.188 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.189 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e227ab25-a1ca-491f-bd95-0b4098741dc1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e227ab25-a1ca-491f-bd95-0b4098741dc1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.190 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[acd06a22-01bc-41bd-a2b2-f6287c976f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.190 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-e227ab25-a1ca-491f-bd95-0b4098741dc1
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/e227ab25-a1ca-491f-bd95-0b4098741dc1.pid.haproxy
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID e227ab25-a1ca-491f-bd95-0b4098741dc1
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:09:36 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:09:36.191 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'env', 'PROCESS_TAG=haproxy-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e227ab25-a1ca-491f-bd95-0b4098741dc1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.337 221324 DEBUG nova.compute.manager [req-f6168a21-a067-4bb2-a37c-ab6866f4b7f5 req-93925070-210a-407b-8ded-709ec97a6081 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.338 221324 DEBUG oslo_concurrency.lockutils [req-f6168a21-a067-4bb2-a37c-ab6866f4b7f5 req-93925070-210a-407b-8ded-709ec97a6081 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.339 221324 DEBUG oslo_concurrency.lockutils [req-f6168a21-a067-4bb2-a37c-ab6866f4b7f5 req-93925070-210a-407b-8ded-709ec97a6081 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.339 221324 DEBUG oslo_concurrency.lockutils [req-f6168a21-a067-4bb2-a37c-ab6866f4b7f5 req-93925070-210a-407b-8ded-709ec97a6081 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.339 221324 DEBUG nova.compute.manager [req-f6168a21-a067-4bb2-a37c-ab6866f4b7f5 req-93925070-210a-407b-8ded-709ec97a6081 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Processing event network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:09:36 np0005603622 podman[306097]: 2026-01-31 09:09:36.491331493 +0000 UTC m=+0.063301245 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.509 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850576.5095105, 36e4f111-bfed-43f1-80db-08645e5f39fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.510 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] VM Started (Lifecycle Event)#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.512 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.515 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.519 221324 INFO nova.virt.libvirt.driver [-] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Instance spawned successfully.#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.519 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.533 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:36 np0005603622 podman[306096]: 2026-01-31 09:09:36.537228627 +0000 UTC m=+0.109322473 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.539 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.542 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.543 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.543 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.544 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.544 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.545 221324 DEBUG nova.virt.libvirt.driver [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:09:36 np0005603622 podman[306153]: 2026-01-31 09:09:36.556364263 +0000 UTC m=+0.053462625 container create 10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.570 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.570 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850576.5103269, 36e4f111-bfed-43f1-80db-08645e5f39fe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.571 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:09:36 np0005603622 systemd[1]: Started libpod-conmon-10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817.scope.
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.593 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.596 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850576.514988, 36e4f111-bfed-43f1-80db-08645e5f39fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.596 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.614 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.616 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:09:36 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.619 221324 INFO nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Took 8.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.620 221324 DEBUG nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:09:36 np0005603622 podman[306153]: 2026-01-31 09:09:36.532516272 +0000 UTC m=+0.029614654 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:09:36 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da00665701be53411abba9bbc656e1d4e9c1f6281a96f54e4a91553ed10288bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.646 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:09:36 np0005603622 podman[306153]: 2026-01-31 09:09:36.654016397 +0000 UTC m=+0.151114839 container init 10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Jan 31 04:09:36 np0005603622 podman[306153]: 2026-01-31 09:09:36.657904489 +0000 UTC m=+0.155002891 container start 10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:09:36 np0005603622 neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1[306174]: [NOTICE]   (306178) : New worker (306180) forked
Jan 31 04:09:36 np0005603622 neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1[306174]: [NOTICE]   (306178) : Loading success.
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.684 221324 INFO nova.compute.manager [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Took 9.80 seconds to build instance.#033[00m
Jan 31 04:09:36 np0005603622 nova_compute[221301]: 2026-01-31 09:09:36.701 221324 DEBUG oslo_concurrency.lockutils [None req-37f7ae91-4e8e-4e66-bbff-dd0591403396 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:37.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:37.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:38 np0005603622 nova_compute[221301]: 2026-01-31 09:09:38.472 221324 DEBUG nova.compute.manager [req-bfeabbbd-d4a3-4ebd-9984-1e2e5031eb31 req-b3f17fa4-e910-4ed0-9de9-8884ef2234c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:38 np0005603622 nova_compute[221301]: 2026-01-31 09:09:38.474 221324 DEBUG oslo_concurrency.lockutils [req-bfeabbbd-d4a3-4ebd-9984-1e2e5031eb31 req-b3f17fa4-e910-4ed0-9de9-8884ef2234c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:09:38 np0005603622 nova_compute[221301]: 2026-01-31 09:09:38.475 221324 DEBUG oslo_concurrency.lockutils [req-bfeabbbd-d4a3-4ebd-9984-1e2e5031eb31 req-b3f17fa4-e910-4ed0-9de9-8884ef2234c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:09:38 np0005603622 nova_compute[221301]: 2026-01-31 09:09:38.475 221324 DEBUG oslo_concurrency.lockutils [req-bfeabbbd-d4a3-4ebd-9984-1e2e5031eb31 req-b3f17fa4-e910-4ed0-9de9-8884ef2234c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:09:38 np0005603622 nova_compute[221301]: 2026-01-31 09:09:38.476 221324 DEBUG nova.compute.manager [req-bfeabbbd-d4a3-4ebd-9984-1e2e5031eb31 req-b3f17fa4-e910-4ed0-9de9-8884ef2234c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] No waiting events found dispatching network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:09:38 np0005603622 nova_compute[221301]: 2026-01-31 09:09:38.477 221324 WARNING nova.compute.manager [req-bfeabbbd-d4a3-4ebd-9984-1e2e5031eb31 req-b3f17fa4-e910-4ed0-9de9-8884ef2234c7 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received unexpected event network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:09:38 np0005603622 podman[306361]: 2026-01-31 09:09:38.699656869 +0000 UTC m=+0.370747758 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 04:09:39 np0005603622 podman[306361]: 2026-01-31 09:09:39.003810815 +0000 UTC m=+0.674901684 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 31 04:09:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:39.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:39 np0005603622 nova_compute[221301]: 2026-01-31 09:09:39.484 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:39.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.306 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:40 np0005603622 NetworkManager[49080]: <info>  [1769850580.3096] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/445)
Jan 31 04:09:40 np0005603622 NetworkManager[49080]: <info>  [1769850580.3106] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/446)
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:40Z|01010|binding|INFO|Releasing lport b5f8be54-ce4e-4242-97c0-b4ccf3fa1dba from this chassis (sb_readonly=0)
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.341 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:09:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:40 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.679 221324 DEBUG nova.compute.manager [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-changed-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.680 221324 DEBUG nova.compute.manager [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Refreshing instance network info cache due to event network-changed-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.681 221324 DEBUG oslo_concurrency.lockutils [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.681 221324 DEBUG oslo_concurrency.lockutils [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.682 221324 DEBUG nova.network.neutron [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Refreshing network info cache for port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:09:40 np0005603622 nova_compute[221301]: 2026-01-31 09:09:40.716 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:41.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:41.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:42 np0005603622 nova_compute[221301]: 2026-01-31 09:09:42.220 221324 DEBUG nova.network.neutron [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updated VIF entry in instance network info cache for port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:09:42 np0005603622 nova_compute[221301]: 2026-01-31 09:09:42.221 221324 DEBUG nova.network.neutron [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updating instance_info_cache with network_info: [{"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:09:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:42 np0005603622 nova_compute[221301]: 2026-01-31 09:09:42.241 221324 DEBUG oslo_concurrency.lockutils [req-2cd25500-1c60-4003-a54e-925e46848652 req-b0d40f6b-9e3e-4697-95b4-63e429621593 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:09:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:43.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:43.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:44 np0005603622 nova_compute[221301]: 2026-01-31 09:09:44.486 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:45.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:45 np0005603622 nova_compute[221301]: 2026-01-31 09:09:45.717 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:45.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:46 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:09:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:47.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:47.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:49.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:49 np0005603622 nova_compute[221301]: 2026-01-31 09:09:49.489 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:49.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:49 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:49Z|01011|binding|INFO|Releasing lport b5f8be54-ce4e-4242-97c0-b4ccf3fa1dba from this chassis (sb_readonly=0)
Jan 31 04:09:50 np0005603622 nova_compute[221301]: 2026-01-31 09:09:50.025 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:50 np0005603622 nova_compute[221301]: 2026-01-31 09:09:50.720 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:50 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 31 04:09:50 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:50Z|00134|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:94:19:7a 10.100.0.14
Jan 31 04:09:51 np0005603622 ovn_controller[130109]: 2026-01-31T09:09:51Z|00135|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:94:19:7a 10.100.0.14
Jan 31 04:09:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:51.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:51.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:53.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:53.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:54 np0005603622 nova_compute[221301]: 2026-01-31 09:09:54.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:09:54 np0005603622 nova_compute[221301]: 2026-01-31 09:09:54.492 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:55.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:55 np0005603622 nova_compute[221301]: 2026-01-31 09:09:55.722 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:09:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:55.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:09:57 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:09:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:57.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:57.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:09:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:09:59.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:09:59 np0005603622 nova_compute[221301]: 2026-01-31 09:09:59.496 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:09:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:09:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:09:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:09:59.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:09:59 np0005603622 nova_compute[221301]: 2026-01-31 09:09:59.784 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.570 221324 DEBUG nova.compute.manager [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-changed-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.570 221324 DEBUG nova.compute.manager [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Refreshing instance network info cache due to event network-changed-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.570 221324 DEBUG oslo_concurrency.lockutils [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.571 221324 DEBUG oslo_concurrency.lockutils [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.571 221324 DEBUG nova.network.neutron [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Refreshing network info cache for port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.609 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.610 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.611 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.611 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.612 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.614 221324 INFO nova.compute.manager [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Terminating instance#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.616 221324 DEBUG nova.compute.manager [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:10:00 np0005603622 kernel: tap1fdf9f5a-c1 (unregistering): left promiscuous mode
Jan 31 04:10:00 np0005603622 NetworkManager[49080]: <info>  [1769850600.7176] device (tap1fdf9f5a-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.718 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:00Z|01012|binding|INFO|Releasing lport 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 from this chassis (sb_readonly=0)
Jan 31 04:10:00 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:00Z|01013|binding|INFO|Setting lport 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 down in Southbound
Jan 31 04:10:00 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:00Z|01014|binding|INFO|Removing iface tap1fdf9f5a-c1 ovn-installed in OVS
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.725 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.729 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:00.737 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:19:7a 10.100.0.14'], port_security=['fa:16:3e:94:19:7a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '36e4f111-bfed-43f1-80db-08645e5f39fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c7930b92fc3471f87d9fe78ee56e71e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8d960ddb-d8a7-43fb-9ab8-a2fc06feda80 9ca0ac26-46aa-41cd-a446-60fe40763c76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=52ce0987-c7ae-40a7-ba83-62ef9f6799ab, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:10:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:00.739 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 in datapath e227ab25-a1ca-491f-bd95-0b4098741dc1 unbound from our chassis#033[00m
Jan 31 04:10:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:00.740 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e227ab25-a1ca-491f-bd95-0b4098741dc1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:10:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:00.741 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0251c362-94a5-4785-ab3e-0399247ce6a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:00.742 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1 namespace which is not needed anymore#033[00m
Jan 31 04:10:00 np0005603622 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000d7.scope: Deactivated successfully.
Jan 31 04:10:00 np0005603622 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000d7.scope: Consumed 13.638s CPU time.
Jan 31 04:10:00 np0005603622 systemd-machined[190406]: Machine qemu-104-instance-000000d7 terminated.
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.842 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.855 221324 INFO nova.virt.libvirt.driver [-] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Instance destroyed successfully.#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.856 221324 DEBUG nova.objects.instance [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lazy-loading 'resources' on Instance uuid 36e4f111-bfed-43f1-80db-08645e5f39fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.886 221324 DEBUG nova.virt.libvirt.vif [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:09:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1425164509',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1802479850-access_point-1425164509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1802479850-ac',id=215,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5+OiMCZ+4cXkOPT4esyLqVDpyG8vvIspqelKUYVUzGYCQ9pUR5KKKQtx8Lppb6Gwro5SYBogBTwmpwO+rzWdi8Zd/6CP2YWNQbrVuIAO6nOGxS08n5uVDyxKICFANq5Q==',key_name='tempest-TestSecurityGroupsBasicOps-575953005',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:09:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0c7930b92fc3471f87d9fe78ee56e71e',ramdisk_id='',reservation_id='r-q5f2u8fx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1802479850',owner_user_name='tempest-TestSecurityGroupsBasicOps-1802479850-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:09:36Z,user_data=None,user_id='ebd43008d7a64b8bbf97a2304b1f78b6',uuid=36e4f111-bfed-43f1-80db-08645e5f39fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.886 221324 DEBUG nova.network.os_vif_util [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converting VIF {"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.888 221324 DEBUG nova.network.os_vif_util [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:19:7a,bridge_name='br-int',has_traffic_filtering=True,id=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8,network=Network(e227ab25-a1ca-491f-bd95-0b4098741dc1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fdf9f5a-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.888 221324 DEBUG os_vif [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:19:7a,bridge_name='br-int',has_traffic_filtering=True,id=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8,network=Network(e227ab25-a1ca-491f-bd95-0b4098741dc1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fdf9f5a-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.891 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.892 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1fdf9f5a-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.894 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:10:00 np0005603622 nova_compute[221301]: 2026-01-31 09:10:00.901 221324 INFO os_vif [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:19:7a,bridge_name='br-int',has_traffic_filtering=True,id=1fdf9f5a-c17f-4742-86e1-8b10ccf002a8,network=Network(e227ab25-a1ca-491f-bd95-0b4098741dc1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fdf9f5a-c1')#033[00m
Jan 31 04:10:00 np0005603622 neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1[306174]: [NOTICE]   (306178) : haproxy version is 2.8.14-c23fe91
Jan 31 04:10:00 np0005603622 neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1[306174]: [NOTICE]   (306178) : path to executable is /usr/sbin/haproxy
Jan 31 04:10:00 np0005603622 neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1[306174]: [WARNING]  (306178) : Exiting Master process...
Jan 31 04:10:00 np0005603622 neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1[306174]: [ALERT]    (306178) : Current worker (306180) exited with code 143 (Terminated)
Jan 31 04:10:00 np0005603622 neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1[306174]: [WARNING]  (306178) : All workers exited. Exiting... (0)
Jan 31 04:10:00 np0005603622 systemd[1]: libpod-10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817.scope: Deactivated successfully.
Jan 31 04:10:00 np0005603622 podman[306691]: 2026-01-31 09:10:00.914979254 +0000 UTC m=+0.060151742 container died 10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:10:00 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817-userdata-shm.mount: Deactivated successfully.
Jan 31 04:10:00 np0005603622 systemd[1]: var-lib-containers-storage-overlay-da00665701be53411abba9bbc656e1d4e9c1f6281a96f54e4a91553ed10288bc-merged.mount: Deactivated successfully.
Jan 31 04:10:00 np0005603622 podman[306691]: 2026-01-31 09:10:00.95979031 +0000 UTC m=+0.104962778 container cleanup 10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 31 04:10:00 np0005603622 systemd[1]: libpod-conmon-10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817.scope: Deactivated successfully.
Jan 31 04:10:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 04:10:01 np0005603622 podman[306743]: 2026-01-31 09:10:01.025448937 +0000 UTC m=+0.046207944 container remove 10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.029 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb6cd2d-c482-4874-b4b9-4ddd721b778e]: (4, ('Sat Jan 31 09:10:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1 (10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817)\n10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817\nSat Jan 31 09:10:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1 (10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817)\n10b86e12ae9b1cc6544ab0ddda93caddf3b221d42d32474dcfefbbcc8be00817\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.032 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2634adf6-c7b1-413f-8a82-0dfe3a37be9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.033 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape227ab25-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:01 np0005603622 nova_compute[221301]: 2026-01-31 09:10:01.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:01 np0005603622 kernel: tape227ab25-a0: left promiscuous mode
Jan 31 04:10:01 np0005603622 nova_compute[221301]: 2026-01-31 09:10:01.041 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:01 np0005603622 nova_compute[221301]: 2026-01-31 09:10:01.043 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.046 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a74f2b93-8128-4f87-bc9b-85a7b235a344]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.064 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4d60cbac-849c-414e-9f1e-a1921461a85e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.066 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d1f96e-7f64-4ba1-a2b9-713c297724e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.079 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a95e6d66-cc32-4fdb-9c3b-ef27161c77e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 982735, 'reachable_time': 28306, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306758, 'error': None, 'target': 'ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.082 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e227ab25-a1ca-491f-bd95-0b4098741dc1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:10:01 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:01.082 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[50e0e9ae-2fce-46ab-a992-a46f92d29bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:01 np0005603622 systemd[1]: run-netns-ovnmeta\x2de227ab25\x2da1ca\x2d491f\x2dbd95\x2d0b4098741dc1.mount: Deactivated successfully.
Jan 31 04:10:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:01.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:01.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:02 np0005603622 nova_compute[221301]: 2026-01-31 09:10:02.132 221324 DEBUG nova.network.neutron [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updated VIF entry in instance network info cache for port 1fdf9f5a-c17f-4742-86e1-8b10ccf002a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:10:02 np0005603622 nova_compute[221301]: 2026-01-31 09:10:02.133 221324 DEBUG nova.network.neutron [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updating instance_info_cache with network_info: [{"id": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "address": "fa:16:3e:94:19:7a", "network": {"id": "e227ab25-a1ca-491f-bd95-0b4098741dc1", "bridge": "br-int", "label": "tempest-network-smoke--1277558740", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0c7930b92fc3471f87d9fe78ee56e71e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fdf9f5a-c1", "ovs_interfaceid": "1fdf9f5a-c17f-4742-86e1-8b10ccf002a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:10:02 np0005603622 nova_compute[221301]: 2026-01-31 09:10:02.166 221324 DEBUG oslo_concurrency.lockutils [req-8e1807fd-8b09-431a-ab3d-6f81643cbc6f req-7f34ce71-d180-4b78-9b2a-ace7635e0b94 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-36e4f111-bfed-43f1-80db-08645e5f39fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:10:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:02 np0005603622 nova_compute[221301]: 2026-01-31 09:10:02.643 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.007 221324 DEBUG nova.compute.manager [req-8dfa3a38-a1f0-479f-879f-fc16114fb32e req-51904c4a-bb67-4bdd-9bf9-38bc3306f4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-vif-unplugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.008 221324 DEBUG oslo_concurrency.lockutils [req-8dfa3a38-a1f0-479f-879f-fc16114fb32e req-51904c4a-bb67-4bdd-9bf9-38bc3306f4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.008 221324 DEBUG oslo_concurrency.lockutils [req-8dfa3a38-a1f0-479f-879f-fc16114fb32e req-51904c4a-bb67-4bdd-9bf9-38bc3306f4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.009 221324 DEBUG oslo_concurrency.lockutils [req-8dfa3a38-a1f0-479f-879f-fc16114fb32e req-51904c4a-bb67-4bdd-9bf9-38bc3306f4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.009 221324 DEBUG nova.compute.manager [req-8dfa3a38-a1f0-479f-879f-fc16114fb32e req-51904c4a-bb67-4bdd-9bf9-38bc3306f4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] No waiting events found dispatching network-vif-unplugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.010 221324 DEBUG nova.compute.manager [req-8dfa3a38-a1f0-479f-879f-fc16114fb32e req-51904c4a-bb67-4bdd-9bf9-38bc3306f4a1 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-vif-unplugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:10:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:03.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.338 221324 INFO nova.virt.libvirt.driver [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Deleting instance files /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe_del#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.338 221324 INFO nova.virt.libvirt.driver [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Deletion of /var/lib/nova/instances/36e4f111-bfed-43f1-80db-08645e5f39fe_del complete#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.394 221324 INFO nova.compute.manager [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Took 2.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.395 221324 DEBUG oslo.service.loopingcall [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.395 221324 DEBUG nova.compute.manager [-] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:10:03 np0005603622 nova_compute[221301]: 2026-01-31 09:10:03.396 221324 DEBUG nova.network.neutron [-] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:10:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:03.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:04 np0005603622 nova_compute[221301]: 2026-01-31 09:10:04.345 221324 DEBUG nova.network.neutron [-] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:10:04 np0005603622 nova_compute[221301]: 2026-01-31 09:10:04.362 221324 INFO nova.compute.manager [-] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Took 0.97 seconds to deallocate network for instance.#033[00m
Jan 31 04:10:04 np0005603622 nova_compute[221301]: 2026-01-31 09:10:04.409 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:04 np0005603622 nova_compute[221301]: 2026-01-31 09:10:04.410 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:04 np0005603622 nova_compute[221301]: 2026-01-31 09:10:04.412 221324 DEBUG nova.compute.manager [req-482aeaf1-92c3-4bcb-9697-9ac245717c61 req-a57eb000-8f3e-429c-8314-76b5bf7204af fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-vif-deleted-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:04 np0005603622 nova_compute[221301]: 2026-01-31 09:10:04.460 221324 DEBUG oslo_concurrency.processutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.106 221324 DEBUG nova.compute.manager [req-5342a844-c3b8-4826-87c1-b53af7275377 req-8817221e-83d8-4b25-96f1-9e6a09efd4fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received event network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.107 221324 DEBUG oslo_concurrency.lockutils [req-5342a844-c3b8-4826-87c1-b53af7275377 req-8817221e-83d8-4b25-96f1-9e6a09efd4fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.108 221324 DEBUG oslo_concurrency.lockutils [req-5342a844-c3b8-4826-87c1-b53af7275377 req-8817221e-83d8-4b25-96f1-9e6a09efd4fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.108 221324 DEBUG oslo_concurrency.lockutils [req-5342a844-c3b8-4826-87c1-b53af7275377 req-8817221e-83d8-4b25-96f1-9e6a09efd4fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.109 221324 DEBUG nova.compute.manager [req-5342a844-c3b8-4826-87c1-b53af7275377 req-8817221e-83d8-4b25-96f1-9e6a09efd4fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] No waiting events found dispatching network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.109 221324 WARNING nova.compute.manager [req-5342a844-c3b8-4826-87c1-b53af7275377 req-8817221e-83d8-4b25-96f1-9e6a09efd4fc fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Received unexpected event network-vif-plugged-1fdf9f5a-c17f-4742-86e1-8b10ccf002a8 for instance with vm_state deleted and task_state None.#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.111 221324 DEBUG oslo_concurrency.processutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.651s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.118 221324 DEBUG nova.compute.provider_tree [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.136 221324 DEBUG nova.scheduler.client.report [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
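The inventory payload in the line above determines how much of each resource class Placement will schedule on this provider: effective capacity is `(total - reserved) * allocation_ratio`. A small sketch recomputing it from the logged dict (the `capacity` helper is illustrative, not Nova code; only the keys used here are kept):

```python
# Inventory as reported for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273
# in the log line above (min_unit/max_unit/step_size omitted).
inventory = {
    'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 20,   'reserved': 1,   'allocation_ratio': 0.9},
}

def capacity(inv):
    # Placement's usable capacity per resource class:
    # (total - reserved) * allocation_ratio
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

print(capacity(inventory))
# VCPU: 32.0 schedulable vCPUs, MEMORY_MB: 7167.0 MiB, DISK_GB: ~17.1 GB
```

This is why an 8-core host with `allocation_ratio=4.0` accepts 32 vCPUs of allocations, while the disk ratio below 1.0 deliberately undercommits the 20 GB pool.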
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.163 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.200 221324 INFO nova.scheduler.client.report [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Deleted allocations for instance 36e4f111-bfed-43f1-80db-08645e5f39fe#033[00m
Jan 31 04:10:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:05.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.351 221324 DEBUG oslo_concurrency.lockutils [None req-8f2473b0-c8f2-4ba5-b803-0b0e8995fe45 ebd43008d7a64b8bbf97a2304b1f78b6 0c7930b92fc3471f87d9fe78ee56e71e - - default default] Lock "36e4f111-bfed-43f1-80db-08645e5f39fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:05.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:05 np0005603622 nova_compute[221301]: 2026-01-31 09:10:05.894 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:07 np0005603622 podman[306783]: 2026-01-31 09:10:07.14408571 +0000 UTC m=+0.067790475 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 31 04:10:07 np0005603622 podman[306782]: 2026-01-31 09:10:07.199218328 +0000 UTC m=+0.125879421 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 04:10:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:07.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:07.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:09.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:09.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:10 np0005603622 nova_compute[221301]: 2026-01-31 09:10:10.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:10 np0005603622 nova_compute[221301]: 2026-01-31 09:10:10.738 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:10 np0005603622 nova_compute[221301]: 2026-01-31 09:10:10.895 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:11.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:11.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:12 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:12 np0005603622 nova_compute[221301]: 2026-01-31 09:10:12.265 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:12 np0005603622 nova_compute[221301]: 2026-01-31 09:10:12.315 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:13.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:13.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:15.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:15 np0005603622 nova_compute[221301]: 2026-01-31 09:10:15.742 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:15.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:15 np0005603622 nova_compute[221301]: 2026-01-31 09:10:15.853 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850600.851938, 36e4f111-bfed-43f1-80db-08645e5f39fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:10:15 np0005603622 nova_compute[221301]: 2026-01-31 09:10:15.854 221324 INFO nova.compute.manager [-] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:10:15 np0005603622 nova_compute[221301]: 2026-01-31 09:10:15.897 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:15 np0005603622 nova_compute[221301]: 2026-01-31 09:10:15.982 221324 DEBUG nova.compute.manager [None req-43325a58-c030-49e5-af6f-9739d8854834 - - - - - -] [instance: 36e4f111-bfed-43f1-80db-08645e5f39fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:10:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:17.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:17.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:19.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:19.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:20.169 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=93, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=92) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:10:20 np0005603622 nova_compute[221301]: 2026-01-31 09:10:20.169 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:20.172 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:10:20 np0005603622 nova_compute[221301]: 2026-01-31 09:10:20.745 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:20 np0005603622 nova_compute[221301]: 2026-01-31 09:10:20.899 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:21.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:21 np0005603622 nova_compute[221301]: 2026-01-31 09:10:21.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:21 np0005603622 nova_compute[221301]: 2026-01-31 09:10:21.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:10:21 np0005603622 nova_compute[221301]: 2026-01-31 09:10:21.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:10:21 np0005603622 nova_compute[221301]: 2026-01-31 09:10:21.424 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:10:21 np0005603622 nova_compute[221301]: 2026-01-31 09:10:21.424 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:21.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:22 np0005603622 nova_compute[221301]: 2026-01-31 09:10:22.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:22 np0005603622 nova_compute[221301]: 2026-01-31 09:10:22.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:10:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:23.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:23 np0005603622 nova_compute[221301]: 2026-01-31 09:10:23.705 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:23 np0005603622 nova_compute[221301]: 2026-01-31 09:10:23.706 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:23 np0005603622 nova_compute[221301]: 2026-01-31 09:10:23.730 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:10:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:23.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:23 np0005603622 nova_compute[221301]: 2026-01-31 09:10:23.875 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:23 np0005603622 nova_compute[221301]: 2026-01-31 09:10:23.875 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:23 np0005603622 nova_compute[221301]: 2026-01-31 09:10:23.888 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:10:23 np0005603622 nova_compute[221301]: 2026-01-31 09:10:23.889 221324 INFO nova.compute.claims [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.019 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:10:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2123057976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.487 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.492 221324 DEBUG nova.compute.provider_tree [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.511 221324 DEBUG nova.scheduler.client.report [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.581 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.582 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.691 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.691 221324 DEBUG nova.network.neutron [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.747 221324 INFO nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:10:24 np0005603622 nova_compute[221301]: 2026-01-31 09:10:24.810 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.050 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.051 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.052 221324 INFO nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Creating image(s)#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.088 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.123 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.157 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.162 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.192 221324 DEBUG nova.policy [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b7233f93367f4dcd8eb2b6b115680192', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c0be57039fd34aa9a2d05d9086ccff13', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.241 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.242 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.243 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.243 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "b1c202daae0a5d5b639e0239462ea0d46fe633d6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.275 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.280 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:25.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.649 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b1c202daae0a5d5b639e0239462ea0d46fe633d6 f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.369s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.750 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] resizing rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 31 04:10:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:25.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.826 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.896 221324 DEBUG nova.objects.instance [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lazy-loading 'migration_context' on Instance uuid f0b8c4bd-36b6-479c-9160-9adb3a86dc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.931 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.932 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Ensure instance console log exists: /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.932 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.933 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:25 np0005603622 nova_compute[221301]: 2026-01-31 09:10:25.934 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.420 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.420 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.421 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.421 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.421 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.573 221324 DEBUG nova.network.neutron [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Successfully created port: 47a41dbd-4926-429e-a493-080c5c5784b1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:10:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:10:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4230113655' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.833 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.990 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.991 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4262MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.992 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:26 np0005603622 nova_compute[221301]: 2026-01-31 09:10:26.992 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.080 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance f0b8c4bd-36b6-479c-9160-9adb3a86dc6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.081 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.081 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.130 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:27.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:10:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4046732467' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.616 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.622 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.640 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.659 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:10:27 np0005603622 nova_compute[221301]: 2026-01-31 09:10:27.659 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:27.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:28 np0005603622 nova_compute[221301]: 2026-01-31 09:10:28.751 221324 DEBUG nova.network.neutron [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Successfully updated port: 47a41dbd-4926-429e-a493-080c5c5784b1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:10:28 np0005603622 nova_compute[221301]: 2026-01-31 09:10:28.820 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:10:28 np0005603622 nova_compute[221301]: 2026-01-31 09:10:28.820 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquired lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:10:28 np0005603622 nova_compute[221301]: 2026-01-31 09:10:28.820 221324 DEBUG nova.network.neutron [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:10:28 np0005603622 nova_compute[221301]: 2026-01-31 09:10:28.953 221324 DEBUG nova.compute.manager [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-changed-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:28 np0005603622 nova_compute[221301]: 2026-01-31 09:10:28.954 221324 DEBUG nova.compute.manager [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Refreshing instance network info cache due to event network-changed-47a41dbd-4926-429e-a493-080c5c5784b1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:10:28 np0005603622 nova_compute[221301]: 2026-01-31 09:10:28.955 221324 DEBUG oslo_concurrency.lockutils [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:10:29 np0005603622 nova_compute[221301]: 2026-01-31 09:10:29.095 221324 DEBUG nova.network.neutron [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:10:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:29.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:29.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:30.174 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '93'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:30 np0005603622 nova_compute[221301]: 2026-01-31 09:10:30.658 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:30 np0005603622 nova_compute[221301]: 2026-01-31 09:10:30.749 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:30 np0005603622 nova_compute[221301]: 2026-01-31 09:10:30.902 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:30.928 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:30.929 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:30.929 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.163 221324 DEBUG nova.network.neutron [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updating instance_info_cache with network_info: [{"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.198 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Releasing lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.198 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Instance network_info: |[{"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.199 221324 DEBUG oslo_concurrency.lockutils [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.199 221324 DEBUG nova.network.neutron [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Refreshing network info cache for port 47a41dbd-4926-429e-a493-080c5c5784b1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.203 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Start _get_guest_xml network_info=[{"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.210 221324 WARNING nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.217 221324 DEBUG nova.virt.libvirt.host [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.218 221324 DEBUG nova.virt.libvirt.host [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.222 221324 DEBUG nova.virt.libvirt.host [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.223 221324 DEBUG nova.virt.libvirt.host [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.225 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.225 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-31T07:43:39Z,direct_url=<?>,disk_format='qcow2',id=37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='89e274acfc5c4097be7194f5ef1fabd3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-31T07:43:44Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.226 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.227 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.227 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.227 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.228 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.228 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.229 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.229 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.229 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.230 221324 DEBUG nova.virt.hardware [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.237 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:31.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:10:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:31.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:10:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2888472160' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.840 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.603s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.873 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:10:31 np0005603622 nova_compute[221301]: 2026-01-31 09:10:31.878 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:10:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2298628847' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.921 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.924 221324 DEBUG nova.virt.libvirt.vif [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:10:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1489078581',display_name='tempest-TestSnapshotPattern-server-1489078581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1489078581',id=216,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKpOUB83OLuX57XNlwrNvNm9A+WqJpxKoMPAfeJaapb6yUdxSW7lKu+x7yQPy1sLXzgH0zh++G8qQAE8XC1z1HkX9voX2EUsKpZ4pSJBKy01SGRO5BQ6pTGTKexhMNezAw==',key_name='tempest-TestSnapshotPattern-475590621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0be57039fd34aa9a2d05d9086ccff13',ramdisk_id='',reservation_id='r-3y458y73',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-418405266',owner_user_name='tempest-TestSnapshotPattern-418405266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:10:24Z,user_data=None,user_id='b7233f93367f4dcd8eb2b6b115680192',uuid=f0b8c4bd-36b6-479c-9160-9adb3a86dc6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.925 221324 DEBUG nova.network.os_vif_util [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converting VIF {"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.927 221324 DEBUG nova.network.os_vif_util [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:49:7e,bridge_name='br-int',has_traffic_filtering=True,id=47a41dbd-4926-429e-a493-080c5c5784b1,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a41dbd-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.929 221324 DEBUG nova.objects.instance [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lazy-loading 'pci_devices' on Instance uuid f0b8c4bd-36b6-479c-9160-9adb3a86dc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.972 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <uuid>f0b8c4bd-36b6-479c-9160-9adb3a86dc6f</uuid>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <name>instance-000000d8</name>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestSnapshotPattern-server-1489078581</nova:name>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:10:31</nova:creationTime>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:user uuid="b7233f93367f4dcd8eb2b6b115680192">tempest-TestSnapshotPattern-418405266-project-member</nova:user>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:project uuid="c0be57039fd34aa9a2d05d9086ccff13">tempest-TestSnapshotPattern-418405266</nova:project>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <nova:port uuid="47a41dbd-4926-429e-a493-080c5c5784b1">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <entry name="serial">f0b8c4bd-36b6-479c-9160-9adb3a86dc6f</entry>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <entry name="uuid">f0b8c4bd-36b6-479c-9160-9adb3a86dc6f</entry>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk.config">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:63:49:7e"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <target dev="tap47a41dbd-49"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/console.log" append="off"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:10:32 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:10:32 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:10:32 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:10:32 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.974 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Preparing to wait for external event network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.974 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.975 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.976 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.977 221324 DEBUG nova.virt.libvirt.vif [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:10:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1489078581',display_name='tempest-TestSnapshotPattern-server-1489078581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1489078581',id=216,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKpOUB83OLuX57XNlwrNvNm9A+WqJpxKoMPAfeJaapb6yUdxSW7lKu+x7yQPy1sLXzgH0zh++G8qQAE8XC1z1HkX9voX2EUsKpZ4pSJBKy01SGRO5BQ6pTGTKexhMNezAw==',key_name='tempest-TestSnapshotPattern-475590621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0be57039fd34aa9a2d05d9086ccff13',ramdisk_id='',reservation_id='r-3y458y73',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-418405266',owner_user_name='tempest-TestSnapshotPattern-418405266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:10:24Z,user_data=None,user_id='b7233f93367f4dcd8eb2b6b115680192',uuid=f0b8c4bd-36b6-479c-9160-9adb3a86dc6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.978 221324 DEBUG nova.network.os_vif_util [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converting VIF {"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.979 221324 DEBUG nova.network.os_vif_util [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:49:7e,bridge_name='br-int',has_traffic_filtering=True,id=47a41dbd-4926-429e-a493-080c5c5784b1,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a41dbd-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.980 221324 DEBUG os_vif [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:49:7e,bridge_name='br-int',has_traffic_filtering=True,id=47a41dbd-4926-429e-a493-080c5c5784b1,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a41dbd-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.981 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.982 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.983 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.988 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.988 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47a41dbd-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.989 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47a41dbd-49, col_values=(('external_ids', {'iface-id': '47a41dbd-4926-429e-a493-080c5c5784b1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:49:7e', 'vm-uuid': 'f0b8c4bd-36b6-479c-9160-9adb3a86dc6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:32 np0005603622 NetworkManager[49080]: <info>  [1769850632.9955] manager: (tap47a41dbd-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/447)
Jan 31 04:10:32 np0005603622 nova_compute[221301]: 2026-01-31 09:10:32.998 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.003 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.006 221324 INFO os_vif [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:49:7e,bridge_name='br-int',has_traffic_filtering=True,id=47a41dbd-4926-429e-a493-080c5c5784b1,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a41dbd-49')#033[00m
Jan 31 04:10:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:33.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.418 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.418 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.419 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] No VIF found with MAC fa:16:3e:63:49:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.420 221324 INFO nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Using config drive#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.457 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.465 221324 DEBUG nova.network.neutron [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updated VIF entry in instance network info cache for port 47a41dbd-4926-429e-a493-080c5c5784b1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.466 221324 DEBUG nova.network.neutron [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updating instance_info_cache with network_info: [{"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:10:33 np0005603622 nova_compute[221301]: 2026-01-31 09:10:33.503 221324 DEBUG oslo_concurrency.lockutils [req-d6a1ed76-98e5-41f1-b823-064767b6d413 req-9cdee0fb-3b0d-4284-a746-be1a9720f29f fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:10:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:33.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:34 np0005603622 nova_compute[221301]: 2026-01-31 09:10:34.290 221324 INFO nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Creating config drive at /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/disk.config#033[00m
Jan 31 04:10:34 np0005603622 nova_compute[221301]: 2026-01-31 09:10:34.297 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp69bg03_1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:35.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:35 np0005603622 nova_compute[221301]: 2026-01-31 09:10:35.339 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp69bg03_1" returned: 0 in 1.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:35.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:37.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:37.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:38 np0005603622 podman[307155]: 2026-01-31 09:10:38.158996092 +0000 UTC m=+0.069632333 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 31 04:10:38 np0005603622 podman[307154]: 2026-01-31 09:10:38.207407222 +0000 UTC m=+0.120401566 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:10:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:39.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:39.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:39 np0005603622 nova_compute[221301]: 2026-01-31 09:10:39.998 221324 DEBUG nova.storage.rbd_utils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] rbd image f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.004 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/disk.config f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.030 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.168 221324 DEBUG oslo_concurrency.processutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/disk.config f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.169 221324 INFO nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Deleting local config drive /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f/disk.config because it was imported into RBD.#033[00m
Jan 31 04:10:40 np0005603622 kernel: tap47a41dbd-49: entered promiscuous mode
Jan 31 04:10:40 np0005603622 NetworkManager[49080]: <info>  [1769850640.2087] manager: (tap47a41dbd-49): new Tun device (/org/freedesktop/NetworkManager/Devices/448)
Jan 31 04:10:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:40Z|01015|binding|INFO|Claiming lport 47a41dbd-4926-429e-a493-080c5c5784b1 for this chassis.
Jan 31 04:10:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:40Z|01016|binding|INFO|47a41dbd-4926-429e-a493-080c5c5784b1: Claiming fa:16:3e:63:49:7e 10.100.0.4
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.209 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.214 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 systemd-udevd[307236]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:10:40 np0005603622 systemd-machined[190406]: New machine qemu-105-instance-000000d8.
Jan 31 04:10:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:40Z|01017|binding|INFO|Setting lport 47a41dbd-4926-429e-a493-080c5c5784b1 ovn-installed in OVS
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.236 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 NetworkManager[49080]: <info>  [1769850640.2459] device (tap47a41dbd-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:10:40 np0005603622 NetworkManager[49080]: <info>  [1769850640.2465] device (tap47a41dbd-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:10:40 np0005603622 systemd[1]: Started Virtual Machine qemu-105-instance-000000d8.
Jan 31 04:10:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:40Z|01018|binding|INFO|Setting lport 47a41dbd-4926-429e-a493-080c5c5784b1 up in Southbound
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.297 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:49:7e 10.100.0.4'], port_security=['fa:16:3e:63:49:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f0b8c4bd-36b6-479c-9160-9adb3a86dc6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08c50cf6-ab45-467a-a4d2-628200ead973', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0be57039fd34aa9a2d05d9086ccff13', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b8739db-40d8-4ddd-aaaf-640fa0a0d612', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df4ff59b-6bd6-4dbb-9483-bf928652de1a, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=47a41dbd-4926-429e-a493-080c5c5784b1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.298 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 47a41dbd-4926-429e-a493-080c5c5784b1 in datapath 08c50cf6-ab45-467a-a4d2-628200ead973 bound to our chassis#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.299 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08c50cf6-ab45-467a-a4d2-628200ead973#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.306 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eb580b52-25e8-46d8-ba1d-15b61f364e53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.307 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08c50cf6-a1 in ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.310 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08c50cf6-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.310 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f6405e70-d7e5-438e-a7b5-07c545eba53b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.311 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[eef88e2f-aed9-4bbd-9545-3197461aa92d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.323 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[4a62a901-6710-4146-a114-54e091cb6923]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.333 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab5ec5f-8496-4a21-aee6-1803ded2f1e5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.357 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[0abea146-d0c2-49c6-a6b2-f84484c2b2c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.363 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca0f895-7885-4255-b871-4b0b8a82b022]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 NetworkManager[49080]: <info>  [1769850640.3641] manager: (tap08c50cf6-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/449)
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.386 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[77551ee2-c476-443b-8fc2-b5775c77e6f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.389 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6037dd-493b-4c92-8224-441b38bfc8cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 NetworkManager[49080]: <info>  [1769850640.4011] device (tap08c50cf6-a0): carrier: link connected
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.403 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[f406f16e-a734-45ad-8b76-a98b8bdf7dd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.413 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[9b528cf7-5679-45b8-b9e8-2450f43bc57d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08c50cf6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:ae:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 305], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 989174, 'reachable_time': 20127, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307270, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.422 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[71ee1b5f-c455-4e21-af08-24ac33cfff26]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:ae8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 989174, 'tstamp': 989174}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307271, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.433 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4f2985-a35e-4ee3-a81c-8a4c48770375]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08c50cf6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:ae:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 305], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 989174, 'reachable_time': 20127, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 307272, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.449 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[db02b310-41aa-44c0-8548-0802820f64db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.490 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[ca21379c-b744-48e1-9039-21b5996b4d19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.493 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08c50cf6-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.493 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.494 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08c50cf6-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:40 np0005603622 kernel: tap08c50cf6-a0: entered promiscuous mode
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.496 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 NetworkManager[49080]: <info>  [1769850640.4972] manager: (tap08c50cf6-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/450)
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.503 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08c50cf6-a0, col_values=(('external_ids', {'iface-id': 'e8e83b87-bd31-4052-864a-9e5dbf11e897'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.504 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:40Z|01019|binding|INFO|Releasing lport e8e83b87-bd31-4052-864a-9e5dbf11e897 from this chassis (sb_readonly=0)
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.506 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08c50cf6-ab45-467a-a4d2-628200ead973.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08c50cf6-ab45-467a-a4d2-628200ead973.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.507 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7873ff3b-d240-4fc4-ade3-8dac06608176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.508 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-08c50cf6-ab45-467a-a4d2-628200ead973
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/08c50cf6-ab45-467a-a4d2-628200ead973.pid.haproxy
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 08c50cf6-ab45-467a-a4d2-628200ead973
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:10:40.509 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'env', 'PROCESS_TAG=haproxy-08c50cf6-ab45-467a-a4d2-628200ead973', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08c50cf6-ab45-467a-a4d2-628200ead973.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:10:40 np0005603622 nova_compute[221301]: 2026-01-31 09:10:40.755 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:40 np0005603622 podman[307302]: 2026-01-31 09:10:40.813543751 +0000 UTC m=+0.016621871 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:10:40 np0005603622 podman[307302]: 2026-01-31 09:10:40.919556475 +0000 UTC m=+0.122634625 container create 4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 04:10:40 np0005603622 systemd[1]: Started libpod-conmon-4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016.scope.
Jan 31 04:10:40 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:10:40 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a506396069cedfbe07c3d393a9875cf7ca3522ebd519e45259a6e900bee6002/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:10:41 np0005603622 podman[307302]: 2026-01-31 09:10:41.033046487 +0000 UTC m=+0.236124618 container init 4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.036 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850641.0356715, f0b8c4bd-36b6-479c-9160-9adb3a86dc6f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.036 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] VM Started (Lifecycle Event)#033[00m
Jan 31 04:10:41 np0005603622 podman[307302]: 2026-01-31 09:10:41.037832944 +0000 UTC m=+0.240911044 container start 4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:10:41 np0005603622 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[307358]: [NOTICE]   (307364) : New worker (307366) forked
Jan 31 04:10:41 np0005603622 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[307358]: [NOTICE]   (307364) : Loading success.
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.071 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.075 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850641.0358407, f0b8c4bd-36b6-479c-9160-9adb3a86dc6f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.076 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.117 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.121 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.151 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:10:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:41.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.488 221324 DEBUG nova.compute.manager [req-af0bacab-43bf-45e7-a30f-9e39162b667a req-03af0a82-4667-4e15-a6ba-761e67af4ca6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.489 221324 DEBUG oslo_concurrency.lockutils [req-af0bacab-43bf-45e7-a30f-9e39162b667a req-03af0a82-4667-4e15-a6ba-761e67af4ca6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.490 221324 DEBUG oslo_concurrency.lockutils [req-af0bacab-43bf-45e7-a30f-9e39162b667a req-03af0a82-4667-4e15-a6ba-761e67af4ca6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.490 221324 DEBUG oslo_concurrency.lockutils [req-af0bacab-43bf-45e7-a30f-9e39162b667a req-03af0a82-4667-4e15-a6ba-761e67af4ca6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.491 221324 DEBUG nova.compute.manager [req-af0bacab-43bf-45e7-a30f-9e39162b667a req-03af0a82-4667-4e15-a6ba-761e67af4ca6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Processing event network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.493 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.497 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769850641.4970274, f0b8c4bd-36b6-479c-9160-9adb3a86dc6f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.497 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.499 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.502 221324 INFO nova.virt.libvirt.driver [-] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Instance spawned successfully.#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.502 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.538 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.543 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.544 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.544 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.545 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.546 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.547 221324 DEBUG nova.virt.libvirt.driver [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.555 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.627 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.751 221324 INFO nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Took 16.70 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.752 221324 DEBUG nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:10:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:41.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.941 221324 INFO nova.compute.manager [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Took 18.11 seconds to build instance.#033[00m
Jan 31 04:10:41 np0005603622 nova_compute[221301]: 2026-01-31 09:10:41.982 221324 DEBUG oslo_concurrency.lockutils [None req-6654bb3d-9689-425c-a65c-f9c835a233eb b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:43.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:43 np0005603622 nova_compute[221301]: 2026-01-31 09:10:43.746 221324 DEBUG nova.compute.manager [req-8e92bca8-3a65-4fda-94f0-b50f67b9c644 req-0a446db0-9779-49d4-9def-79792804f414 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:43 np0005603622 nova_compute[221301]: 2026-01-31 09:10:43.747 221324 DEBUG oslo_concurrency.lockutils [req-8e92bca8-3a65-4fda-94f0-b50f67b9c644 req-0a446db0-9779-49d4-9def-79792804f414 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:10:43 np0005603622 nova_compute[221301]: 2026-01-31 09:10:43.747 221324 DEBUG oslo_concurrency.lockutils [req-8e92bca8-3a65-4fda-94f0-b50f67b9c644 req-0a446db0-9779-49d4-9def-79792804f414 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:10:43 np0005603622 nova_compute[221301]: 2026-01-31 09:10:43.747 221324 DEBUG oslo_concurrency.lockutils [req-8e92bca8-3a65-4fda-94f0-b50f67b9c644 req-0a446db0-9779-49d4-9def-79792804f414 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:10:43 np0005603622 nova_compute[221301]: 2026-01-31 09:10:43.747 221324 DEBUG nova.compute.manager [req-8e92bca8-3a65-4fda-94f0-b50f67b9c644 req-0a446db0-9779-49d4-9def-79792804f414 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] No waiting events found dispatching network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:10:43 np0005603622 nova_compute[221301]: 2026-01-31 09:10:43.747 221324 WARNING nova.compute.manager [req-8e92bca8-3a65-4fda-94f0-b50f67b9c644 req-0a446db0-9779-49d4-9def-79792804f414 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received unexpected event network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 for instance with vm_state active and task_state None.#033[00m
Jan 31 04:10:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:43.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:45 np0005603622 nova_compute[221301]: 2026-01-31 09:10:45.035 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:45.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:45 np0005603622 nova_compute[221301]: 2026-01-31 09:10:45.758 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:45.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:47.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:47.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:10:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:10:49 np0005603622 NetworkManager[49080]: <info>  [1769850649.1817] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/451)
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.180 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:49 np0005603622 NetworkManager[49080]: <info>  [1769850649.1829] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/452)
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.230 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:49 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:49Z|01020|binding|INFO|Releasing lport e8e83b87-bd31-4052-864a-9e5dbf11e897 from this chassis (sb_readonly=0)
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.255 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:49.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.810 221324 DEBUG nova.compute.manager [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-changed-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.810 221324 DEBUG nova.compute.manager [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Refreshing instance network info cache due to event network-changed-47a41dbd-4926-429e-a493-080c5c5784b1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.810 221324 DEBUG oslo_concurrency.lockutils [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.810 221324 DEBUG oslo_concurrency.lockutils [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:10:49 np0005603622 nova_compute[221301]: 2026-01-31 09:10:49.811 221324 DEBUG nova.network.neutron [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Refreshing network info cache for port 47a41dbd-4926-429e-a493-080c5c5784b1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:10:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:49.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:50 np0005603622 nova_compute[221301]: 2026-01-31 09:10:50.038 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:50 np0005603622 nova_compute[221301]: 2026-01-31 09:10:50.760 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:51.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:51 np0005603622 nova_compute[221301]: 2026-01-31 09:10:51.663 221324 DEBUG nova.network.neutron [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updated VIF entry in instance network info cache for port 47a41dbd-4926-429e-a493-080c5c5784b1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:10:51 np0005603622 nova_compute[221301]: 2026-01-31 09:10:51.663 221324 DEBUG nova.network.neutron [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updating instance_info_cache with network_info: [{"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:10:51 np0005603622 nova_compute[221301]: 2026-01-31 09:10:51.698 221324 DEBUG oslo_concurrency.lockutils [req-fed80ad6-e50f-4e6a-9f6e-277223fea7cf req-afaace59-004f-456d-bbdd-8e00df6ac444 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:10:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:51.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:53.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:53.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:10:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:10:55 np0005603622 nova_compute[221301]: 2026-01-31 09:10:55.040 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:10:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:55.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:10:55 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:55Z|00136|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:49:7e 10.100.0.4
Jan 31 04:10:55 np0005603622 ovn_controller[130109]: 2026-01-31T09:10:55Z|00137|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:49:7e 10.100.0.4
Jan 31 04:10:55 np0005603622 nova_compute[221301]: 2026-01-31 09:10:55.763 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:10:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:55.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:10:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:57.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:10:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:57.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:10:59.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:10:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:10:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:10:59.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:10:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:00 np0005603622 nova_compute[221301]: 2026-01-31 09:11:00.043 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:00 np0005603622 nova_compute[221301]: 2026-01-31 09:11:00.766 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:01.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:01.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:03.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:03.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:05 np0005603622 nova_compute[221301]: 2026-01-31 09:11:05.046 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:05.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:05 np0005603622 nova_compute[221301]: 2026-01-31 09:11:05.768 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:05.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:06 np0005603622 nova_compute[221301]: 2026-01-31 09:11:06.648 221324 DEBUG nova.compute.manager [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:11:06 np0005603622 nova_compute[221301]: 2026-01-31 09:11:06.704 221324 INFO nova.compute.manager [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] instance snapshotting#033[00m
Jan 31 04:11:07 np0005603622 nova_compute[221301]: 2026-01-31 09:11:07.067 221324 INFO nova.virt.libvirt.driver [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Beginning live snapshot process#033[00m
Jan 31 04:11:07 np0005603622 nova_compute[221301]: 2026-01-31 09:11:07.279 221324 DEBUG nova.virt.libvirt.imagebackend [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] No parent info for 37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 31 04:11:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:07.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:07 np0005603622 nova_compute[221301]: 2026-01-31 09:11:07.480 221324 DEBUG nova.storage.rbd_utils [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] creating snapshot(b48aa0e1afac41fab03b582b438ca779) on rbd image(f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 04:11:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:07.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e414 e414: 3 total, 3 up, 3 in
Jan 31 04:11:08 np0005603622 nova_compute[221301]: 2026-01-31 09:11:08.285 221324 DEBUG nova.storage.rbd_utils [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] cloning vms/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk@b48aa0e1afac41fab03b582b438ca779 to images/f398889c-e272-449f-a032-36096e95d18f clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 04:11:08 np0005603622 nova_compute[221301]: 2026-01-31 09:11:08.584 221324 DEBUG nova.storage.rbd_utils [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] flattening images/f398889c-e272-449f-a032-36096e95d18f flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 31 04:11:09 np0005603622 podman[307666]: 2026-01-31 09:11:09.160191574 +0000 UTC m=+0.071354789 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:11:09 np0005603622 podman[307665]: 2026-01-31 09:11:09.204018712 +0000 UTC m=+0.114491188 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 04:11:09 np0005603622 nova_compute[221301]: 2026-01-31 09:11:09.310 221324 DEBUG nova.storage.rbd_utils [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] removing snapshot(b48aa0e1afac41fab03b582b438ca779) on rbd image(f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 31 04:11:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:09.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e414 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:09.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:10 np0005603622 nova_compute[221301]: 2026-01-31 09:11:10.050 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e415 e415: 3 total, 3 up, 3 in
Jan 31 04:11:10 np0005603622 nova_compute[221301]: 2026-01-31 09:11:10.186 221324 DEBUG nova.storage.rbd_utils [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] creating snapshot(snap) on rbd image(f398889c-e272-449f-a032-36096e95d18f) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 31 04:11:10 np0005603622 nova_compute[221301]: 2026-01-31 09:11:10.770 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e416 e416: 3 total, 3 up, 3 in
Jan 31 04:11:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:11.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:11.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:13.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:13 np0005603622 nova_compute[221301]: 2026-01-31 09:11:13.490 221324 INFO nova.virt.libvirt.driver [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Snapshot image upload complete#033[00m
Jan 31 04:11:13 np0005603622 nova_compute[221301]: 2026-01-31 09:11:13.491 221324 INFO nova.compute.manager [None req-c0060399-211c-449e-802e-eda256b7a469 b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Took 6.78 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 31 04:11:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:13.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:15 np0005603622 nova_compute[221301]: 2026-01-31 09:11:15.053 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:15.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:15 np0005603622 nova_compute[221301]: 2026-01-31 09:11:15.773 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:15.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:17.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:17.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 e417: 3 total, 3 up, 3 in
Jan 31 04:11:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:19.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:19.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:20 np0005603622 nova_compute[221301]: 2026-01-31 09:11:20.057 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:20 np0005603622 nova_compute[221301]: 2026-01-31 09:11:20.281 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:11:20.281 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=94, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=93) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:11:20 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:11:20.283 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:11:20 np0005603622 nova_compute[221301]: 2026-01-31 09:11:20.775 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:21.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:21 np0005603622 nova_compute[221301]: 2026-01-31 09:11:21.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:21 np0005603622 nova_compute[221301]: 2026-01-31 09:11:21.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:11:21 np0005603622 nova_compute[221301]: 2026-01-31 09:11:21.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:11:21 np0005603622 nova_compute[221301]: 2026-01-31 09:11:21.604 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:11:21 np0005603622 nova_compute[221301]: 2026-01-31 09:11:21.604 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquired lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:11:21 np0005603622 nova_compute[221301]: 2026-01-31 09:11:21.605 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 31 04:11:21 np0005603622 nova_compute[221301]: 2026-01-31 09:11:21.605 221324 DEBUG nova.objects.instance [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f0b8c4bd-36b6-479c-9160-9adb3a86dc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:11:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:21.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:23.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:23 np0005603622 nova_compute[221301]: 2026-01-31 09:11:23.514 221324 DEBUG nova.network.neutron [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updating instance_info_cache with network_info: [{"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:11:23 np0005603622 nova_compute[221301]: 2026-01-31 09:11:23.535 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Releasing lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:11:23 np0005603622 nova_compute[221301]: 2026-01-31 09:11:23.536 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 31 04:11:23 np0005603622 nova_compute[221301]: 2026-01-31 09:11:23.537 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:23 np0005603622 nova_compute[221301]: 2026-01-31 09:11:23.537 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:23 np0005603622 nova_compute[221301]: 2026-01-31 09:11:23.538 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:11:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:23.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:25 np0005603622 nova_compute[221301]: 2026-01-31 09:11:25.061 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:25.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:25 np0005603622 nova_compute[221301]: 2026-01-31 09:11:25.776 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:25.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:11:26.286 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '94'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:11:26 np0005603622 nova_compute[221301]: 2026-01-31 09:11:26.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:26 np0005603622 nova_compute[221301]: 2026-01-31 09:11:26.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:27.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:27.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.421 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.422 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.422 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.422 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.422 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:11:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1411008260' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.869 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.958 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:11:28 np0005603622 nova_compute[221301]: 2026-01-31 09:11:28.960 221324 DEBUG nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] skipping disk for instance-000000d8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.139 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.140 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4036MB free_disk=20.876209259033203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.141 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.141 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.221 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance f0b8c4bd-36b6-479c-9160-9adb3a86dc6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.222 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.222 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.270 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:11:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:29.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:11:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1654214035' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.721 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.728 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.802 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.840 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:11:29 np0005603622 nova_compute[221301]: 2026-01-31 09:11:29.841 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:29.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:30 np0005603622 nova_compute[221301]: 2026-01-31 09:11:30.065 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:30 np0005603622 nova_compute[221301]: 2026-01-31 09:11:30.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:30 np0005603622 nova_compute[221301]: 2026-01-31 09:11:30.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:30 np0005603622 nova_compute[221301]: 2026-01-31 09:11:30.779 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:11:30.930 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:11:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:11:30.931 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:11:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:11:30.932 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:11:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:31.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:31 np0005603622 nova_compute[221301]: 2026-01-31 09:11:31.419 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:31.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:33.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:33.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:35 np0005603622 nova_compute[221301]: 2026-01-31 09:11:35.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:35.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:35 np0005603622 nova_compute[221301]: 2026-01-31 09:11:35.782 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:35.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:37.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:37.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #181. Immutable memtables: 0.
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.098695) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 115] Flushing memtable with next log file: 181
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698098769, "job": 115, "event": "flush_started", "num_memtables": 1, "num_entries": 1771, "num_deletes": 253, "total_data_size": 4143877, "memory_usage": 4192760, "flush_reason": "Manual Compaction"}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 115] Level-0 flush table #182: started
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698239398, "cf_name": "default", "job": 115, "event": "table_file_creation", "file_number": 182, "file_size": 1654898, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 87627, "largest_seqno": 89393, "table_properties": {"data_size": 1649225, "index_size": 2812, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15106, "raw_average_key_size": 21, "raw_value_size": 1636655, "raw_average_value_size": 2298, "num_data_blocks": 125, "num_entries": 712, "num_filter_entries": 712, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850546, "oldest_key_time": 1769850546, "file_creation_time": 1769850698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 182, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 115] Flush lasted 140771 microseconds, and 6563 cpu microseconds.
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.239475) [db/flush_job.cc:967] [default] [JOB 115] Level-0 flush table #182: 1654898 bytes OK
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.239499) [db/memtable_list.cc:519] [default] Level-0 commit table #182 started
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.244092) [db/memtable_list.cc:722] [default] Level-0 commit table #182: memtable #1 done
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.244119) EVENT_LOG_v1 {"time_micros": 1769850698244111, "job": 115, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.244142) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 115] Try to delete WAL files size 4135859, prev total WAL file size 4135859, number of live WAL files 2.
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000178.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.245253) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033303038' seq:72057594037927935, type:22 .. '6D6772737461740033323631' seq:0, type:0; will stop at (end)
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 116] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 115 Base level 0, inputs: [182(1616KB)], [180(12MB)]
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698245406, "job": 116, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [182], "files_L6": [180], "score": -1, "input_data_size": 15100761, "oldest_snapshot_seqno": -1}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 116] Generated table #183: 10930 keys, 12261669 bytes, temperature: kUnknown
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698507030, "cf_name": "default", "job": 116, "event": "table_file_creation", "file_number": 183, "file_size": 12261669, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12194024, "index_size": 39244, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27333, "raw_key_size": 287783, "raw_average_key_size": 26, "raw_value_size": 12006311, "raw_average_value_size": 1098, "num_data_blocks": 1486, "num_entries": 10930, "num_filter_entries": 10930, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 183, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.507308) [db/compaction/compaction_job.cc:1663] [default] [JOB 116] Compacted 1@0 + 1@6 files to L6 => 12261669 bytes
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.511586) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 57.7 rd, 46.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.8 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(16.5) write-amplify(7.4) OK, records in: 11390, records dropped: 460 output_compression: NoCompression
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.511607) EVENT_LOG_v1 {"time_micros": 1769850698511597, "job": 116, "event": "compaction_finished", "compaction_time_micros": 261690, "compaction_time_cpu_micros": 46778, "output_level": 6, "num_output_files": 1, "total_output_size": 12261669, "num_input_records": 11390, "num_output_records": 10930, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000182.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698511937, "job": 116, "event": "table_file_deletion", "file_number": 182}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000180.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850698513216, "job": 116, "event": "table_file_deletion", "file_number": 180}
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.245095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.513423) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.513434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.513437) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.513440) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:11:38.513443) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:11:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:39.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:39.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:40 np0005603622 nova_compute[221301]: 2026-01-31 09:11:40.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:40 np0005603622 podman[307792]: 2026-01-31 09:11:40.176467442 +0000 UTC m=+0.089086028 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:11:40 np0005603622 podman[307791]: 2026-01-31 09:11:40.178556167 +0000 UTC m=+0.096214826 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:11:40 np0005603622 nova_compute[221301]: 2026-01-31 09:11:40.783 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:41.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:41.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:43.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:43.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:45 np0005603622 nova_compute[221301]: 2026-01-31 09:11:45.073 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:45.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:45 np0005603622 nova_compute[221301]: 2026-01-31 09:11:45.786 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:45.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:47.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:47.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:49.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:49.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:50 np0005603622 nova_compute[221301]: 2026-01-31 09:11:50.077 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:50 np0005603622 nova_compute[221301]: 2026-01-31 09:11:50.788 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:51.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:51.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:11:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:53.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:11:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:53.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:11:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:11:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:11:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:55 np0005603622 nova_compute[221301]: 2026-01-31 09:11:55.080 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:11:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:55.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:11:55 np0005603622 nova_compute[221301]: 2026-01-31 09:11:55.790 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:55.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:57.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:57.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:58 np0005603622 nova_compute[221301]: 2026-01-31 09:11:58.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:11:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:11:59.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:11:59 np0005603622 ovn_controller[130109]: 2026-01-31T09:11:59Z|01021|binding|INFO|Releasing lport e8e83b87-bd31-4052-864a-9e5dbf11e897 from this chassis (sb_readonly=0)
Jan 31 04:11:59 np0005603622 nova_compute[221301]: 2026-01-31 09:11:59.585 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:11:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:11:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:11:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:11:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:11:59.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:00 np0005603622 nova_compute[221301]: 2026-01-31 09:12:00.082 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:12:00 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:12:00 np0005603622 nova_compute[221301]: 2026-01-31 09:12:00.793 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:01.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:01.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:12:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 67K writes, 269K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.04 MB/s#012Cumulative WAL: 67K writes, 24K syncs, 2.73 writes per sync, written: 0.26 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6455 writes, 25K keys, 6455 commit groups, 1.0 writes per commit group, ingest: 27.90 MB, 0.05 MB/s#012Interval WAL: 6456 writes, 2525 syncs, 2.56 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:12:02 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e418 e418: 3 total, 3 up, 3 in
Jan 31 04:12:03 np0005603622 nova_compute[221301]: 2026-01-31 09:12:03.360 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:03 np0005603622 nova_compute[221301]: 2026-01-31 09:12:03.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:03 np0005603622 nova_compute[221301]: 2026-01-31 09:12:03.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:12:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:03.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:03.661 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=95, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=94) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:12:03 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:03.662 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:12:03 np0005603622 nova_compute[221301]: 2026-01-31 09:12:03.663 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:03.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e419 e419: 3 total, 3 up, 3 in
Jan 31 04:12:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:05 np0005603622 nova_compute[221301]: 2026-01-31 09:12:05.086 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:05.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:05 np0005603622 nova_compute[221301]: 2026-01-31 09:12:05.795 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e420 e420: 3 total, 3 up, 3 in
Jan 31 04:12:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:05.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:06 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:06.666 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '95'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:07 np0005603622 nova_compute[221301]: 2026-01-31 09:12:07.376 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:07.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:07 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e421 e421: 3 total, 3 up, 3 in
Jan 31 04:12:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:07.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:09 np0005603622 nova_compute[221301]: 2026-01-31 09:12:09.420 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:09 np0005603622 nova_compute[221301]: 2026-01-31 09:12:09.420 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:12:09 np0005603622 nova_compute[221301]: 2026-01-31 09:12:09.446 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:12:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:09.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:09 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:09.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:10 np0005603622 nova_compute[221301]: 2026-01-31 09:12:10.090 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:10 np0005603622 nova_compute[221301]: 2026-01-31 09:12:10.798 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:11 np0005603622 podman[308020]: 2026-01-31 09:12:11.139765318 +0000 UTC m=+0.060199743 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:12:11 np0005603622 podman[308019]: 2026-01-31 09:12:11.199634242 +0000 UTC m=+0.120031196 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 31 04:12:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:11.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:11.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e422 e422: 3 total, 3 up, 3 in
Jan 31 04:12:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:13.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:13.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e423 e423: 3 total, 3 up, 3 in
Jan 31 04:12:15 np0005603622 nova_compute[221301]: 2026-01-31 09:12:15.093 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:15.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:15 np0005603622 nova_compute[221301]: 2026-01-31 09:12:15.801 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:15.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.749 221324 DEBUG nova.compute.manager [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-changed-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.750 221324 DEBUG nova.compute.manager [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Refreshing instance network info cache due to event network-changed-47a41dbd-4926-429e-a493-080c5c5784b1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.750 221324 DEBUG oslo_concurrency.lockutils [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.750 221324 DEBUG oslo_concurrency.lockutils [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.751 221324 DEBUG nova.network.neutron [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Refreshing network info cache for port 47a41dbd-4926-429e-a493-080c5c5784b1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.847 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.848 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.849 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.849 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.849 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.851 221324 INFO nova.compute.manager [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Terminating instance#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.852 221324 DEBUG nova.compute.manager [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:12:16 np0005603622 kernel: tap47a41dbd-49 (unregistering): left promiscuous mode
Jan 31 04:12:16 np0005603622 NetworkManager[49080]: <info>  [1769850736.9255] device (tap47a41dbd-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:16 np0005603622 ovn_controller[130109]: 2026-01-31T09:12:16Z|01022|binding|INFO|Releasing lport 47a41dbd-4926-429e-a493-080c5c5784b1 from this chassis (sb_readonly=0)
Jan 31 04:12:16 np0005603622 ovn_controller[130109]: 2026-01-31T09:12:16Z|01023|binding|INFO|Setting lport 47a41dbd-4926-429e-a493-080c5c5784b1 down in Southbound
Jan 31 04:12:16 np0005603622 ovn_controller[130109]: 2026-01-31T09:12:16Z|01024|binding|INFO|Removing iface tap47a41dbd-49 ovn-installed in OVS
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.942 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:16.947 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:49:7e 10.100.0.4'], port_security=['fa:16:3e:63:49:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f0b8c4bd-36b6-479c-9160-9adb3a86dc6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08c50cf6-ab45-467a-a4d2-628200ead973', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0be57039fd34aa9a2d05d9086ccff13', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b8739db-40d8-4ddd-aaaf-640fa0a0d612', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=df4ff59b-6bd6-4dbb-9483-bf928652de1a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=47a41dbd-4926-429e-a493-080c5c5784b1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:12:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:16.949 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 47a41dbd-4926-429e-a493-080c5c5784b1 in datapath 08c50cf6-ab45-467a-a4d2-628200ead973 unbound from our chassis#033[00m
Jan 31 04:12:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:16.951 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08c50cf6-ab45-467a-a4d2-628200ead973, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:12:16 np0005603622 nova_compute[221301]: 2026-01-31 09:12:16.952 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:16.955 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f85b36-6a6f-4648-bf65-a404eadee003]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:16 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:16.956 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 namespace which is not needed anymore#033[00m
Jan 31 04:12:16 np0005603622 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000d8.scope: Deactivated successfully.
Jan 31 04:12:16 np0005603622 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000d8.scope: Consumed 18.024s CPU time.
Jan 31 04:12:16 np0005603622 systemd-machined[190406]: Machine qemu-105-instance-000000d8 terminated.
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.077 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.084 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[307358]: [NOTICE]   (307364) : haproxy version is 2.8.14-c23fe91
Jan 31 04:12:17 np0005603622 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[307358]: [NOTICE]   (307364) : path to executable is /usr/sbin/haproxy
Jan 31 04:12:17 np0005603622 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[307358]: [WARNING]  (307364) : Exiting Master process...
Jan 31 04:12:17 np0005603622 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[307358]: [ALERT]    (307364) : Current worker (307366) exited with code 143 (Terminated)
Jan 31 04:12:17 np0005603622 neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973[307358]: [WARNING]  (307364) : All workers exited. Exiting... (0)
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.102 221324 INFO nova.virt.libvirt.driver [-] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Instance destroyed successfully.#033[00m
Jan 31 04:12:17 np0005603622 systemd[1]: libpod-4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016.scope: Deactivated successfully.
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.103 221324 DEBUG nova.objects.instance [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lazy-loading 'resources' on Instance uuid f0b8c4bd-36b6-479c-9160-9adb3a86dc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:12:17 np0005603622 podman[308086]: 2026-01-31 09:12:17.109385439 +0000 UTC m=+0.060702247 container died 4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.135 221324 DEBUG nova.virt.libvirt.vif [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:10:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1489078581',display_name='tempest-TestSnapshotPattern-server-1489078581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1489078581',id=216,image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKpOUB83OLuX57XNlwrNvNm9A+WqJpxKoMPAfeJaapb6yUdxSW7lKu+x7yQPy1sLXzgH0zh++G8qQAE8XC1z1HkX9voX2EUsKpZ4pSJBKy01SGRO5BQ6pTGTKexhMNezAw==',key_name='tempest-TestSnapshotPattern-475590621',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:10:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c0be57039fd34aa9a2d05d9086ccff13',ramdisk_id='',reservation_id='r-3y458y73',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-418405266',owner_user_name='tempest-TestSnapshotPattern-418405266-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:11:13Z,user_data=None,user_id='b7233f93367f4dcd8eb2b6b115680192',uuid=f0b8c4bd-36b6-479c-9160-9adb3a86dc6f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.136 221324 DEBUG nova.network.os_vif_util [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converting VIF {"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:12:17 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016-userdata-shm.mount: Deactivated successfully.
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.137 221324 DEBUG nova.network.os_vif_util [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:63:49:7e,bridge_name='br-int',has_traffic_filtering=True,id=47a41dbd-4926-429e-a493-080c5c5784b1,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a41dbd-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.138 221324 DEBUG os_vif [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:49:7e,bridge_name='br-int',has_traffic_filtering=True,id=47a41dbd-4926-429e-a493-080c5c5784b1,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a41dbd-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:12:17 np0005603622 systemd[1]: var-lib-containers-storage-overlay-9a506396069cedfbe07c3d393a9875cf7ca3522ebd519e45259a6e900bee6002-merged.mount: Deactivated successfully.
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.142 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.142 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47a41dbd-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.144 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.146 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 podman[308086]: 2026-01-31 09:12:17.148819112 +0000 UTC m=+0.100135910 container cleanup 4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.150 221324 INFO os_vif [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:49:7e,bridge_name='br-int',has_traffic_filtering=True,id=47a41dbd-4926-429e-a493-080c5c5784b1,network=Network(08c50cf6-ab45-467a-a4d2-628200ead973),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a41dbd-49')#033[00m
Jan 31 04:12:17 np0005603622 systemd[1]: libpod-conmon-4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016.scope: Deactivated successfully.
Jan 31 04:12:17 np0005603622 podman[308126]: 2026-01-31 09:12:17.221895274 +0000 UTC m=+0.051350248 container remove 4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.226 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a46c14ba-599c-4eff-a342-5f69407942b6]: (4, ('Sat Jan 31 09:12:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 (4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016)\n4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016\nSat Jan 31 09:12:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 (4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016)\n4564e6b6570500a3472d4ce86c38865661902b0d2dc665f685542fc766ead016\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.228 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5a191c92-2c4b-4eeb-95fa-afd22a20f7ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.229 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08c50cf6-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:17 np0005603622 kernel: tap08c50cf6-a0: left promiscuous mode
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.231 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.237 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.238 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.241 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b76901-72eb-4c00-a660-5c354a60d16a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.259 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9a4653-d4e4-4d4b-9003-ea735bb46121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.260 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a9210f-fd0c-4c87-9bd3-a336c2dc4713]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.277 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[594eff89-fed4-4503-a412-d94e740940f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 989169, 'reachable_time': 30398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308157, 'error': None, 'target': 'ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.281 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08c50cf6-ab45-467a-a4d2-628200ead973 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:12:17 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:17.281 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[41611450-8f77-4a48-9179-c7047fadc073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:12:17 np0005603622 systemd[1]: run-netns-ovnmeta\x2d08c50cf6\x2dab45\x2d467a\x2da4d2\x2d628200ead973.mount: Deactivated successfully.
Jan 31 04:12:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:17.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.659 221324 INFO nova.virt.libvirt.driver [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Deleting instance files /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_del#033[00m
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.660 221324 INFO nova.virt.libvirt.driver [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Deletion of /var/lib/nova/instances/f0b8c4bd-36b6-479c-9160-9adb3a86dc6f_del complete#033[00m
Jan 31 04:12:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:17.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:17 np0005603622 nova_compute[221301]: 2026-01-31 09:12:17.986 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e424 e424: 3 total, 3 up, 3 in
Jan 31 04:12:18 np0005603622 nova_compute[221301]: 2026-01-31 09:12:18.188 221324 DEBUG nova.compute.manager [req-07298e8b-8c95-481d-afa7-af7555d2ab95 req-617250d6-0132-4273-a988-ed0ea7daab11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-vif-unplugged-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:18 np0005603622 nova_compute[221301]: 2026-01-31 09:12:18.189 221324 DEBUG oslo_concurrency.lockutils [req-07298e8b-8c95-481d-afa7-af7555d2ab95 req-617250d6-0132-4273-a988-ed0ea7daab11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:18 np0005603622 nova_compute[221301]: 2026-01-31 09:12:18.190 221324 DEBUG oslo_concurrency.lockutils [req-07298e8b-8c95-481d-afa7-af7555d2ab95 req-617250d6-0132-4273-a988-ed0ea7daab11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:18 np0005603622 nova_compute[221301]: 2026-01-31 09:12:18.190 221324 DEBUG oslo_concurrency.lockutils [req-07298e8b-8c95-481d-afa7-af7555d2ab95 req-617250d6-0132-4273-a988-ed0ea7daab11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:18 np0005603622 nova_compute[221301]: 2026-01-31 09:12:18.191 221324 DEBUG nova.compute.manager [req-07298e8b-8c95-481d-afa7-af7555d2ab95 req-617250d6-0132-4273-a988-ed0ea7daab11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] No waiting events found dispatching network-vif-unplugged-47a41dbd-4926-429e-a493-080c5c5784b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:12:18 np0005603622 nova_compute[221301]: 2026-01-31 09:12:18.191 221324 DEBUG nova.compute.manager [req-07298e8b-8c95-481d-afa7-af7555d2ab95 req-617250d6-0132-4273-a988-ed0ea7daab11 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-vif-unplugged-47a41dbd-4926-429e-a493-080c5c5784b1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:12:19 np0005603622 nova_compute[221301]: 2026-01-31 09:12:19.212 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Triggering sync for uuid f0b8c4bd-36b6-479c-9160-9adb3a86dc6f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 31 04:12:19 np0005603622 nova_compute[221301]: 2026-01-31 09:12:19.213 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:19 np0005603622 nova_compute[221301]: 2026-01-31 09:12:19.214 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:19 np0005603622 nova_compute[221301]: 2026-01-31 09:12:19.252 221324 INFO nova.compute.manager [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Took 2.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:12:19 np0005603622 nova_compute[221301]: 2026-01-31 09:12:19.253 221324 DEBUG oslo.service.loopingcall [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:12:19 np0005603622 nova_compute[221301]: 2026-01-31 09:12:19.253 221324 DEBUG nova.compute.manager [-] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:12:19 np0005603622 nova_compute[221301]: 2026-01-31 09:12:19.253 221324 DEBUG nova.network.neutron [-] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:12:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:19.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:19.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.300 221324 DEBUG nova.compute.manager [req-25061c12-6990-4366-801f-627e8407a2cc req-2db14bfe-d84b-4c41-8b01-9d725b5c09f6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.301 221324 DEBUG oslo_concurrency.lockutils [req-25061c12-6990-4366-801f-627e8407a2cc req-2db14bfe-d84b-4c41-8b01-9d725b5c09f6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.301 221324 DEBUG oslo_concurrency.lockutils [req-25061c12-6990-4366-801f-627e8407a2cc req-2db14bfe-d84b-4c41-8b01-9d725b5c09f6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.302 221324 DEBUG oslo_concurrency.lockutils [req-25061c12-6990-4366-801f-627e8407a2cc req-2db14bfe-d84b-4c41-8b01-9d725b5c09f6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.302 221324 DEBUG nova.compute.manager [req-25061c12-6990-4366-801f-627e8407a2cc req-2db14bfe-d84b-4c41-8b01-9d725b5c09f6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] No waiting events found dispatching network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.302 221324 WARNING nova.compute.manager [req-25061c12-6990-4366-801f-627e8407a2cc req-2db14bfe-d84b-4c41-8b01-9d725b5c09f6 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received unexpected event network-vif-plugged-47a41dbd-4926-429e-a493-080c5c5784b1 for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.382 221324 DEBUG nova.network.neutron [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updated VIF entry in instance network info cache for port 47a41dbd-4926-429e-a493-080c5c5784b1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.383 221324 DEBUG nova.network.neutron [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updating instance_info_cache with network_info: [{"id": "47a41dbd-4926-429e-a493-080c5c5784b1", "address": "fa:16:3e:63:49:7e", "network": {"id": "08c50cf6-ab45-467a-a4d2-628200ead973", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1952053028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0be57039fd34aa9a2d05d9086ccff13", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a41dbd-49", "ovs_interfaceid": "47a41dbd-4926-429e-a493-080c5c5784b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.387 221324 DEBUG nova.network.neutron [-] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.423 221324 DEBUG oslo_concurrency.lockutils [req-df02f29a-7cc7-49d5-9896-b1185fa03cb7 req-9ea9548f-0cdf-4012-a984-21962d915678 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.424 221324 INFO nova.compute.manager [-] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Took 1.17 seconds to deallocate network for instance.#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.501 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.502 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.584 221324 DEBUG oslo_concurrency.processutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:20 np0005603622 nova_compute[221301]: 2026-01-31 09:12:20.804 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:12:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2230952600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.019 221324 DEBUG oslo_concurrency.processutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.027 221324 DEBUG nova.compute.provider_tree [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.075 221324 DEBUG nova.scheduler.client.report [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.103 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.134 221324 INFO nova.scheduler.client.report [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Deleted allocations for instance f0b8c4bd-36b6-479c-9160-9adb3a86dc6f#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.217 221324 DEBUG oslo_concurrency.lockutils [None req-28534763-96e5-435c-bd71-c9aa03cbd42d b7233f93367f4dcd8eb2b6b115680192 c0be57039fd34aa9a2d05d9086ccff13 - - default default] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.218 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.219 221324 INFO nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 31 04:12:21 np0005603622 nova_compute[221301]: 2026-01-31 09:12:21.219 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "f0b8c4bd-36b6-479c-9160-9adb3a86dc6f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:21.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:21.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:22 np0005603622 nova_compute[221301]: 2026-01-31 09:12:22.145 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:22 np0005603622 nova_compute[221301]: 2026-01-31 09:12:22.411 221324 DEBUG nova.compute.manager [req-bf736194-1069-4a4b-901b-2499552d1e4c req-f4ef6ed7-7bb8-4457-ae35-566b72d521ea fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Received event network-vif-deleted-47a41dbd-4926-429e-a493-080c5c5784b1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:12:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 e425: 3 total, 3 up, 3 in
Jan 31 04:12:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:23.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:23 np0005603622 nova_compute[221301]: 2026-01-31 09:12:23.656 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:23 np0005603622 nova_compute[221301]: 2026-01-31 09:12:23.656 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:12:23 np0005603622 nova_compute[221301]: 2026-01-31 09:12:23.657 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:12:23 np0005603622 nova_compute[221301]: 2026-01-31 09:12:23.674 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:12:23 np0005603622 nova_compute[221301]: 2026-01-31 09:12:23.674 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:23.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:24 np0005603622 nova_compute[221301]: 2026-01-31 09:12:24.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:24 np0005603622 nova_compute[221301]: 2026-01-31 09:12:24.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:12:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:25 np0005603622 nova_compute[221301]: 2026-01-31 09:12:25.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:25.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:25 np0005603622 nova_compute[221301]: 2026-01-31 09:12:25.806 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:25.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:27 np0005603622 nova_compute[221301]: 2026-01-31 09:12:27.148 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:12:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.0 total, 600.0 interval#012Cumulative writes: 18K writes, 89K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.03 MB/s#012Cumulative WAL: 18K writes, 18K syncs, 1.00 writes per sync, written: 0.18 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1634 writes, 7725 keys, 1634 commit groups, 1.0 writes per commit group, ingest: 16.15 MB, 0.03 MB/s#012Interval WAL: 1634 writes, 1634 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     36.3      3.08              0.36        58    0.053       0      0       0.0       0.0#012  L6      1/0   11.69 MB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   5.3     85.8     73.4      8.09              1.87        57    0.142    445K    30K       0.0       0.0#012 Sum      1/0   11.69 MB   0.0      0.7     0.1      0.6       0.7      0.1       0.0   6.3     62.1     63.2     11.17              2.23       115    0.097    445K    30K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     82.5     80.9      0.82              0.20        10    0.082     55K   2551       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   0.0     85.8     73.4      8.09              1.87        57    0.142    445K    30K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     36.3      3.08              0.36        57    0.054       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6600.0 total, 600.0 interval#012Flush(GB): cumulative 0.109, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.69 GB write, 0.11 MB/s write, 0.68 GB read, 0.11 MB/s read, 11.2 seconds#012Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 74.68 MB table_size: 0 occupancy: 18446744073709551615 collections: 12 last_copies: 0 last_secs: 0.000657 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(4233,71.50 MB,23.5207%) FilterBlock(115,1.20 MB,0.394776%) IndexBlock(115,1.98 MB,0.651485%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:12:27 np0005603622 nova_compute[221301]: 2026-01-31 09:12:27.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:27.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:27.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.237 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.417 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.417 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.418 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.418 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.419 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:12:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1546093110' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:12:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:29 np0005603622 nova_compute[221301]: 2026-01-31 09:12:29.865 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:29.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.103 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.106 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4247MB free_disk=20.96752166748047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.107 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.107 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.172 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.172 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.204 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.225 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.226 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.244 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.272 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.300 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:12:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:12:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2043448308' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.767 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.776 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.808 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.816 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.853 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:12:30 np0005603622 nova_compute[221301]: 2026-01-31 09:12:30.854 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:30.930 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:12:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:30.931 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:12:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:30.931 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:12:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:31.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:31.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:32 np0005603622 nova_compute[221301]: 2026-01-31 09:12:32.098 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769850737.0956564, f0b8c4bd-36b6-479c-9160-9adb3a86dc6f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:12:32 np0005603622 nova_compute[221301]: 2026-01-31 09:12:32.098 221324 INFO nova.compute.manager [-] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:12:32 np0005603622 nova_compute[221301]: 2026-01-31 09:12:32.122 221324 DEBUG nova.compute.manager [None req-3a9a6471-0b52-45ff-af2b-a7554fb672b9 - - - - - -] [instance: f0b8c4bd-36b6-479c-9160-9adb3a86dc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:12:32 np0005603622 nova_compute[221301]: 2026-01-31 09:12:32.151 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:32 np0005603622 nova_compute[221301]: 2026-01-31 09:12:32.855 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:33 np0005603622 nova_compute[221301]: 2026-01-31 09:12:33.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:12:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:33.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:33.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:35.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:35 np0005603622 nova_compute[221301]: 2026-01-31 09:12:35.810 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:35.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #184. Immutable memtables: 0.
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.263633) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 117] Flushing memtable with next log file: 184
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756263778, "job": 117, "event": "flush_started", "num_memtables": 1, "num_entries": 925, "num_deletes": 254, "total_data_size": 1726565, "memory_usage": 1753992, "flush_reason": "Manual Compaction"}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 117] Level-0 flush table #185: started
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756271005, "cf_name": "default", "job": 117, "event": "table_file_creation", "file_number": 185, "file_size": 1137546, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 89398, "largest_seqno": 90318, "table_properties": {"data_size": 1133183, "index_size": 2014, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9995, "raw_average_key_size": 20, "raw_value_size": 1124337, "raw_average_value_size": 2275, "num_data_blocks": 88, "num_entries": 494, "num_filter_entries": 494, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850699, "oldest_key_time": 1769850699, "file_creation_time": 1769850756, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 185, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 117] Flush lasted 7340 microseconds, and 2723 cpu microseconds.
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.271051) [db/flush_job.cc:967] [default] [JOB 117] Level-0 flush table #185: 1137546 bytes OK
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.271064) [db/memtable_list.cc:519] [default] Level-0 commit table #185 started
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.273073) [db/memtable_list.cc:722] [default] Level-0 commit table #185: memtable #1 done
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.273110) EVENT_LOG_v1 {"time_micros": 1769850756273100, "job": 117, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.273135) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 117] Try to delete WAL files size 1721864, prev total WAL file size 1721864, number of live WAL files 2.
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000181.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.273920) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037373831' seq:72057594037927935, type:22 .. '7061786F730038303333' seq:0, type:0; will stop at (end)
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 118] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 117 Base level 0, inputs: [185(1110KB)], [183(11MB)]
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756273985, "job": 118, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [185], "files_L6": [183], "score": -1, "input_data_size": 13399215, "oldest_snapshot_seqno": -1}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 118] Generated table #186: 10901 keys, 11469761 bytes, temperature: kUnknown
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756358923, "cf_name": "default", "job": 118, "event": "table_file_creation", "file_number": 186, "file_size": 11469761, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11402902, "index_size": 38522, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27269, "raw_key_size": 287997, "raw_average_key_size": 26, "raw_value_size": 11216177, "raw_average_value_size": 1028, "num_data_blocks": 1449, "num_entries": 10901, "num_filter_entries": 10901, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850756, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 186, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.359439) [db/compaction/compaction_job.cc:1663] [default] [JOB 118] Compacted 1@0 + 1@6 files to L6 => 11469761 bytes
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.361006) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.5 rd, 134.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 11.7 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(21.9) write-amplify(10.1) OK, records in: 11424, records dropped: 523 output_compression: NoCompression
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.361037) EVENT_LOG_v1 {"time_micros": 1769850756361023, "job": 118, "event": "compaction_finished", "compaction_time_micros": 85087, "compaction_time_cpu_micros": 42222, "output_level": 6, "num_output_files": 1, "total_output_size": 11469761, "num_input_records": 11424, "num_output_records": 10901, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000185.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756361428, "job": 118, "event": "table_file_deletion", "file_number": 185}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000183.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850756363647, "job": 118, "event": "table_file_deletion", "file_number": 183}
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.273773) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.363691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.363698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.363702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.363706) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:36 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:12:36.363710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:12:37 np0005603622 nova_compute[221301]: 2026-01-31 09:12:37.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:37.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:37.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:39.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:39.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:40 np0005603622 nova_compute[221301]: 2026-01-31 09:12:40.811 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:41.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:42.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:42 np0005603622 podman[308228]: 2026-01-31 09:12:42.114244685 +0000 UTC m=+0.044863777 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 04:12:42 np0005603622 nova_compute[221301]: 2026-01-31 09:12:42.156 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:42 np0005603622 podman[308227]: 2026-01-31 09:12:42.189526047 +0000 UTC m=+0.116039831 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 31 04:12:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:44.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:45.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:45 np0005603622 nova_compute[221301]: 2026-01-31 09:12:45.813 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:46.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:47 np0005603622 nova_compute[221301]: 2026-01-31 09:12:47.159 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:47 np0005603622 nova_compute[221301]: 2026-01-31 09:12:47.451 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:47.453 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=96, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=95) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:12:47 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:47.455 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:12:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:47.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:48.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:49.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:50.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:50 np0005603622 nova_compute[221301]: 2026-01-31 09:12:50.815 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:51.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:52.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:52 np0005603622 nova_compute[221301]: 2026-01-31 09:12:52.162 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:53.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:54.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:54 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:12:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:55.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:55 np0005603622 nova_compute[221301]: 2026-01-31 09:12:55.817 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:12:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:56.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:12:57 np0005603622 nova_compute[221301]: 2026-01-31 09:12:57.166 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:12:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:12:57.458 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '96'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:12:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:12:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:57.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:12:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:12:58.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:12:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:12:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:12:59.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:12:59 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:00.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:00 np0005603622 nova_compute[221301]: 2026-01-31 09:13:00.821 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:01.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:02.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:02 np0005603622 nova_compute[221301]: 2026-01-31 09:13:02.171 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:13:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:03 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:13:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:03.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:04.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:04 np0005603622 ovn_controller[130109]: 2026-01-31T09:13:04Z|01025|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 31 04:13:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:05.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:05 np0005603622 nova_compute[221301]: 2026-01-31 09:13:05.821 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:06.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:07 np0005603622 nova_compute[221301]: 2026-01-31 09:13:07.174 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:07.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:08.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:09.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:10.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:13:10 np0005603622 nova_compute[221301]: 2026-01-31 09:13:10.823 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:11.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:12.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:12 np0005603622 nova_compute[221301]: 2026-01-31 09:13:12.178 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:13 np0005603622 podman[308457]: 2026-01-31 09:13:13.140508099 +0000 UTC m=+0.058740075 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Jan 31 04:13:13 np0005603622 podman[308456]: 2026-01-31 09:13:13.187557094 +0000 UTC m=+0.109548099 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:13:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:13.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:14.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:15.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:15 np0005603622 nova_compute[221301]: 2026-01-31 09:13:15.825 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:16.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:17 np0005603622 nova_compute[221301]: 2026-01-31 09:13:17.181 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:17.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:18.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:19.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:20.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:20 np0005603622 nova_compute[221301]: 2026-01-31 09:13:20.827 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:21.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:22.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:22 np0005603622 nova_compute[221301]: 2026-01-31 09:13:22.186 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:23.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:24.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:24 np0005603622 nova_compute[221301]: 2026-01-31 09:13:24.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:24 np0005603622 nova_compute[221301]: 2026-01-31 09:13:24.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:13:24 np0005603622 nova_compute[221301]: 2026-01-31 09:13:24.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:13:24 np0005603622 nova_compute[221301]: 2026-01-31 09:13:24.448 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:13:24 np0005603622 nova_compute[221301]: 2026-01-31 09:13:24.448 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:25 np0005603622 nova_compute[221301]: 2026-01-31 09:13:25.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:25 np0005603622 nova_compute[221301]: 2026-01-31 09:13:25.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:13:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:25.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:25 np0005603622 nova_compute[221301]: 2026-01-31 09:13:25.829 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:26.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:26 np0005603622 nova_compute[221301]: 2026-01-31 09:13:26.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:27 np0005603622 nova_compute[221301]: 2026-01-31 09:13:27.188 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:27 np0005603622 nova_compute[221301]: 2026-01-31 09:13:27.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:27.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:28.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:29.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:30.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.555 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.556 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.556 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.556 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.557 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:13:30.749 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=97, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=96) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:13:30.750 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:13:30.751 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '97'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.751 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:30 np0005603622 nova_compute[221301]: 2026-01-31 09:13:30.831 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:13:30.930 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:13:30.931 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:13:30.931 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:13:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4001140958' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.180 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.623s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.345 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.348 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4238MB free_disk=20.94251251220703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.349 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.349 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.422 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.423 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.439 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:13:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:31.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:13:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3447076535' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.866 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.875 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.898 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.900 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:13:31 np0005603622 nova_compute[221301]: 2026-01-31 09:13:31.900 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:13:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:32.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:32 np0005603622 nova_compute[221301]: 2026-01-31 09:13:32.192 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:32 np0005603622 nova_compute[221301]: 2026-01-31 09:13:32.901 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:33.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:34.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:35 np0005603622 nova_compute[221301]: 2026-01-31 09:13:35.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:13:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:35.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:35 np0005603622 nova_compute[221301]: 2026-01-31 09:13:35.833 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:36.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:37 np0005603622 nova_compute[221301]: 2026-01-31 09:13:37.196 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 04:13:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:37.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #187. Immutable memtables: 0.
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.070154) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 119] Flushing memtable with next log file: 187
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818070223, "job": 119, "event": "flush_started", "num_memtables": 1, "num_entries": 853, "num_deletes": 257, "total_data_size": 1693643, "memory_usage": 1720384, "flush_reason": "Manual Compaction"}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 119] Level-0 flush table #188: started
Jan 31 04:13:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:38.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818176083, "cf_name": "default", "job": 119, "event": "table_file_creation", "file_number": 188, "file_size": 1117811, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90323, "largest_seqno": 91171, "table_properties": {"data_size": 1113742, "index_size": 1784, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 8905, "raw_average_key_size": 19, "raw_value_size": 1105644, "raw_average_value_size": 2382, "num_data_blocks": 78, "num_entries": 464, "num_filter_entries": 464, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850757, "oldest_key_time": 1769850757, "file_creation_time": 1769850818, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 188, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 119] Flush lasted 106339 microseconds, and 6237 cpu microseconds.
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.176475) [db/flush_job.cc:967] [default] [JOB 119] Level-0 flush table #188: 1117811 bytes OK
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.176539) [db/memtable_list.cc:519] [default] Level-0 commit table #188 started
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.260742) [db/memtable_list.cc:722] [default] Level-0 commit table #188: memtable #1 done
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.260795) EVENT_LOG_v1 {"time_micros": 1769850818260784, "job": 119, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.260829) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 119] Try to delete WAL files size 1689263, prev total WAL file size 1689263, number of live WAL files 2.
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000184.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.261668) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353134' seq:72057594037927935, type:22 .. '6C6F676D0033373637' seq:0, type:0; will stop at (end)
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 120] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 119 Base level 0, inputs: [188(1091KB)], [186(10MB)]
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818261723, "job": 120, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [188], "files_L6": [186], "score": -1, "input_data_size": 12587572, "oldest_snapshot_seqno": -1}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 120] Generated table #189: 10836 keys, 12446846 bytes, temperature: kUnknown
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818489990, "cf_name": "default", "job": 120, "event": "table_file_creation", "file_number": 189, "file_size": 12446846, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12378939, "index_size": 39698, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27141, "raw_key_size": 287581, "raw_average_key_size": 26, "raw_value_size": 12191986, "raw_average_value_size": 1125, "num_data_blocks": 1498, "num_entries": 10836, "num_filter_entries": 10836, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850818, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 189, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.490363) [db/compaction/compaction_job.cc:1663] [default] [JOB 120] Compacted 1@0 + 1@6 files to L6 => 12446846 bytes
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.492628) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 55.1 rd, 54.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 10.9 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(22.4) write-amplify(11.1) OK, records in: 11365, records dropped: 529 output_compression: NoCompression
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.492658) EVENT_LOG_v1 {"time_micros": 1769850818492645, "job": 120, "event": "compaction_finished", "compaction_time_micros": 228345, "compaction_time_cpu_micros": 40751, "output_level": 6, "num_output_files": 1, "total_output_size": 12446846, "num_input_records": 11365, "num_output_records": 10836, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000188.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818492947, "job": 120, "event": "table_file_deletion", "file_number": 188}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000186.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850818494742, "job": 120, "event": "table_file_deletion", "file_number": 186}
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.261582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.494830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.494836) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.494837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.494839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:38 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:13:38.494840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:13:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:39.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:40.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:40 np0005603622 nova_compute[221301]: 2026-01-31 09:13:40.834 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:41.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:42.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:42 np0005603622 nova_compute[221301]: 2026-01-31 09:13:42.203 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:43.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:44.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:44 np0005603622 podman[308543]: 2026-01-31 09:13:44.174505794 +0000 UTC m=+0.088811580 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:13:44 np0005603622 podman[308544]: 2026-01-31 09:13:44.177252877 +0000 UTC m=+0.082009481 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Jan 31 04:13:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:45.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:45 np0005603622 nova_compute[221301]: 2026-01-31 09:13:45.836 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:46.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:47 np0005603622 nova_compute[221301]: 2026-01-31 09:13:47.207 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:47.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:48.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:49.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:50.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:50 np0005603622 nova_compute[221301]: 2026-01-31 09:13:50.839 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:51.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:52.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:52 np0005603622 nova_compute[221301]: 2026-01-31 09:13:52.209 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:53.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:54.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:13:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:13:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:55.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:13:55 np0005603622 nova_compute[221301]: 2026-01-31 09:13:55.842 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:56.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:57 np0005603622 nova_compute[221301]: 2026-01-31 09:13:57.212 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:13:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:57.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:13:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:13:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:13:58.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:13:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:13:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:13:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:13:59.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:00.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:00 np0005603622 nova_compute[221301]: 2026-01-31 09:14:00.843 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:01.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:02.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:02 np0005603622 nova_compute[221301]: 2026-01-31 09:14:02.215 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:03 np0005603622 nova_compute[221301]: 2026-01-31 09:14:03.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:03.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:04.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:05.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:05 np0005603622 nova_compute[221301]: 2026-01-31 09:14:05.845 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:06.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:07 np0005603622 nova_compute[221301]: 2026-01-31 09:14:07.219 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:07.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:08.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:09.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:10.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:14:10 np0005603622 nova_compute[221301]: 2026-01-31 09:14:10.848 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:11.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:14:11 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:14:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:12.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:12 np0005603622 nova_compute[221301]: 2026-01-31 09:14:12.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:13.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:14.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:15 np0005603622 podman[308719]: 2026-01-31 09:14:15.153172069 +0000 UTC m=+0.075294862 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 31 04:14:15 np0005603622 podman[308718]: 2026-01-31 09:14:15.19440827 +0000 UTC m=+0.116528803 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Jan 31 04:14:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:15.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:15 np0005603622 nova_compute[221301]: 2026-01-31 09:14:15.850 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:15 np0005603622 nova_compute[221301]: 2026-01-31 09:14:15.947 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:15.947 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=98, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=97) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:14:15 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:15.949 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:14:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:16.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:14:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:14:17 np0005603622 nova_compute[221301]: 2026-01-31 09:14:17.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:17.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:18.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:19.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:20.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:20 np0005603622 nova_compute[221301]: 2026-01-31 09:14:20.852 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:21.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:22.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:22 np0005603622 nova_compute[221301]: 2026-01-31 09:14:22.230 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:22 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:22.952 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '98'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:14:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:23.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:24.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:24 np0005603622 nova_compute[221301]: 2026-01-31 09:14:24.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:24 np0005603622 nova_compute[221301]: 2026-01-31 09:14:24.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:14:24 np0005603622 nova_compute[221301]: 2026-01-31 09:14:24.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:14:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:25 np0005603622 nova_compute[221301]: 2026-01-31 09:14:25.606 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:14:25 np0005603622 nova_compute[221301]: 2026-01-31 09:14:25.606 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:25 np0005603622 nova_compute[221301]: 2026-01-31 09:14:25.606 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:14:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:25.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:25 np0005603622 nova_compute[221301]: 2026-01-31 09:14:25.855 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:26.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:26 np0005603622 nova_compute[221301]: 2026-01-31 09:14:26.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:27 np0005603622 nova_compute[221301]: 2026-01-31 09:14:27.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:27 np0005603622 nova_compute[221301]: 2026-01-31 09:14:27.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:27.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:28.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:29 np0005603622 nova_compute[221301]: 2026-01-31 09:14:29.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:29.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 04:14:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:30.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 04:14:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:30 np0005603622 nova_compute[221301]: 2026-01-31 09:14:30.857 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:30.932 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:14:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:30.932 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:14:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:30.933 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:14:31 np0005603622 nova_compute[221301]: 2026-01-31 09:14:31.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 04:14:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:31.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 04:14:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:32.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:32 np0005603622 nova_compute[221301]: 2026-01-31 09:14:32.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:32 np0005603622 nova_compute[221301]: 2026-01-31 09:14:32.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:32 np0005603622 nova_compute[221301]: 2026-01-31 09:14:32.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:33.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:34.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:35.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:35 np0005603622 nova_compute[221301]: 2026-01-31 09:14:35.861 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:36.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:37 np0005603622 nova_compute[221301]: 2026-01-31 09:14:37.241 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:37.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:37 np0005603622 nova_compute[221301]: 2026-01-31 09:14:37.832 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:14:37 np0005603622 nova_compute[221301]: 2026-01-31 09:14:37.832 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:14:37 np0005603622 nova_compute[221301]: 2026-01-31 09:14:37.833 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:14:37 np0005603622 nova_compute[221301]: 2026-01-31 09:14:37.833 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:14:37 np0005603622 nova_compute[221301]: 2026-01-31 09:14:37.834 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:14:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:38.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:14:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1919024951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.295 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.447 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.448 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4239MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.448 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.448 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.728 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.728 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:14:38 np0005603622 nova_compute[221301]: 2026-01-31 09:14:38.942 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:14:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:14:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2775150076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:14:39 np0005603622 nova_compute[221301]: 2026-01-31 09:14:39.415 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:14:39 np0005603622 nova_compute[221301]: 2026-01-31 09:14:39.420 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:14:39 np0005603622 nova_compute[221301]: 2026-01-31 09:14:39.536 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:14:39 np0005603622 nova_compute[221301]: 2026-01-31 09:14:39.611 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:14:39 np0005603622 nova_compute[221301]: 2026-01-31 09:14:39.612 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:14:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:39.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:40.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:40 np0005603622 nova_compute[221301]: 2026-01-31 09:14:40.863 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:41.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:42.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:42 np0005603622 nova_compute[221301]: 2026-01-31 09:14:42.244 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:42 np0005603622 nova_compute[221301]: 2026-01-31 09:14:42.612 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:14:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:43.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:44.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:45.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:45 np0005603622 nova_compute[221301]: 2026-01-31 09:14:45.865 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:46 np0005603622 ovn_controller[130109]: 2026-01-31T09:14:46Z|01026|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 31 04:14:46 np0005603622 podman[308859]: 2026-01-31 09:14:46.157842672 +0000 UTC m=+0.071220395 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:14:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:46.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:46 np0005603622 podman[308858]: 2026-01-31 09:14:46.21709529 +0000 UTC m=+0.134442978 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 31 04:14:47 np0005603622 nova_compute[221301]: 2026-01-31 09:14:47.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:47.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:48.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:49.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:14:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:50.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:14:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:50 np0005603622 nova_compute[221301]: 2026-01-31 09:14:50.868 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:51.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:52.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:52 np0005603622 nova_compute[221301]: 2026-01-31 09:14:52.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:53.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:54.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:14:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:55.676 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=99, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=98) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:14:55 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:55.677 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:14:55 np0005603622 nova_compute[221301]: 2026-01-31 09:14:55.677 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:14:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:55.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:14:55 np0005603622 nova_compute[221301]: 2026-01-31 09:14:55.872 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:56.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:56 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:14:56.680 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '99'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:14:57 np0005603622 nova_compute[221301]: 2026-01-31 09:14:57.253 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:14:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:57.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:14:58.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:14:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:14:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:14:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:14:59.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:00.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:00 np0005603622 nova_compute[221301]: 2026-01-31 09:15:00.874 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:01.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:02.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:02 np0005603622 nova_compute[221301]: 2026-01-31 09:15:02.257 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:03.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:04.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:05.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:05 np0005603622 nova_compute[221301]: 2026-01-31 09:15:05.877 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:06.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:07 np0005603622 nova_compute[221301]: 2026-01-31 09:15:07.259 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:07.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:08.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:09.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:10.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:10 np0005603622 nova_compute[221301]: 2026-01-31 09:15:10.878 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:11.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:12.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:12 np0005603622 nova_compute[221301]: 2026-01-31 09:15:12.262 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:13.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:14.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:15.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:15 np0005603622 nova_compute[221301]: 2026-01-31 09:15:15.881 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:16.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:15:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:15:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:15:17 np0005603622 podman[309039]: 2026-01-31 09:15:17.155398479 +0000 UTC m=+0.074645545 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:15:17 np0005603622 podman[309038]: 2026-01-31 09:15:17.193076416 +0000 UTC m=+0.113397431 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:15:17 np0005603622 nova_compute[221301]: 2026-01-31 09:15:17.263 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:17.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:18.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:19.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:20.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:20 np0005603622 nova_compute[221301]: 2026-01-31 09:15:20.884 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:21.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:22.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:22 np0005603622 nova_compute[221301]: 2026-01-31 09:15:22.266 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:15:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:15:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:23.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:24.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:25 np0005603622 nova_compute[221301]: 2026-01-31 09:15:25.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:25 np0005603622 nova_compute[221301]: 2026-01-31 09:15:25.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:15:25 np0005603622 nova_compute[221301]: 2026-01-31 09:15:25.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:15:25 np0005603622 nova_compute[221301]: 2026-01-31 09:15:25.416 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:15:25 np0005603622 nova_compute[221301]: 2026-01-31 09:15:25.417 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:25 np0005603622 nova_compute[221301]: 2026-01-31 09:15:25.418 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:15:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:25.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:25 np0005603622 nova_compute[221301]: 2026-01-31 09:15:25.885 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:26.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:26 np0005603622 nova_compute[221301]: 2026-01-31 09:15:26.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:27 np0005603622 nova_compute[221301]: 2026-01-31 09:15:27.268 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:27 np0005603622 nova_compute[221301]: 2026-01-31 09:15:27.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:27.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:28.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:29.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:30.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:30 np0005603622 nova_compute[221301]: 2026-01-31 09:15:30.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:30 np0005603622 nova_compute[221301]: 2026-01-31 09:15:30.888 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:15:30.932 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:15:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:15:30.933 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:15:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:15:30.933 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:15:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:31.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:32.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:32 np0005603622 nova_compute[221301]: 2026-01-31 09:15:32.272 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:32 np0005603622 nova_compute[221301]: 2026-01-31 09:15:32.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:33 np0005603622 nova_compute[221301]: 2026-01-31 09:15:33.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:33.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:34.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:34 np0005603622 nova_compute[221301]: 2026-01-31 09:15:34.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:34 np0005603622 nova_compute[221301]: 2026-01-31 09:15:34.446 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:15:34 np0005603622 nova_compute[221301]: 2026-01-31 09:15:34.447 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:15:34 np0005603622 nova_compute[221301]: 2026-01-31 09:15:34.447 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:15:34 np0005603622 nova_compute[221301]: 2026-01-31 09:15:34.448 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:15:34 np0005603622 nova_compute[221301]: 2026-01-31 09:15:34.448 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:15:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:15:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2098916702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:15:34 np0005603622 nova_compute[221301]: 2026-01-31 09:15:34.901 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.046 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.048 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4244MB free_disk=20.967517852783203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.048 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.048 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.128 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.129 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.145 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:15:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:15:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3626806972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.609 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.617 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.655 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.659 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.659 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:15:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:35.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:35 np0005603622 nova_compute[221301]: 2026-01-31 09:15:35.890 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:36.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:37 np0005603622 nova_compute[221301]: 2026-01-31 09:15:37.276 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:37.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:38.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:38 np0005603622 nova_compute[221301]: 2026-01-31 09:15:38.660 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:15:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:40.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:40 np0005603622 nova_compute[221301]: 2026-01-31 09:15:40.893 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:41.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:42 np0005603622 nova_compute[221301]: 2026-01-31 09:15:42.278 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:42.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:43.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:44.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:45.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:45 np0005603622 nova_compute[221301]: 2026-01-31 09:15:45.896 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:46.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:47 np0005603622 nova_compute[221301]: 2026-01-31 09:15:47.282 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:15:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:47.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:15:48 np0005603622 podman[309179]: 2026-01-31 09:15:48.170464389 +0000 UTC m=+0.088986495 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 31 04:15:48 np0005603622 podman[309178]: 2026-01-31 09:15:48.185056575 +0000 UTC m=+0.100559721 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:15:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:48.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:49.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:50.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:50 np0005603622 nova_compute[221301]: 2026-01-31 09:15:50.898 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:51.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:52 np0005603622 nova_compute[221301]: 2026-01-31 09:15:52.286 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:52.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:53.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:54.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:15:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:55.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:55 np0005603622 nova_compute[221301]: 2026-01-31 09:15:55.900 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:56.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:57 np0005603622 nova_compute[221301]: 2026-01-31 09:15:57.289 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:15:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:57.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:15:57 np0005603622 nova_compute[221301]: 2026-01-31 09:15:57.819 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:15:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:15:57.819 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=100, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=99) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:15:57 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:15:57.821 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:15:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:15:58.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:15:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:15:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:15:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:15:59.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:00.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:16:00.824 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '100'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:16:00 np0005603622 nova_compute[221301]: 2026-01-31 09:16:00.901 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:01.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:02 np0005603622 nova_compute[221301]: 2026-01-31 09:16:02.293 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:02.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:03.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:04.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:05.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:05 np0005603622 nova_compute[221301]: 2026-01-31 09:16:05.905 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:06.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:06 np0005603622 nova_compute[221301]: 2026-01-31 09:16:06.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:07 np0005603622 nova_compute[221301]: 2026-01-31 09:16:07.296 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:07.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:08.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:09.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:10.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:10 np0005603622 nova_compute[221301]: 2026-01-31 09:16:10.907 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:11.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:12 np0005603622 nova_compute[221301]: 2026-01-31 09:16:12.299 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:12.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:13.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:14.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:15.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:15 np0005603622 nova_compute[221301]: 2026-01-31 09:16:15.910 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:16.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:17 np0005603622 nova_compute[221301]: 2026-01-31 09:16:17.302 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:17.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:18.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:19 np0005603622 podman[309225]: 2026-01-31 09:16:19.146634051 +0000 UTC m=+0.059186497 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 31 04:16:19 np0005603622 podman[309224]: 2026-01-31 09:16:19.175098014 +0000 UTC m=+0.094079330 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:16:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:19.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:20.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:20 np0005603622 nova_compute[221301]: 2026-01-31 09:16:20.913 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:21.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:22 np0005603622 nova_compute[221301]: 2026-01-31 09:16:22.303 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:22.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #190. Immutable memtables: 0.
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.626692) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 121] Flushing memtable with next log file: 190
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982626823, "job": 121, "event": "flush_started", "num_memtables": 1, "num_entries": 1758, "num_deletes": 251, "total_data_size": 4252995, "memory_usage": 4306264, "flush_reason": "Manual Compaction"}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 121] Level-0 flush table #191: started
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982651157, "cf_name": "default", "job": 121, "event": "table_file_creation", "file_number": 191, "file_size": 2795093, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 91177, "largest_seqno": 92929, "table_properties": {"data_size": 2787734, "index_size": 4365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15032, "raw_average_key_size": 20, "raw_value_size": 2773206, "raw_average_value_size": 3707, "num_data_blocks": 192, "num_entries": 748, "num_filter_entries": 748, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850818, "oldest_key_time": 1769850818, "file_creation_time": 1769850982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 191, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 121] Flush lasted 24521 microseconds, and 5953 cpu microseconds.
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.651218) [db/flush_job.cc:967] [default] [JOB 121] Level-0 flush table #191: 2795093 bytes OK
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.651247) [db/memtable_list.cc:519] [default] Level-0 commit table #191 started
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.652969) [db/memtable_list.cc:722] [default] Level-0 commit table #191: memtable #1 done
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.652983) EVENT_LOG_v1 {"time_micros": 1769850982652979, "job": 121, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.653009) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 121] Try to delete WAL files size 4245154, prev total WAL file size 4245154, number of live WAL files 2.
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000187.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.653857) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038303332' seq:72057594037927935, type:22 .. '7061786F730038323834' seq:0, type:0; will stop at (end)
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 122] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 121 Base level 0, inputs: [191(2729KB)], [189(11MB)]
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982654026, "job": 122, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [191], "files_L6": [189], "score": -1, "input_data_size": 15241939, "oldest_snapshot_seqno": -1}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 122] Generated table #192: 11067 keys, 13334134 bytes, temperature: kUnknown
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982751247, "cf_name": "default", "job": 122, "event": "table_file_creation", "file_number": 192, "file_size": 13334134, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13263968, "index_size": 41381, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27717, "raw_key_size": 293043, "raw_average_key_size": 26, "raw_value_size": 13072131, "raw_average_value_size": 1181, "num_data_blocks": 1566, "num_entries": 11067, "num_filter_entries": 11067, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769850982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 192, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.751589) [db/compaction/compaction_job.cc:1663] [default] [JOB 122] Compacted 1@0 + 1@6 files to L6 => 13334134 bytes
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.753175) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.6 rd, 137.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 11.9 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(10.2) write-amplify(4.8) OK, records in: 11584, records dropped: 517 output_compression: NoCompression
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.753203) EVENT_LOG_v1 {"time_micros": 1769850982753191, "job": 122, "event": "compaction_finished", "compaction_time_micros": 97358, "compaction_time_cpu_micros": 36486, "output_level": 6, "num_output_files": 1, "total_output_size": 13334134, "num_input_records": 11584, "num_output_records": 11067, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000191.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982753938, "job": 122, "event": "table_file_deletion", "file_number": 191}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000189.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769850982755856, "job": 122, "event": "table_file_deletion", "file_number": 189}
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.653684) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.755895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.755900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.755902) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.755904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:16:22.755905) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 04:16:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 04:16:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 04:16:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:16:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:23 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:16:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:24.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:25.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:25 np0005603622 nova_compute[221301]: 2026-01-31 09:16:25.915 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:26.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:26 np0005603622 nova_compute[221301]: 2026-01-31 09:16:26.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:26 np0005603622 nova_compute[221301]: 2026-01-31 09:16:26.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:26 np0005603622 nova_compute[221301]: 2026-01-31 09:16:26.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:16:27 np0005603622 nova_compute[221301]: 2026-01-31 09:16:27.307 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:27 np0005603622 nova_compute[221301]: 2026-01-31 09:16:27.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:27 np0005603622 nova_compute[221301]: 2026-01-31 09:16:27.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:16:27 np0005603622 nova_compute[221301]: 2026-01-31 09:16:27.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:16:27 np0005603622 nova_compute[221301]: 2026-01-31 09:16:27.413 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:16:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:27.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:28.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:16:29 np0005603622 nova_compute[221301]: 2026-01-31 09:16:29.409 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:29.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:30.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:30 np0005603622 nova_compute[221301]: 2026-01-31 09:16:30.918 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:16:30.933 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:16:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:16:30.934 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:16:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:16:30.934 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:16:31 np0005603622 nova_compute[221301]: 2026-01-31 09:16:31.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:31.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:32 np0005603622 nova_compute[221301]: 2026-01-31 09:16:32.310 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:32.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:32 np0005603622 nova_compute[221301]: 2026-01-31 09:16:32.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:33.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:34.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.428 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.428 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:16:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:16:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2607766379' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:16:34 np0005603622 nova_compute[221301]: 2026-01-31 09:16:34.872 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.004 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.006 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4228MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.006 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.006 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.169 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.170 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.196 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:16:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:16:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3480866657' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.665 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.673 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.701 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.704 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.705 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:16:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:35.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:35 np0005603622 nova_compute[221301]: 2026-01-31 09:16:35.921 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:36.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:37 np0005603622 nova_compute[221301]: 2026-01-31 09:16:37.314 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:37.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:38.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:38 np0005603622 nova_compute[221301]: 2026-01-31 09:16:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:38 np0005603622 nova_compute[221301]: 2026-01-31 09:16:38.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:16:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:39.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:40.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:40 np0005603622 nova_compute[221301]: 2026-01-31 09:16:40.924 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:41.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:42 np0005603622 nova_compute[221301]: 2026-01-31 09:16:42.316 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:42.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:43.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:44.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:45.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:45 np0005603622 nova_compute[221301]: 2026-01-31 09:16:45.929 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:46.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:47 np0005603622 nova_compute[221301]: 2026-01-31 09:16:47.319 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:47.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:48.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:49.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:50 np0005603622 podman[309616]: 2026-01-31 09:16:50.159937322 +0000 UTC m=+0.073465414 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 04:16:50 np0005603622 podman[309615]: 2026-01-31 09:16:50.191341783 +0000 UTC m=+0.105794490 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:16:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:50.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:50 np0005603622 nova_compute[221301]: 2026-01-31 09:16:50.931 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:51.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:52 np0005603622 nova_compute[221301]: 2026-01-31 09:16:52.323 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:52.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:53.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:54.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:16:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:16:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:55.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:16:55 np0005603622 nova_compute[221301]: 2026-01-31 09:16:55.932 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:56.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:57 np0005603622 nova_compute[221301]: 2026-01-31 09:16:57.326 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:16:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:57.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:16:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:16:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:16:58.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:16:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:16:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:16:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:16:59.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:00.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:00 np0005603622 nova_compute[221301]: 2026-01-31 09:17:00.934 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:01.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:02 np0005603622 nova_compute[221301]: 2026-01-31 09:17:02.329 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:02.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:03.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:04.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:05.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:05 np0005603622 nova_compute[221301]: 2026-01-31 09:17:05.937 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:06.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:07 np0005603622 nova_compute[221301]: 2026-01-31 09:17:07.331 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:07 np0005603622 nova_compute[221301]: 2026-01-31 09:17:07.414 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:07 np0005603622 nova_compute[221301]: 2026-01-31 09:17:07.414 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:17:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:07.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:08.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:09.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:10.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:10 np0005603622 nova_compute[221301]: 2026-01-31 09:17:10.940 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:11.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:12 np0005603622 nova_compute[221301]: 2026-01-31 09:17:12.337 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:12.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:13.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:14.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:15.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:15 np0005603622 nova_compute[221301]: 2026-01-31 09:17:15.941 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:16.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:17 np0005603622 nova_compute[221301]: 2026-01-31 09:17:17.340 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:17.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e426 e426: 3 total, 3 up, 3 in
Jan 31 04:17:18 np0005603622 nova_compute[221301]: 2026-01-31 09:17:18.412 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:18 np0005603622 nova_compute[221301]: 2026-01-31 09:17:18.413 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:17:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:18.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:18 np0005603622 nova_compute[221301]: 2026-01-31 09:17:18.440 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:17:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:19.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:20 np0005603622 podman[309659]: 2026-01-31 09:17:20.35312311 +0000 UTC m=+0.057280316 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:17:20 np0005603622 podman[309658]: 2026-01-31 09:17:20.401239783 +0000 UTC m=+0.105096551 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:17:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:20.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:20 np0005603622 nova_compute[221301]: 2026-01-31 09:17:20.946 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e427 e427: 3 total, 3 up, 3 in
Jan 31 04:17:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:21.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:22 np0005603622 nova_compute[221301]: 2026-01-31 09:17:22.342 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:22.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e428 e428: 3 total, 3 up, 3 in
Jan 31 04:17:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:23.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e429 e429: 3 total, 3 up, 3 in
Jan 31 04:17:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:24.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e429 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:25 np0005603622 nova_compute[221301]: 2026-01-31 09:17:25.947 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:25.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:26.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:27 np0005603622 nova_compute[221301]: 2026-01-31 09:17:27.345 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:27 np0005603622 nova_compute[221301]: 2026-01-31 09:17:27.423 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:27 np0005603622 nova_compute[221301]: 2026-01-31 09:17:27.423 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:17:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:27.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 e430: 3 total, 3 up, 3 in
Jan 31 04:17:28 np0005603622 nova_compute[221301]: 2026-01-31 09:17:28.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:28 np0005603622 nova_compute[221301]: 2026-01-31 09:17:28.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:17:28 np0005603622 nova_compute[221301]: 2026-01-31 09:17:28.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:17:28 np0005603622 nova_compute[221301]: 2026-01-31 09:17:28.416 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:17:28 np0005603622 nova_compute[221301]: 2026-01-31 09:17:28.417 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:28.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.410 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:17:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:17:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.838 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.841 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.867 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.947 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.948 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.958 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 31 04:17:29 np0005603622 nova_compute[221301]: 2026-01-31 09:17:29.958 221324 INFO nova.compute.claims [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 31 04:17:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:29.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.059 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:30.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:17:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2210038117' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.524 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.533 221324 DEBUG nova.compute.provider_tree [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.550 221324 DEBUG nova.scheduler.client.report [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.575 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.577 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.628 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.629 221324 DEBUG nova.network.neutron [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.653 221324 INFO nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.673 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.795 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.798 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.798 221324 INFO nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Creating image(s)#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.844 221324 DEBUG nova.storage.rbd_utils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] rbd image 7f870e76-5f0a-46db-b844-1788cc790f7c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.873 221324 DEBUG nova.storage.rbd_utils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] rbd image 7f870e76-5f0a-46db-b844-1788cc790f7c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.908 221324 DEBUG nova.storage.rbd_utils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] rbd image 7f870e76-5f0a-46db-b844-1788cc790f7c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.911 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "75534d4db6f9dac8401594c75991e6c285fa51c2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.912 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "75534d4db6f9dac8401594c75991e6c285fa51c2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:30.934 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:30.935 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:30.935 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:30 np0005603622 nova_compute[221301]: 2026-01-31 09:17:30.949 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.175 221324 DEBUG nova.virt.libvirt.imagebackend [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Image locations are: [{'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0d074876-a373-4784-8303-5b3716508074/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0d074876-a373-4784-8303-5b3716508074/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.261 221324 DEBUG nova.virt.libvirt.imagebackend [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Selected location: {'url': 'rbd://2f5ab832-5f2e-5a84-bd93-cf8bab960ee2/images/0d074876-a373-4784-8303-5b3716508074/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.261 221324 DEBUG nova.storage.rbd_utils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] cloning images/0d074876-a373-4784-8303-5b3716508074@snap to None/7f870e76-5f0a-46db-b844-1788cc790f7c_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.428 221324 DEBUG nova.policy [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f25568607234a398bc35cbb67eb406f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390709b3e5174dc4afdc6b04fdae67e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.462 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "75534d4db6f9dac8401594c75991e6c285fa51c2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.662 221324 DEBUG nova.objects.instance [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f870e76-5f0a-46db-b844-1788cc790f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.681 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.682 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Ensure instance console log exists: /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.682 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.683 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:31 np0005603622 nova_compute[221301]: 2026-01-31 09:17:31.683 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:31.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:32 np0005603622 nova_compute[221301]: 2026-01-31 09:17:32.348 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:32.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:32 np0005603622 nova_compute[221301]: 2026-01-31 09:17:32.545 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:32.547 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=101, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=100) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:17:32 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:32.549 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:17:33 np0005603622 nova_compute[221301]: 2026-01-31 09:17:33.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:33 np0005603622 nova_compute[221301]: 2026-01-31 09:17:33.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:33 np0005603622 nova_compute[221301]: 2026-01-31 09:17:33.528 221324 DEBUG nova.network.neutron [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Successfully created port: 700f6b80-6690-412c-a1e0-2e1acd90551c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 31 04:17:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:33.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:34.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.420 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.420 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.421 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.421 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.422 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:35 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:35.553 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '101'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.582 221324 DEBUG nova.network.neutron [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Successfully updated port: 700f6b80-6690-412c-a1e0-2e1acd90551c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.596 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.597 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquired lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.597 221324 DEBUG nova.network.neutron [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.664 221324 DEBUG nova.compute.manager [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-changed-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.665 221324 DEBUG nova.compute.manager [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Refreshing instance network info cache due to event network-changed-700f6b80-6690-412c-a1e0-2e1acd90551c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.665 221324 DEBUG oslo_concurrency.lockutils [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.740 221324 DEBUG nova.network.neutron [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 31 04:17:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:17:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/747963476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.917 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:35 np0005603622 nova_compute[221301]: 2026-01-31 09:17:35.951 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:35.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.148 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.149 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4252MB free_disk=20.942646026611328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.149 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.150 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.221 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Instance 7f870e76-5f0a-46db-b844-1788cc790f7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.221 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.222 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.238 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.253 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.254 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.292 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.324 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.357 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:36.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:17:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:17:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:17:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3436580230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.860 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.868 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.890 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.919 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:17:36 np0005603622 nova_compute[221301]: 2026-01-31 09:17:36.919 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.352 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.485 221324 DEBUG nova.network.neutron [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updating instance_info_cache with network_info: [{"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.506 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Releasing lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.506 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Instance network_info: |[{"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.507 221324 DEBUG oslo_concurrency.lockutils [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.507 221324 DEBUG nova.network.neutron [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Refreshing network info cache for port 700f6b80-6690-412c-a1e0-2e1acd90551c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.509 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Start _get_guest_xml network_info=[{"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T09:17:19Z,direct_url=<?>,disk_format='raw',id=0d074876-a373-4784-8303-5b3716508074,min_disk=1,min_ram=0,name='tempest-TestStampPatternsnapshot-1049747782',owner='390709b3e5174dc4afdc6b04fdae67e3',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T09:17:25Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'encrypted': False, 'size': 0, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'image_id': '0d074876-a373-4784-8303-5b3716508074'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.514 221324 WARNING nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.519 221324 DEBUG nova.virt.libvirt.host [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.520 221324 DEBUG nova.virt.libvirt.host [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.530 221324 DEBUG nova.virt.libvirt.host [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.531 221324 DEBUG nova.virt.libvirt.host [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.533 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.534 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-31T07:43:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a01eb4f0-fd80-416b-a750-75de320394d8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-31T09:17:19Z,direct_url=<?>,disk_format='raw',id=0d074876-a373-4784-8303-5b3716508074,min_disk=1,min_ram=0,name='tempest-TestStampPatternsnapshot-1049747782',owner='390709b3e5174dc4afdc6b04fdae67e3',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-31T09:17:25Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.535 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.535 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.536 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.536 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.537 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.537 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.538 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.538 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.539 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.539 221324 DEBUG nova.virt.hardware [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 31 04:17:37 np0005603622 nova_compute[221301]: 2026-01-31 09:17:37.546 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:37.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:17:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1945851342' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.018 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.051 221324 DEBUG nova.storage.rbd_utils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] rbd image 7f870e76-5f0a-46db-b844-1788cc790f7c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.057 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:17:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2462405667' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:17:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:38.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.469 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.472 221324 DEBUG nova.virt.libvirt.vif [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:17:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-1868417645',display_name='tempest-TestStampPattern-server-1868417645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-teststamppattern-server-1868417645',id=224,image_ref='0d074876-a373-4784-8303-5b3716508074',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD/aBEl4PBIoIshqTDNOjhhhoUeVGicNgguOr3MSdHgT0ltB1LqQrhXegMG9XeJiExk1ZCoew1VKJC1u0bcRchycTTDnBsbTgXLKYBMMmensD0uk0uwm2aHKUBZ2bnKdBA==',key_name='tempest-TestStampPattern-80801695',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390709b3e5174dc4afdc6b04fdae67e3',ramdisk_id='',reservation_id='r-0hnq2b00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='4e6fd1c3-3988-4a7f-a30d-b599226c25a0',image_min_disk='1',image_min_ram='0',image_owner_id='390709b3e5174dc4afdc6b04fdae67e3',image_owner_project_name='tempest-TestStampPattern-1434051857',image_owner_user_name='tempest-TestStampPattern-1434051857-project-member',image_user_id='4f25568607234a398bc35cbb67eb406f',network_allocated='True',owner_project_name='tempest-TestStampPattern-1434051857',owner_user_name='tempest-TestStampPattern-1434051857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:17:30Z,user_data=None,user_id='4f25568607234a398bc35cbb67eb406f',uuid=7f870e76-5f0a-46db-b844-1788cc790f7c,vcpu_mo
del=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.472 221324 DEBUG nova.network.os_vif_util [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Converting VIF {"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.474 221324 DEBUG nova.network.os_vif_util [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:97:91,bridge_name='br-int',has_traffic_filtering=True,id=700f6b80-6690-412c-a1e0-2e1acd90551c,network=Network(3f3cc872-5825-455b-b8f4-03469e3aacf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap700f6b80-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.476 221324 DEBUG nova.objects.instance [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f870e76-5f0a-46db-b844-1788cc790f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.493 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] End _get_guest_xml xml=<domain type="kvm">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <uuid>7f870e76-5f0a-46db-b844-1788cc790f7c</uuid>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <name>instance-000000e0</name>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <memory>131072</memory>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <vcpu>1</vcpu>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <metadata>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <nova:name>tempest-TestStampPattern-server-1868417645</nova:name>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <nova:creationTime>2026-01-31 09:17:37</nova:creationTime>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <nova:flavor name="m1.nano">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:memory>128</nova:memory>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:disk>1</nova:disk>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:swap>0</nova:swap>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:ephemeral>0</nova:ephemeral>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:vcpus>1</nova:vcpus>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      </nova:flavor>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <nova:owner>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:user uuid="4f25568607234a398bc35cbb67eb406f">tempest-TestStampPattern-1434051857-project-member</nova:user>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:project uuid="390709b3e5174dc4afdc6b04fdae67e3">tempest-TestStampPattern-1434051857</nova:project>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      </nova:owner>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <nova:root type="image" uuid="0d074876-a373-4784-8303-5b3716508074"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <nova:ports>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <nova:port uuid="700f6b80-6690-412c-a1e0-2e1acd90551c">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        </nova:port>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      </nova:ports>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </nova:instance>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  </metadata>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <sysinfo type="smbios">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <system>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <entry name="manufacturer">RDO</entry>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <entry name="product">OpenStack Compute</entry>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <entry name="serial">7f870e76-5f0a-46db-b844-1788cc790f7c</entry>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <entry name="uuid">7f870e76-5f0a-46db-b844-1788cc790f7c</entry>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <entry name="family">Virtual Machine</entry>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </system>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  </sysinfo>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <os>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <boot dev="hd"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <smbios mode="sysinfo"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  </os>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <features>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <acpi/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <apic/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <vmcoreinfo/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  </features>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <clock offset="utc">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <timer name="pit" tickpolicy="delay"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <timer name="hpet" present="no"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  </clock>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <cpu mode="custom" match="exact">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <model>Nehalem</model>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <topology sockets="1" cores="1" threads="1"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  </cpu>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  <devices>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <disk type="network" device="disk">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/7f870e76-5f0a-46db-b844-1788cc790f7c_disk">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <target dev="vda" bus="virtio"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <disk type="network" device="cdrom">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <driver type="raw" cache="none"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <source protocol="rbd" name="vms/7f870e76-5f0a-46db-b844-1788cc790f7c_disk.config">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.100" port="6789"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.102" port="6789"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <host name="192.168.122.101" port="6789"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      </source>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <auth username="openstack">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:        <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      </auth>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <target dev="sda" bus="sata"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </disk>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <interface type="ethernet">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <mac address="fa:16:3e:14:97:91"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <driver name="vhost" rx_queue_size="512"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <mtu size="1442"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <target dev="tap700f6b80-66"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </interface>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <serial type="pty">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <log file="/var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/console.log" append="off"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </serial>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <video>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <model type="virtio"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </video>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <input type="tablet" bus="usb"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <input type="keyboard" bus="usb"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <rng model="virtio">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <backend model="random">/dev/urandom</backend>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </rng>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="pci" model="pcie-root-port"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <controller type="usb" index="0"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    <memballoon model="virtio">
Jan 31 04:17:38 np0005603622 nova_compute[221301]:      <stats period="10"/>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:    </memballoon>
Jan 31 04:17:38 np0005603622 nova_compute[221301]:  </devices>
Jan 31 04:17:38 np0005603622 nova_compute[221301]: </domain>
Jan 31 04:17:38 np0005603622 nova_compute[221301]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.495 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Preparing to wait for external event network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.495 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.496 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.496 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.497 221324 DEBUG nova.virt.libvirt.vif [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-31T09:17:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-1868417645',display_name='tempest-TestStampPattern-server-1868417645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-teststamppattern-server-1868417645',id=224,image_ref='0d074876-a373-4784-8303-5b3716508074',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD/aBEl4PBIoIshqTDNOjhhhoUeVGicNgguOr3MSdHgT0ltB1LqQrhXegMG9XeJiExk1ZCoew1VKJC1u0bcRchycTTDnBsbTgXLKYBMMmensD0uk0uwm2aHKUBZ2bnKdBA==',key_name='tempest-TestStampPattern-80801695',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390709b3e5174dc4afdc6b04fdae67e3',ramdisk_id='',reservation_id='r-0hnq2b00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='4e6fd1c3-3988-4a7f-a30d-b599226c25a0',image_min_disk='1',image_min_ram='0',image_owner_id='390709b3e5174dc4afdc6b04fdae67e3',image_owner_project_name='tempest-TestStampPattern-1434051857',image_owner_user_name='tempest-TestStampPattern-1434051857-project-member',image_user_id='4f25568607234a398bc35cbb67eb406f',network_allocated='True',owner_project_name='tempest-TestStampPattern-1434051857',owner_user_name='tempest-TestStampPattern-1434051857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-31T09:17:30Z,user_data=None,user_id='4f25568607234a398bc35cbb67eb406f',uuid=7f870e76-5f0a-46db-b844-1788cc790f
7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.498 221324 DEBUG nova.network.os_vif_util [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Converting VIF {"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.499 221324 DEBUG nova.network.os_vif_util [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:97:91,bridge_name='br-int',has_traffic_filtering=True,id=700f6b80-6690-412c-a1e0-2e1acd90551c,network=Network(3f3cc872-5825-455b-b8f4-03469e3aacf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap700f6b80-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.499 221324 DEBUG os_vif [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:97:91,bridge_name='br-int',has_traffic_filtering=True,id=700f6b80-6690-412c-a1e0-2e1acd90551c,network=Network(3f3cc872-5825-455b-b8f4-03469e3aacf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap700f6b80-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.500 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.501 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.501 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.508 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.509 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap700f6b80-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.511 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap700f6b80-66, col_values=(('external_ids', {'iface-id': '700f6b80-6690-412c-a1e0-2e1acd90551c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:97:91', 'vm-uuid': '7f870e76-5f0a-46db-b844-1788cc790f7c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.514 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:38 np0005603622 NetworkManager[49080]: <info>  [1769851058.5162] manager: (tap700f6b80-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/453)
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.524 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.525 221324 INFO os_vif [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:97:91,bridge_name='br-int',has_traffic_filtering=True,id=700f6b80-6690-412c-a1e0-2e1acd90551c,network=Network(3f3cc872-5825-455b-b8f4-03469e3aacf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap700f6b80-66')#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.569 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.570 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.570 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] No VIF found with MAC fa:16:3e:14:97:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.571 221324 INFO nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Using config drive#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.607 221324 DEBUG nova.storage.rbd_utils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] rbd image 7f870e76-5f0a-46db-b844-1788cc790f7c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.617 221324 DEBUG nova.network.neutron [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updated VIF entry in instance network info cache for port 700f6b80-6690-412c-a1e0-2e1acd90551c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.617 221324 DEBUG nova.network.neutron [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updating instance_info_cache with network_info: [{"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.647 221324 DEBUG oslo_concurrency.lockutils [req-33243b74-b21b-448e-b127-914cc98c7b14 req-d7f17190-8691-4197-be31-a7816ba14a22 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.993 221324 INFO nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Creating config drive at /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/disk.config#033[00m
Jan 31 04:17:38 np0005603622 nova_compute[221301]: 2026-01-31 09:17:38.999 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsuswm723 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:39 np0005603622 nova_compute[221301]: 2026-01-31 09:17:39.133 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpsuswm723" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:39 np0005603622 nova_compute[221301]: 2026-01-31 09:17:39.164 221324 DEBUG nova.storage.rbd_utils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] rbd image 7f870e76-5f0a-46db-b844-1788cc790f7c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 31 04:17:39 np0005603622 nova_compute[221301]: 2026-01-31 09:17:39.170 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/disk.config 7f870e76-5f0a-46db-b844-1788cc790f7c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:17:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:39.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.183 221324 DEBUG oslo_concurrency.processutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/disk.config 7f870e76-5f0a-46db-b844-1788cc790f7c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.184 221324 INFO nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Deleting local config drive /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c/disk.config because it was imported into RBD.#033[00m
Jan 31 04:17:40 np0005603622 kernel: tap700f6b80-66: entered promiscuous mode
Jan 31 04:17:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:40Z|01027|binding|INFO|Claiming lport 700f6b80-6690-412c-a1e0-2e1acd90551c for this chassis.
Jan 31 04:17:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:40Z|01028|binding|INFO|700f6b80-6690-412c-a1e0-2e1acd90551c: Claiming fa:16:3e:14:97:91 10.100.0.6
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.2354] manager: (tap700f6b80-66): new Tun device (/org/freedesktop/NetworkManager/Devices/454)
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.2503] manager: (patch-br-int-to-provnet-9633882b-fa09-4c13-9ab8-69ba69661845): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/455)
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.2509] manager: (patch-provnet-9633882b-fa09-4c13-9ab8-69ba69661845-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/456)
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.256 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:97:91 10.100.0.6'], port_security=['fa:16:3e:14:97:91 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7f870e76-5f0a-46db-b844-1788cc790f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390709b3e5174dc4afdc6b04fdae67e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41fc052d-b084-4adc-a493-522d3d569e9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c4a3744-72a7-4358-ad8b-910d4ad4af10, chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=700f6b80-6690-412c-a1e0-2e1acd90551c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.258 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 700f6b80-6690-412c-a1e0-2e1acd90551c in datapath 3f3cc872-5825-455b-b8f4-03469e3aacf8 bound to our chassis#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.260 139793 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f3cc872-5825-455b-b8f4-03469e3aacf8#033[00m
Jan 31 04:17:40 np0005603622 systemd-udevd[310260]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:17:40 np0005603622 systemd-machined[190406]: New machine qemu-106-instance-000000e0.
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.277 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8cd67c-26f8-4fe0-89aa-05d1fd0364a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.279 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f3cc872-51 in ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.280 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.2852] device (tap700f6b80-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.2861] device (tap700f6b80-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.283 223997 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f3cc872-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.284 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d6eedb-10fb-4b63-959f-4887d6b2fa54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.287 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[d48a7ae1-938a-41c8-a944-e72faf35a949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 systemd[1]: Started Virtual Machine qemu-106-instance-000000e0.
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.292 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:40Z|01029|binding|INFO|Setting lport 700f6b80-6690-412c-a1e0-2e1acd90551c ovn-installed in OVS
Jan 31 04:17:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:40Z|01030|binding|INFO|Setting lport 700f6b80-6690-412c-a1e0-2e1acd90551c up in Southbound
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.302 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[4786c5ad-6340-401d-b1c0-7a1daabfbead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.315 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[94626400-32c2-4f5a-b642-0cf087af490d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.348 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[d0325005-5e09-4858-85eb-df2d5e0a3254]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.354 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[a04755f2-d06e-496a-8ba0-7a4119bb669c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.3554] manager: (tap3f3cc872-50): new Veth device (/org/freedesktop/NetworkManager/Devices/457)
Jan 31 04:17:40 np0005603622 systemd-udevd[310264]: Network interface NamePolicy= disabled on kernel command line.
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.383 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1bd346-0ece-469a-9a60-1e032c09c610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.388 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[b730cf3f-9b06-4934-a82f-59ba19a2cadc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.4123] device (tap3f3cc872-50): carrier: link connected
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.418 224184 DEBUG oslo.privsep.daemon [-] privsep: reply[7e7ebaf6-74e2-45e4-9375-0dd439c77527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.437 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[7c683bb4-1999-41b4-88b8-8c08ba0e3b9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f3cc872-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:17:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 308], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1031175, 'reachable_time': 16282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310294, 'error': None, 'target': 'ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.449 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bb24decc-ae66-4689-9a70-ce6d6b7cb347]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:17ea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1031175, 'tstamp': 1031175}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 310295, 'error': None, 'target': 'ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:40.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.469 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[39858715-2c68-4c64-82e1-f2ce8cf6d1e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f3cc872-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:17:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 308], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1031175, 'reachable_time': 16282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 310296, 'error': None, 'target': 'ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.495 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[639979b6-3341-41dc-8559-2839f7859366]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.551 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[546c8418-902f-4883-ac44-4c7e93ee5363]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.553 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f3cc872-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.554 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.554 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f3cc872-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:40 np0005603622 NetworkManager[49080]: <info>  [1769851060.5576] manager: (tap3f3cc872-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/458)
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.556 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 kernel: tap3f3cc872-50: entered promiscuous mode
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.561 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.562 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f3cc872-50, col_values=(('external_ids', {'iface-id': '5bebd274-c8f9-4e5f-96fa-6c8eecac7fa3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:17:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:40Z|01031|binding|INFO|Releasing lport 5bebd274-c8f9-4e5f-96fa-6c8eecac7fa3 from this chassis (sb_readonly=0)
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.565 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.572 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.574 139793 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f3cc872-5825-455b-b8f4-03469e3aacf8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f3cc872-5825-455b-b8f4-03469e3aacf8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.575 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[37eb54bf-4a5b-40ee-bef8-120453ffc4a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.576 139793 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: global
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    log         /dev/log local0 debug
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    log-tag     haproxy-metadata-proxy-3f3cc872-5825-455b-b8f4-03469e3aacf8
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    user        root
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    group       root
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    maxconn     1024
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    pidfile     /var/lib/neutron/external/pids/3f3cc872-5825-455b-b8f4-03469e3aacf8.pid.haproxy
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    daemon
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: defaults
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    log global
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    mode http
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    option httplog
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    option dontlognull
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    option http-server-close
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    option forwardfor
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    retries                 3
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    timeout http-request    30s
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    timeout connect         30s
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    timeout client          32s
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    timeout server          32s
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    timeout http-keep-alive 30s
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: listen listener
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    bind 169.254.169.254:80
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    server metadata /var/lib/neutron/metadata_proxy
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]:    http-request add-header X-OVN-Network-ID 3f3cc872-5825-455b-b8f4-03469e3aacf8
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 31 04:17:40 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:17:40.577 139793 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'env', 'PROCESS_TAG=haproxy-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f3cc872-5825-455b-b8f4-03469e3aacf8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.587 221324 DEBUG nova.compute.manager [req-cf0f60e4-bb53-44bc-b0f7-1eb57c7d1d84 req-9ec08233-644f-416c-9fc3-2917d501a9bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.588 221324 DEBUG oslo_concurrency.lockutils [req-cf0f60e4-bb53-44bc-b0f7-1eb57c7d1d84 req-9ec08233-644f-416c-9fc3-2917d501a9bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.589 221324 DEBUG oslo_concurrency.lockutils [req-cf0f60e4-bb53-44bc-b0f7-1eb57c7d1d84 req-9ec08233-644f-416c-9fc3-2917d501a9bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.589 221324 DEBUG oslo_concurrency.lockutils [req-cf0f60e4-bb53-44bc-b0f7-1eb57c7d1d84 req-9ec08233-644f-416c-9fc3-2917d501a9bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.589 221324 DEBUG nova.compute.manager [req-cf0f60e4-bb53-44bc-b0f7-1eb57c7d1d84 req-9ec08233-644f-416c-9fc3-2917d501a9bb fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Processing event network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.920 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:17:40 np0005603622 nova_compute[221301]: 2026-01-31 09:17:40.955 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:41 np0005603622 podman[310328]: 2026-01-31 09:17:40.922494017 +0000 UTC m=+0.026415059 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 31 04:17:41 np0005603622 podman[310328]: 2026-01-31 09:17:41.112558365 +0000 UTC m=+0.216479377 container create 4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.189 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769851061.1890693, 7f870e76-5f0a-46db-b844-1788cc790f7c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.190 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] VM Started (Lifecycle Event)#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.191 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.195 221324 DEBUG nova.virt.libvirt.driver [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.199 221324 INFO nova.virt.libvirt.driver [-] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Instance spawned successfully.#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.199 221324 INFO nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Took 10.40 seconds to spawn the instance on the hypervisor.#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.199 221324 DEBUG nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.207 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.210 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:17:41 np0005603622 systemd[1]: Started libpod-conmon-4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5.scope.
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.244 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.244 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769851061.1891813, 7f870e76-5f0a-46db-b844-1788cc790f7c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.244 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] VM Paused (Lifecycle Event)#033[00m
Jan 31 04:17:41 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:17:41 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e09bafae55bb2d568472bfa0b80ad1ed41a1831021bc413e096ad41062947f6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.280 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.282 221324 INFO nova.compute.manager [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Took 11.37 seconds to build instance.#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.286 221324 DEBUG nova.virt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Emitting event <LifecycleEvent: 1769851061.1937323, 7f870e76-5f0a-46db-b844-1788cc790f7c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.287 221324 INFO nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] VM Resumed (Lifecycle Event)#033[00m
Jan 31 04:17:41 np0005603622 podman[310328]: 2026-01-31 09:17:41.293607454 +0000 UTC m=+0.397528436 container init 4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 31 04:17:41 np0005603622 podman[310328]: 2026-01-31 09:17:41.300403414 +0000 UTC m=+0.404324386 container start 4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.309 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.310 221324 DEBUG oslo_concurrency.lockutils [None req-df5a0f7e-5c0a-48a4-b24e-ff47ac24202f 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:41 np0005603622 nova_compute[221301]: 2026-01-31 09:17:41.314 221324 DEBUG nova.compute.manager [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 31 04:17:41 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [NOTICE]   (310389) : New worker (310391) forked
Jan 31 04:17:41 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [NOTICE]   (310389) : Loading success.
Jan 31 04:17:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:41.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:42.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:42 np0005603622 nova_compute[221301]: 2026-01-31 09:17:42.676 221324 DEBUG nova.compute.manager [req-201a0992-6dc0-4192-a467-0f7039cf95ef req-3b0e3ca3-d856-4e69-9df3-f2e6056fe46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:17:42 np0005603622 nova_compute[221301]: 2026-01-31 09:17:42.677 221324 DEBUG oslo_concurrency.lockutils [req-201a0992-6dc0-4192-a467-0f7039cf95ef req-3b0e3ca3-d856-4e69-9df3-f2e6056fe46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:17:42 np0005603622 nova_compute[221301]: 2026-01-31 09:17:42.677 221324 DEBUG oslo_concurrency.lockutils [req-201a0992-6dc0-4192-a467-0f7039cf95ef req-3b0e3ca3-d856-4e69-9df3-f2e6056fe46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:17:42 np0005603622 nova_compute[221301]: 2026-01-31 09:17:42.677 221324 DEBUG oslo_concurrency.lockutils [req-201a0992-6dc0-4192-a467-0f7039cf95ef req-3b0e3ca3-d856-4e69-9df3-f2e6056fe46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:17:42 np0005603622 nova_compute[221301]: 2026-01-31 09:17:42.678 221324 DEBUG nova.compute.manager [req-201a0992-6dc0-4192-a467-0f7039cf95ef req-3b0e3ca3-d856-4e69-9df3-f2e6056fe46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] No waiting events found dispatching network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:17:42 np0005603622 nova_compute[221301]: 2026-01-31 09:17:42.678 221324 WARNING nova.compute.manager [req-201a0992-6dc0-4192-a467-0f7039cf95ef req-3b0e3ca3-d856-4e69-9df3-f2e6056fe46c fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received unexpected event network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c for instance with vm_state active and task_state None.#033[00m
Jan 31 04:17:43 np0005603622 nova_compute[221301]: 2026-01-31 09:17:43.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:43.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:44.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:45 np0005603622 nova_compute[221301]: 2026-01-31 09:17:45.958 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:45.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:46.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:46 np0005603622 nova_compute[221301]: 2026-01-31 09:17:46.755 221324 DEBUG nova.compute.manager [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-changed-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:17:46 np0005603622 nova_compute[221301]: 2026-01-31 09:17:46.756 221324 DEBUG nova.compute.manager [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Refreshing instance network info cache due to event network-changed-700f6b80-6690-412c-a1e0-2e1acd90551c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:17:46 np0005603622 nova_compute[221301]: 2026-01-31 09:17:46.756 221324 DEBUG oslo_concurrency.lockutils [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:17:46 np0005603622 nova_compute[221301]: 2026-01-31 09:17:46.757 221324 DEBUG oslo_concurrency.lockutils [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:17:46 np0005603622 nova_compute[221301]: 2026-01-31 09:17:46.757 221324 DEBUG nova.network.neutron [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Refreshing network info cache for port 700f6b80-6690-412c-a1e0-2e1acd90551c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:17:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:47.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:48 np0005603622 nova_compute[221301]: 2026-01-31 09:17:48.337 221324 DEBUG nova.network.neutron [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updated VIF entry in instance network info cache for port 700f6b80-6690-412c-a1e0-2e1acd90551c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:17:48 np0005603622 nova_compute[221301]: 2026-01-31 09:17:48.338 221324 DEBUG nova.network.neutron [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updating instance_info_cache with network_info: [{"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:17:48 np0005603622 nova_compute[221301]: 2026-01-31 09:17:48.359 221324 DEBUG oslo_concurrency.lockutils [req-04d1997a-7421-4359-8457-92314d4d34bc req-9f0a265d-9433-407e-81a2-856956ff91d9 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:17:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:48.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:48 np0005603622 nova_compute[221301]: 2026-01-31 09:17:48.514 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:49.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:50.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:50 np0005603622 nova_compute[221301]: 2026-01-31 09:17:50.960 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:51 np0005603622 podman[310402]: 2026-01-31 09:17:51.132989008 +0000 UTC m=+0.047103847 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 31 04:17:51 np0005603622 podman[310401]: 2026-01-31 09:17:51.154134157 +0000 UTC m=+0.074159162 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 04:17:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:51.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:52.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:53 np0005603622 nova_compute[221301]: 2026-01-31 09:17:53.516 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:53.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:54.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:54 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:54Z|00138|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.6
Jan 31 04:17:54 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:54Z|00139|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:14:97:91 10.100.0.6
Jan 31 04:17:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:17:55 np0005603622 nova_compute[221301]: 2026-01-31 09:17:55.966 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:56.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:17:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:56.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:17:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:17:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:17:58.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:17:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:17:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:17:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:17:58.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:17:58 np0005603622 nova_compute[221301]: 2026-01-31 09:17:58.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:17:59 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:59Z|00140|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.6
Jan 31 04:17:59 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:59Z|00141|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:14:97:91 10.100.0.6
Jan 31 04:17:59 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:59Z|00142|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:14:97:91 10.100.0.6
Jan 31 04:17:59 np0005603622 ovn_controller[130109]: 2026-01-31T09:17:59Z|00143|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:14:97:91 10.100.0.6
Jan 31 04:18:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:00.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:00.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:00 np0005603622 nova_compute[221301]: 2026-01-31 09:18:00.969 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:02.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:02.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:03 np0005603622 nova_compute[221301]: 2026-01-31 09:18:03.520 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:04.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:04.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:05 np0005603622 nova_compute[221301]: 2026-01-31 09:18:05.972 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:06.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:06.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:08.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:08 np0005603622 nova_compute[221301]: 2026-01-31 09:18:08.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:08.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:08 np0005603622 nova_compute[221301]: 2026-01-31 09:18:08.522 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:10.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:10.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:10 np0005603622 nova_compute[221301]: 2026-01-31 09:18:10.974 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:12.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:12.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:13 np0005603622 nova_compute[221301]: 2026-01-31 09:18:13.525 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:14.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:14.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:15 np0005603622 nova_compute[221301]: 2026-01-31 09:18:15.976 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:16.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:16 np0005603622 ovn_controller[130109]: 2026-01-31T09:18:16Z|01032|memory_trim|INFO|Detected inactivity (last active 30026 ms ago): trimming memory
Jan 31 04:18:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:16.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:18.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:18.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:18 np0005603622 nova_compute[221301]: 2026-01-31 09:18:18.527 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.081 221324 DEBUG oslo_concurrency.lockutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.082 221324 DEBUG oslo_concurrency.lockutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.100 221324 DEBUG nova.objects.instance [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lazy-loading 'flavor' on Instance uuid 7f870e76-5f0a-46db-b844-1788cc790f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.134 221324 DEBUG oslo_concurrency.lockutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.339 221324 DEBUG oslo_concurrency.lockutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.339 221324 DEBUG oslo_concurrency.lockutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.340 221324 INFO nova.compute.manager [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Attaching volume 9c54af70-f6c6-42c1-90a6-95ba3f7c6db1 to /dev/vdb#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.455 221324 DEBUG os_brick.utils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.459 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.484 228913 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.484 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[e643d6cf-2a99-4778-9006-91ae6df4a6ab]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.486 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.497 228913 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.497 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b75973-aed8-4f5f-b01a-d3721c8d0e13]: (4, ('InitiatorName=iqn.1994-05.com.redhat:a7d972b74f57', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.499 228913 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.510 228913 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.510 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[1e8bf6c7-f727-414d-9072-d4e4d2d22451]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.513 228913 DEBUG oslo.privsep.daemon [-] privsep: reply[d7acbfba-f9d4-433c-89da-ec4ceb17d9b8]: (4, 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.513 221324 DEBUG oslo_concurrency.processutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.547 221324 DEBUG oslo_concurrency.processutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CMD "nvme version" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.550 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.551 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.551 221324 DEBUG os_brick.initiator.connectors.lightos [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822 dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.552 221324 DEBUG os_brick.utils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] <== get_connector_properties: return (95ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:a7d972b74f57', 'do_local_attach': False, 'nvme_hostid': '0156c751-d05d-449e-959d-30f482d5b822', 'system uuid': 'ff96e0bf-56de-4f3e-ba96-02bb86eece1f', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:0156c751-d05d-449e-959d-30f482d5b822', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 31 04:18:19 np0005603622 nova_compute[221301]: 2026-01-31 09:18:19.552 221324 DEBUG nova.virt.block_device [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updating existing volume attachment record: 2ae9d449-0fd6-46de-870e-c46c37894cb9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 31 04:18:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:20.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 31 04:18:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2457238487' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.233 221324 DEBUG nova.objects.instance [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lazy-loading 'flavor' on Instance uuid 7f870e76-5f0a-46db-b844-1788cc790f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.258 221324 DEBUG nova.virt.libvirt.driver [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Attempting to attach volume 9c54af70-f6c6-42c1-90a6-95ba3f7c6db1 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.262 221324 DEBUG nova.virt.libvirt.guest [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] attach device xml: <disk type="network" device="disk">
Jan 31 04:18:20 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-9c54af70-f6c6-42c1-90a6-95ba3f7c6db1">
Jan 31 04:18:20 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:  <auth username="openstack">
Jan 31 04:18:20 np0005603622 nova_compute[221301]:    <secret type="ceph" uuid="2f5ab832-5f2e-5a84-bd93-cf8bab960ee2"/>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:  </auth>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:18:20 np0005603622 nova_compute[221301]:  <serial>9c54af70-f6c6-42c1-90a6-95ba3f7c6db1</serial>
Jan 31 04:18:20 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:18:20 np0005603622 nova_compute[221301]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 31 04:18:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:20.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.629 221324 DEBUG nova.virt.libvirt.driver [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.630 221324 DEBUG nova.virt.libvirt.driver [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.630 221324 DEBUG nova.virt.libvirt.driver [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.631 221324 DEBUG nova.virt.libvirt.driver [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] No VIF found with MAC fa:16:3e:14:97:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.808 221324 DEBUG oslo_concurrency.lockutils [None req-3aa5e59c-46c6-4a37-b4cb-83a871ddc3a2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:20 np0005603622 nova_compute[221301]: 2026-01-31 09:18:20.978 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:22.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:22 np0005603622 podman[310475]: 2026-01-31 09:18:22.142660473 +0000 UTC m=+0.060966313 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Jan 31 04:18:22 np0005603622 podman[310474]: 2026-01-31 09:18:22.189073891 +0000 UTC m=+0.108704466 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127)
Jan 31 04:18:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:22.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.211 221324 DEBUG oslo_concurrency.lockutils [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.212 221324 DEBUG oslo_concurrency.lockutils [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.232 221324 INFO nova.compute.manager [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Detaching volume 9c54af70-f6c6-42c1-90a6-95ba3f7c6db1#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.371 221324 INFO nova.virt.block_device [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Attempting to driver detach volume 9c54af70-f6c6-42c1-90a6-95ba3f7c6db1 from mountpoint /dev/vdb#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.385 221324 DEBUG nova.virt.libvirt.driver [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Attempting to detach device vdb from instance 7f870e76-5f0a-46db-b844-1788cc790f7c from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.386 221324 DEBUG nova.virt.libvirt.guest [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-9c54af70-f6c6-42c1-90a6-95ba3f7c6db1">
Jan 31 04:18:23 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <serial>9c54af70-f6c6-42c1-90a6-95ba3f7c6db1</serial>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:18:23 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.398 221324 INFO nova.virt.libvirt.driver [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Successfully detached device vdb from instance 7f870e76-5f0a-46db-b844-1788cc790f7c from the persistent domain config.#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.398 221324 DEBUG nova.virt.libvirt.driver [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 7f870e76-5f0a-46db-b844-1788cc790f7c from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.399 221324 DEBUG nova.virt.libvirt.guest [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] detach device xml: <disk type="network" device="disk">
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <source protocol="rbd" name="volumes/volume-9c54af70-f6c6-42c1-90a6-95ba3f7c6db1">
Jan 31 04:18:23 np0005603622 nova_compute[221301]:    <host name="192.168.122.100" port="6789"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:    <host name="192.168.122.102" port="6789"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:    <host name="192.168.122.101" port="6789"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  </source>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <target dev="vdb" bus="virtio"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <serial>9c54af70-f6c6-42c1-90a6-95ba3f7c6db1</serial>
Jan 31 04:18:23 np0005603622 nova_compute[221301]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 31 04:18:23 np0005603622 nova_compute[221301]: </disk>
Jan 31 04:18:23 np0005603622 nova_compute[221301]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.525 221324 DEBUG nova.virt.libvirt.driver [None req-8e25cb84-bfda-45ba-a5c8-6d4b52c150ff - - - - - -] Received event <DeviceRemovedEvent: 1769851103.5249846, 7f870e76-5f0a-46db-b844-1788cc790f7c => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.527 221324 DEBUG nova.virt.libvirt.driver [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 7f870e76-5f0a-46db-b844-1788cc790f7c _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.529 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.533 221324 INFO nova.virt.libvirt.driver [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Successfully detached device vdb from instance 7f870e76-5f0a-46db-b844-1788cc790f7c from the live domain config.#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.697 221324 DEBUG nova.objects.instance [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lazy-loading 'flavor' on Instance uuid 7f870e76-5f0a-46db-b844-1788cc790f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:18:23 np0005603622 nova_compute[221301]: 2026-01-31 09:18:23.754 221324 DEBUG oslo_concurrency.lockutils [None req-0e2210fe-d9c9-4a98-b7ef-550579cb28b2 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:24.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:24.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:25 np0005603622 nova_compute[221301]: 2026-01-31 09:18:25.982 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:26.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:26.439 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=102, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=101) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:18:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:26.441 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 31 04:18:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:26.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.642 221324 DEBUG nova.compute.manager [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-changed-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.643 221324 DEBUG nova.compute.manager [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Refreshing instance network info cache due to event network-changed-700f6b80-6690-412c-a1e0-2e1acd90551c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.644 221324 DEBUG oslo_concurrency.lockutils [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.644 221324 DEBUG oslo_concurrency.lockutils [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquired lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.645 221324 DEBUG nova.network.neutron [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Refreshing network info cache for port 700f6b80-6690-412c-a1e0-2e1acd90551c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.709 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.710 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.711 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.711 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.712 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.714 221324 INFO nova.compute.manager [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Terminating instance#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.716 221324 DEBUG nova.compute.manager [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 31 04:18:26 np0005603622 kernel: tap700f6b80-66 (unregistering): left promiscuous mode
Jan 31 04:18:26 np0005603622 NetworkManager[49080]: <info>  [1769851106.7853] device (tap700f6b80-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 31 04:18:26 np0005603622 ovn_controller[130109]: 2026-01-31T09:18:26Z|01033|binding|INFO|Releasing lport 700f6b80-6690-412c-a1e0-2e1acd90551c from this chassis (sb_readonly=0)
Jan 31 04:18:26 np0005603622 ovn_controller[130109]: 2026-01-31T09:18:26Z|01034|binding|INFO|Setting lport 700f6b80-6690-412c-a1e0-2e1acd90551c down in Southbound
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.796 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 ovn_controller[130109]: 2026-01-31T09:18:26Z|01035|binding|INFO|Removing iface tap700f6b80-66 ovn-installed in OVS
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.799 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:26.808 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:97:91 10.100.0.6'], port_security=['fa:16:3e:14:97:91 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7f870e76-5f0a-46db-b844-1788cc790f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390709b3e5174dc4afdc6b04fdae67e3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41fc052d-b084-4adc-a493-522d3d569e9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c4a3744-72a7-4358-ad8b-910d4ad4af10, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>], logical_port=700f6b80-6690-412c-a1e0-2e1acd90551c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f018ea98640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.811 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:26.812 139793 INFO neutron.agent.ovn.metadata.agent [-] Port 700f6b80-6690-412c-a1e0-2e1acd90551c in datapath 3f3cc872-5825-455b-b8f4-03469e3aacf8 unbound from our chassis#033[00m
Jan 31 04:18:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:26.814 139793 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f3cc872-5825-455b-b8f4-03469e3aacf8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 31 04:18:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:26.816 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[e53d54e8-a6ad-4c17-8569-6ff44f78cabd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:26 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:26.818 139793 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8 namespace which is not needed anymore#033[00m
Jan 31 04:18:26 np0005603622 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000e0.scope: Deactivated successfully.
Jan 31 04:18:26 np0005603622 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000e0.scope: Consumed 16.273s CPU time.
Jan 31 04:18:26 np0005603622 systemd-machined[190406]: Machine qemu-106-instance-000000e0 terminated.
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.939 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.946 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.957 221324 INFO nova.virt.libvirt.driver [-] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Instance destroyed successfully.#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.958 221324 DEBUG nova.objects.instance [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lazy-loading 'resources' on Instance uuid 7f870e76-5f0a-46db-b844-1788cc790f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.974 221324 DEBUG nova.virt.libvirt.vif [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-31T09:17:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestStampPattern-server-1868417645',display_name='tempest-TestStampPattern-server-1868417645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-teststamppattern-server-1868417645',id=224,image_ref='0d074876-a373-4784-8303-5b3716508074',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD/aBEl4PBIoIshqTDNOjhhhoUeVGicNgguOr3MSdHgT0ltB1LqQrhXegMG9XeJiExk1ZCoew1VKJC1u0bcRchycTTDnBsbTgXLKYBMMmensD0uk0uwm2aHKUBZ2bnKdBA==',key_name='tempest-TestStampPattern-80801695',keypairs=<?>,launch_index=0,launched_at=2026-01-31T09:17:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390709b3e5174dc4afdc6b04fdae67e3',ramdisk_id='',reservation_id='r-0hnq2b00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='37c0ea6b-d0de-4997-aa0a-7a7de3dd2f16',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='4e6fd1c3-3988-4a7f-a30d-b599226c25a0',image_min_disk='1',image_min_ram='0',image_owner_id='390709b3e5174dc4afdc6b04fdae67e3',image_owner_project_name='tempest-TestStampPattern-1434051857',image_owner_user_name='tempest-TestStampPattern-1434051857-project-member',image_user_id='4f25568607234a398bc35cbb67eb406f',owner_project_name='tempest-TestStampPattern-1434051857',owner_user_name='tempest-TestStampPattern-1434051857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-31T09:17:41Z,user_data=None,user_id='4f25568607234a398bc35cbb67eb406f',uuid=7f870e76-5f0a-46db-b844-1788cc790f7c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.975 221324 DEBUG nova.network.os_vif_util [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Converting VIF {"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.978 221324 DEBUG nova.network.os_vif_util [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:97:91,bridge_name='br-int',has_traffic_filtering=True,id=700f6b80-6690-412c-a1e0-2e1acd90551c,network=Network(3f3cc872-5825-455b-b8f4-03469e3aacf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap700f6b80-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.979 221324 DEBUG os_vif [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:97:91,bridge_name='br-int',has_traffic_filtering=True,id=700f6b80-6690-412c-a1e0-2e1acd90551c,network=Network(3f3cc872-5825-455b-b8f4-03469e3aacf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap700f6b80-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.981 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.982 221324 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap700f6b80-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.984 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:26 np0005603622 nova_compute[221301]: 2026-01-31 09:18:26.989 221324 INFO os_vif [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:97:91,bridge_name='br-int',has_traffic_filtering=True,id=700f6b80-6690-412c-a1e0-2e1acd90551c,network=Network(3f3cc872-5825-455b-b8f4-03469e3aacf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap700f6b80-66')#033[00m
Jan 31 04:18:27 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [NOTICE]   (310389) : haproxy version is 2.8.14-c23fe91
Jan 31 04:18:27 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [NOTICE]   (310389) : path to executable is /usr/sbin/haproxy
Jan 31 04:18:27 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [WARNING]  (310389) : Exiting Master process...
Jan 31 04:18:27 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [WARNING]  (310389) : Exiting Master process...
Jan 31 04:18:27 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [ALERT]    (310389) : Current worker (310391) exited with code 143 (Terminated)
Jan 31 04:18:27 np0005603622 neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8[310385]: [WARNING]  (310389) : All workers exited. Exiting... (0)
Jan 31 04:18:27 np0005603622 systemd[1]: libpod-4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5.scope: Deactivated successfully.
Jan 31 04:18:27 np0005603622 podman[310550]: 2026-01-31 09:18:27.10659907 +0000 UTC m=+0.173982933 container died 4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.619 221324 DEBUG nova.compute.manager [req-4e640f5c-0efb-4089-86ac-36f90e960ffa req-c5670696-03d2-455f-b391-dce0da8e39b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-vif-unplugged-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.619 221324 DEBUG oslo_concurrency.lockutils [req-4e640f5c-0efb-4089-86ac-36f90e960ffa req-c5670696-03d2-455f-b391-dce0da8e39b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.620 221324 DEBUG oslo_concurrency.lockutils [req-4e640f5c-0efb-4089-86ac-36f90e960ffa req-c5670696-03d2-455f-b391-dce0da8e39b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.620 221324 DEBUG oslo_concurrency.lockutils [req-4e640f5c-0efb-4089-86ac-36f90e960ffa req-c5670696-03d2-455f-b391-dce0da8e39b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.621 221324 DEBUG nova.compute.manager [req-4e640f5c-0efb-4089-86ac-36f90e960ffa req-c5670696-03d2-455f-b391-dce0da8e39b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] No waiting events found dispatching network-vif-unplugged-700f6b80-6690-412c-a1e0-2e1acd90551c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:18:27 np0005603622 nova_compute[221301]: 2026-01-31 09:18:27.621 221324 DEBUG nova.compute.manager [req-4e640f5c-0efb-4089-86ac-36f90e960ffa req-c5670696-03d2-455f-b391-dce0da8e39b8 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-vif-unplugged-700f6b80-6690-412c-a1e0-2e1acd90551c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 31 04:18:27 np0005603622 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5-userdata-shm.mount: Deactivated successfully.
Jan 31 04:18:27 np0005603622 systemd[1]: var-lib-containers-storage-overlay-7e09bafae55bb2d568472bfa0b80ad1ed41a1831021bc413e096ad41062947f6-merged.mount: Deactivated successfully.
Jan 31 04:18:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:28.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:28 np0005603622 podman[310550]: 2026-01-31 09:18:28.186701331 +0000 UTC m=+1.254085214 container cleanup 4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Jan 31 04:18:28 np0005603622 systemd[1]: libpod-conmon-4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5.scope: Deactivated successfully.
Jan 31 04:18:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:28.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:28 np0005603622 nova_compute[221301]: 2026-01-31 09:18:28.623 221324 DEBUG nova.network.neutron [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updated VIF entry in instance network info cache for port 700f6b80-6690-412c-a1e0-2e1acd90551c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 31 04:18:28 np0005603622 nova_compute[221301]: 2026-01-31 09:18:28.624 221324 DEBUG nova.network.neutron [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updating instance_info_cache with network_info: [{"id": "700f6b80-6690-412c-a1e0-2e1acd90551c", "address": "fa:16:3e:14:97:91", "network": {"id": "3f3cc872-5825-455b-b8f4-03469e3aacf8", "bridge": "br-int", "label": "tempest-TestStampPattern-915378736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390709b3e5174dc4afdc6b04fdae67e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap700f6b80-66", "ovs_interfaceid": "700f6b80-6690-412c-a1e0-2e1acd90551c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:18:28 np0005603622 nova_compute[221301]: 2026-01-31 09:18:28.650 221324 DEBUG oslo_concurrency.lockutils [req-3c204c36-a748-4a04-a0e0-2e4e05c17f8e req-23802499-ee58-45bc-8812-ea317ba6f4cd fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Releasing lock "refresh_cache-7f870e76-5f0a-46db-b844-1788cc790f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 31 04:18:28 np0005603622 podman[310612]: 2026-01-31 09:18:28.724477977 +0000 UTC m=+0.514499981 container remove 4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.730 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[59aabd0b-9c4e-425a-bb10-029a19efb6d8]: (4, ('Sat Jan 31 09:18:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8 (4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5)\n4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5\nSat Jan 31 09:18:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8 (4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5)\n4b1cea684ad65713a22ad36aa6a31b50a7a84afa8d296354dd7be3607e74b8d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.732 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[1df037be-308f-4f89-b164-8fc9e515eb45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.733 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f3cc872-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:18:28 np0005603622 nova_compute[221301]: 2026-01-31 09:18:28.736 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:28 np0005603622 kernel: tap3f3cc872-50: left promiscuous mode
Jan 31 04:18:28 np0005603622 nova_compute[221301]: 2026-01-31 09:18:28.743 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:28 np0005603622 nova_compute[221301]: 2026-01-31 09:18:28.744 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.746 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b21fed-a5de-4b48-a012-0ade7f22ac4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.762 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef9397a-c9f5-4b77-9325-a4a13c96d600]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.763 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[bf474466-4894-4e8e-8bc4-8706953b1115]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.780 223997 DEBUG oslo.privsep.daemon [-] privsep: reply[72057994-81f5-4b6a-ba7f-0b7630ed316e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1031168, 'reachable_time': 22263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310626, 'error': None, 'target': 'ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.784 140181 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f3cc872-5825-455b-b8f4-03469e3aacf8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 31 04:18:28 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:28.784 140181 DEBUG oslo.privsep.daemon [-] privsep: reply[04eea0f0-8c39-4bad-bd0c-291d4f770b97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 31 04:18:28 np0005603622 systemd[1]: run-netns-ovnmeta\x2d3f3cc872\x2d5825\x2d455b\x2db8f4\x2d03469e3aacf8.mount: Deactivated successfully.
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:29 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:29.444 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '102'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.882 221324 INFO nova.virt.libvirt.driver [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Deleting instance files /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c_del#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.883 221324 INFO nova.virt.libvirt.driver [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Deletion of /var/lib/nova/instances/7f870e76-5f0a-46db-b844-1788cc790f7c_del complete#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.942 221324 DEBUG nova.compute.manager [req-5a43190a-22e3-4df2-9676-c8151aa48367 req-03b77844-6773-4b48-834b-9f3fc823b88d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.942 221324 DEBUG oslo_concurrency.lockutils [req-5a43190a-22e3-4df2-9676-c8151aa48367 req-03b77844-6773-4b48-834b-9f3fc823b88d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Acquiring lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.943 221324 DEBUG oslo_concurrency.lockutils [req-5a43190a-22e3-4df2-9676-c8151aa48367 req-03b77844-6773-4b48-834b-9f3fc823b88d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.943 221324 DEBUG oslo_concurrency.lockutils [req-5a43190a-22e3-4df2-9676-c8151aa48367 req-03b77844-6773-4b48-834b-9f3fc823b88d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.943 221324 DEBUG nova.compute.manager [req-5a43190a-22e3-4df2-9676-c8151aa48367 req-03b77844-6773-4b48-834b-9f3fc823b88d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] No waiting events found dispatching network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 31 04:18:29 np0005603622 nova_compute[221301]: 2026-01-31 09:18:29.944 221324 WARNING nova.compute.manager [req-5a43190a-22e3-4df2-9676-c8151aa48367 req-03b77844-6773-4b48-834b-9f3fc823b88d fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received unexpected event network-vif-plugged-700f6b80-6690-412c-a1e0-2e1acd90551c for instance with vm_state active and task_state deleting.#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.021 221324 INFO nova.compute.manager [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Took 3.30 seconds to destroy the instance on the hypervisor.#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.021 221324 DEBUG oslo.service.loopingcall [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.022 221324 DEBUG nova.compute.manager [-] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.022 221324 DEBUG nova.network.neutron [-] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 31 04:18:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:30.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.459 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.460 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:18:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:30.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:30.935 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:30.936 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:18:30.936 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:30 np0005603622 nova_compute[221301]: 2026-01-31 09:18:30.983 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:31 np0005603622 nova_compute[221301]: 2026-01-31 09:18:31.667 221324 DEBUG nova.network.neutron [-] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 31 04:18:31 np0005603622 nova_compute[221301]: 2026-01-31 09:18:31.687 221324 INFO nova.compute.manager [-] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Took 1.67 seconds to deallocate network for instance.#033[00m
Jan 31 04:18:31 np0005603622 nova_compute[221301]: 2026-01-31 09:18:31.735 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:31 np0005603622 nova_compute[221301]: 2026-01-31 09:18:31.736 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:31 np0005603622 nova_compute[221301]: 2026-01-31 09:18:31.804 221324 DEBUG nova.compute.manager [req-c3967475-cbe7-4dea-9f4e-c072e88d9d33 req-020226bb-7d14-4cb3-9ca6-1b02a6a4cdf2 fe07f3865a614b72984f087353419503 cb417a6cb57d4693a4fbcf628ac13cdc - - default default] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Received event network-vif-deleted-700f6b80-6690-412c-a1e0-2e1acd90551c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 31 04:18:31 np0005603622 nova_compute[221301]: 2026-01-31 09:18:31.911 221324 DEBUG oslo_concurrency.processutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:31 np0005603622 nova_compute[221301]: 2026-01-31 09:18:31.986 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:32.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:18:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1082888484' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:18:32 np0005603622 nova_compute[221301]: 2026-01-31 09:18:32.421 221324 DEBUG oslo_concurrency.processutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:32 np0005603622 nova_compute[221301]: 2026-01-31 09:18:32.429 221324 DEBUG nova.compute.provider_tree [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:18:32 np0005603622 nova_compute[221301]: 2026-01-31 09:18:32.480 221324 DEBUG nova.scheduler.client.report [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:18:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:32.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:32 np0005603622 nova_compute[221301]: 2026-01-31 09:18:32.546 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:32 np0005603622 nova_compute[221301]: 2026-01-31 09:18:32.630 221324 INFO nova.scheduler.client.report [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Deleted allocations for instance 7f870e76-5f0a-46db-b844-1788cc790f7c#033[00m
Jan 31 04:18:32 np0005603622 nova_compute[221301]: 2026-01-31 09:18:32.851 221324 DEBUG oslo_concurrency.lockutils [None req-578e4d15-7ffe-4645-ab77-e957b439a4a7 4f25568607234a398bc35cbb67eb406f 390709b3e5174dc4afdc6b04fdae67e3 - - default default] Lock "7f870e76-5f0a-46db-b844-1788cc790f7c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:33 np0005603622 nova_compute[221301]: 2026-01-31 09:18:33.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:34.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:34.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:35 np0005603622 nova_compute[221301]: 2026-01-31 09:18:35.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:35 np0005603622 nova_compute[221301]: 2026-01-31 09:18:35.985 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:36.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.457 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.457 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.457 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.458 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.458 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:36.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:18:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1303458499' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.912 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:18:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:18:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:18:36 np0005603622 nova_compute[221301]: 2026-01-31 09:18:36.990 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.099 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.101 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4243MB free_disk=20.94265365600586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.102 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.102 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.251 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.251 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.285 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:18:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e431 e431: 3 total, 3 up, 3 in
Jan 31 04:18:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:18:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3808140900' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.720 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.727 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.764 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.820 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:18:37 np0005603622 nova_compute[221301]: 2026-01-31 09:18:37.821 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:18:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:38.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:38.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:38 np0005603622 nova_compute[221301]: 2026-01-31 09:18:38.822 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:40.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:40.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:40 np0005603622 nova_compute[221301]: 2026-01-31 09:18:40.989 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:41 np0005603622 nova_compute[221301]: 2026-01-31 09:18:41.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:18:41 np0005603622 nova_compute[221301]: 2026-01-31 09:18:41.952 221324 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769851106.9513328, 7f870e76-5f0a-46db-b844-1788cc790f7c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 31 04:18:41 np0005603622 nova_compute[221301]: 2026-01-31 09:18:41.953 221324 INFO nova.compute.manager [-] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] VM Stopped (Lifecycle Event)#033[00m
Jan 31 04:18:41 np0005603622 nova_compute[221301]: 2026-01-31 09:18:41.996 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:42 np0005603622 nova_compute[221301]: 2026-01-31 09:18:42.004 221324 DEBUG nova.compute.manager [None req-e2e7083f-0b81-4f1a-860f-27442faa6479 - - - - - -] [instance: 7f870e76-5f0a-46db-b844-1788cc790f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 31 04:18:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:42.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:42.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:44.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:44.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:18:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:18:45 np0005603622 nova_compute[221301]: 2026-01-31 09:18:45.993 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:46.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e432 e432: 3 total, 3 up, 3 in
Jan 31 04:18:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:46.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:47 np0005603622 nova_compute[221301]: 2026-01-31 09:18:47.000 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:48.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e433 e433: 3 total, 3 up, 3 in
Jan 31 04:18:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:48.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:50.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:18:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:50.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:18:50 np0005603622 nova_compute[221301]: 2026-01-31 09:18:50.997 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:52 np0005603622 nova_compute[221301]: 2026-01-31 09:18:52.003 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:52.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:52.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:53 np0005603622 podman[310879]: 2026-01-31 09:18:53.177128263 +0000 UTC m=+0.084212028 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:18:53 np0005603622 podman[310878]: 2026-01-31 09:18:53.201771595 +0000 UTC m=+0.113276777 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:18:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 e434: 3 total, 3 up, 3 in
Jan 31 04:18:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:54.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:54.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:18:55 np0005603622 nova_compute[221301]: 2026-01-31 09:18:55.998 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:56.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:18:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:18:56 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/325590217' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:18:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:18:56 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/325590217' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:18:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:56.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:57 np0005603622 nova_compute[221301]: 2026-01-31 09:18:57.006 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:18:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:18:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:18:58.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:18:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:18:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:18:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:18:58.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:00.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:00.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:00 np0005603622 nova_compute[221301]: 2026-01-31 09:19:00.879 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:00 np0005603622 nova_compute[221301]: 2026-01-31 09:19:00.927 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:01 np0005603622 nova_compute[221301]: 2026-01-31 09:19:01.001 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:02 np0005603622 nova_compute[221301]: 2026-01-31 09:19:02.009 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:02.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:02.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:04.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:04.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:06 np0005603622 nova_compute[221301]: 2026-01-31 09:19:06.004 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:06.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:06.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:07 np0005603622 nova_compute[221301]: 2026-01-31 09:19:07.012 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:08.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:08.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:10.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:10.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:11 np0005603622 nova_compute[221301]: 2026-01-31 09:19:11.006 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:12 np0005603622 nova_compute[221301]: 2026-01-31 09:19:12.017 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:12.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:12.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:14.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:14.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:19:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3411258334' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:19:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:19:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3411258334' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:19:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:16 np0005603622 nova_compute[221301]: 2026-01-31 09:19:16.008 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:16.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:16.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:17 np0005603622 nova_compute[221301]: 2026-01-31 09:19:17.020 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:18.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:18.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:20.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:20.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:21 np0005603622 nova_compute[221301]: 2026-01-31 09:19:21.011 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:22 np0005603622 nova_compute[221301]: 2026-01-31 09:19:22.022 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:22.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:22.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:24.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:24 np0005603622 podman[310925]: 2026-01-31 09:19:24.183033528 +0000 UTC m=+0.102865723 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:19:24 np0005603622 podman[310926]: 2026-01-31 09:19:24.184064565 +0000 UTC m=+0.099892464 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:19:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:24.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:26 np0005603622 nova_compute[221301]: 2026-01-31 09:19:26.013 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:26.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:26.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:27 np0005603622 nova_compute[221301]: 2026-01-31 09:19:27.025 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:28.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:28 np0005603622 nova_compute[221301]: 2026-01-31 09:19:28.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:28 np0005603622 nova_compute[221301]: 2026-01-31 09:19:28.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:19:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:28.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:29 np0005603622 nova_compute[221301]: 2026-01-31 09:19:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:30.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:30 np0005603622 nova_compute[221301]: 2026-01-31 09:19:30.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:30.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:19:30.936 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:19:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:19:30.937 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:19:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:19:30.937 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:19:31 np0005603622 nova_compute[221301]: 2026-01-31 09:19:31.014 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:32 np0005603622 nova_compute[221301]: 2026-01-31 09:19:32.028 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:32.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:32 np0005603622 nova_compute[221301]: 2026-01-31 09:19:32.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:32 np0005603622 nova_compute[221301]: 2026-01-31 09:19:32.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:19:32 np0005603622 nova_compute[221301]: 2026-01-31 09:19:32.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:19:32 np0005603622 nova_compute[221301]: 2026-01-31 09:19:32.419 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:19:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:32.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:34.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:34.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:35 np0005603622 nova_compute[221301]: 2026-01-31 09:19:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:35 np0005603622 nova_compute[221301]: 2026-01-31 09:19:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.016 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:36.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.460 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.461 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.461 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.462 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.462 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:19:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:36.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:19:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1671356379' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:19:36 np0005603622 nova_compute[221301]: 2026-01-31 09:19:36.946 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.031 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.130 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.131 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4245MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.131 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.131 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.283 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.284 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.302 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:19:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:19:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4271766967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.724 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.731 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.758 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.803 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:19:37 np0005603622 nova_compute[221301]: 2026-01-31 09:19:37.804 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:19:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:38.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:38.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:40 np0005603622 ovn_controller[130109]: 2026-01-31T09:19:40Z|01036|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Jan 31 04:19:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:40.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:40.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:40 np0005603622 nova_compute[221301]: 2026-01-31 09:19:40.803 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:41 np0005603622 nova_compute[221301]: 2026-01-31 09:19:41.019 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:42 np0005603622 nova_compute[221301]: 2026-01-31 09:19:42.034 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:42.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:42 np0005603622 nova_compute[221301]: 2026-01-31 09:19:42.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:19:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:42.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:44.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:44.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:46 np0005603622 nova_compute[221301]: 2026-01-31 09:19:46.022 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:46.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:46 np0005603622 podman[311187]: 2026-01-31 09:19:46.349821119 +0000 UTC m=+0.063065319 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 31 04:19:46 np0005603622 podman[311187]: 2026-01-31 09:19:46.452580657 +0000 UTC m=+0.165824837 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 31 04:19:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:46.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:47 np0005603622 nova_compute[221301]: 2026-01-31 09:19:47.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:47 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:19:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:19:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:48.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:48.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:50.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #193. Immutable memtables: 0.
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.608559) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 123] Flushing memtable with next log file: 193
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190608643, "job": 123, "event": "flush_started", "num_memtables": 1, "num_entries": 2424, "num_deletes": 254, "total_data_size": 5803911, "memory_usage": 5888224, "flush_reason": "Manual Compaction"}
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 123] Level-0 flush table #194: started
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:50.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190780825, "cf_name": "default", "job": 123, "event": "table_file_creation", "file_number": 194, "file_size": 3784825, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 92934, "largest_seqno": 95353, "table_properties": {"data_size": 3774968, "index_size": 6286, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20537, "raw_average_key_size": 20, "raw_value_size": 3755142, "raw_average_value_size": 3777, "num_data_blocks": 274, "num_entries": 994, "num_filter_entries": 994, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769850982, "oldest_key_time": 1769850982, "file_creation_time": 1769851190, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 194, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 123] Flush lasted 172341 microseconds, and 10156 cpu microseconds.
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.780897) [db/flush_job.cc:967] [default] [JOB 123] Level-0 flush table #194: 3784825 bytes OK
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.780924) [db/memtable_list.cc:519] [default] Level-0 commit table #194 started
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.951775) [db/memtable_list.cc:722] [default] Level-0 commit table #194: memtable #1 done
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.951829) EVENT_LOG_v1 {"time_micros": 1769851190951816, "job": 123, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.951881) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 123] Try to delete WAL files size 5793325, prev total WAL file size 5793325, number of live WAL files 2.
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000190.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.954012) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038323833' seq:72057594037927935, type:22 .. '7061786F730038353335' seq:0, type:0; will stop at (end)
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 124] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 123 Base level 0, inputs: [194(3696KB)], [192(12MB)]
Jan 31 04:19:50 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851190954083, "job": 124, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [194], "files_L6": [192], "score": -1, "input_data_size": 17118959, "oldest_snapshot_seqno": -1}
Jan 31 04:19:51 np0005603622 nova_compute[221301]: 2026-01-31 09:19:51.023 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 124] Generated table #195: 11531 keys, 15134742 bytes, temperature: kUnknown
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851191143892, "cf_name": "default", "job": 124, "event": "table_file_creation", "file_number": 195, "file_size": 15134742, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15059871, "index_size": 44934, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 303547, "raw_average_key_size": 26, "raw_value_size": 14858357, "raw_average_value_size": 1288, "num_data_blocks": 1712, "num_entries": 11531, "num_filter_entries": 11531, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851190, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 195, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.144262) [db/compaction/compaction_job.cc:1663] [default] [JOB 124] Compacted 1@0 + 1@6 files to L6 => 15134742 bytes
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.147077) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.1 rd, 79.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 12.7 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(8.5) write-amplify(4.0) OK, records in: 12061, records dropped: 530 output_compression: NoCompression
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.147134) EVENT_LOG_v1 {"time_micros": 1769851191147111, "job": 124, "event": "compaction_finished", "compaction_time_micros": 189936, "compaction_time_cpu_micros": 53629, "output_level": 6, "num_output_files": 1, "total_output_size": 15134742, "num_input_records": 12061, "num_output_records": 11531, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000194.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851191147999, "job": 124, "event": "table_file_deletion", "file_number": 194}
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000192.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851191150202, "job": 124, "event": "table_file_deletion", "file_number": 192}
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:50.953933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.150248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.150256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.150259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.150262) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:51 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:51.150265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:52 np0005603622 nova_compute[221301]: 2026-01-31 09:19:52.039 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:19:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:52.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:19:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:52.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:52 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:19:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:19:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:54.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:19:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:54.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:55 np0005603622 podman[311494]: 2026-01-31 09:19:55.174472141 +0000 UTC m=+0.092590230 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:19:55 np0005603622 podman[311493]: 2026-01-31 09:19:55.177950083 +0000 UTC m=+0.097370537 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:19:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:19:56 np0005603622 nova_compute[221301]: 2026-01-31 09:19:56.027 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:56.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:56.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:57 np0005603622 nova_compute[221301]: 2026-01-31 09:19:57.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #196. Immutable memtables: 0.
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.177662) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 125] Flushing memtable with next log file: 196
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198177702, "job": 125, "event": "flush_started", "num_memtables": 1, "num_entries": 331, "num_deletes": 250, "total_data_size": 284081, "memory_usage": 291424, "flush_reason": "Manual Compaction"}
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 125] Level-0 flush table #197: started
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198182754, "cf_name": "default", "job": 125, "event": "table_file_creation", "file_number": 197, "file_size": 187260, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95358, "largest_seqno": 95684, "table_properties": {"data_size": 185088, "index_size": 335, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5833, "raw_average_key_size": 20, "raw_value_size": 180822, "raw_average_value_size": 634, "num_data_blocks": 14, "num_entries": 285, "num_filter_entries": 285, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851191, "oldest_key_time": 1769851191, "file_creation_time": 1769851198, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 197, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 125] Flush lasted 5172 microseconds, and 1688 cpu microseconds.
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.182832) [db/flush_job.cc:967] [default] [JOB 125] Level-0 flush table #197: 187260 bytes OK
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.182853) [db/memtable_list.cc:519] [default] Level-0 commit table #197 started
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.184441) [db/memtable_list.cc:722] [default] Level-0 commit table #197: memtable #1 done
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.184471) EVENT_LOG_v1 {"time_micros": 1769851198184461, "job": 125, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.184496) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 125] Try to delete WAL files size 281766, prev total WAL file size 281766, number of live WAL files 2.
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000193.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.185023) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033323630' seq:72057594037927935, type:22 .. '6D6772737461740033353131' seq:0, type:0; will stop at (end)
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 126] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 125 Base level 0, inputs: [197(182KB)], [195(14MB)]
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198185087, "job": 126, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [197], "files_L6": [195], "score": -1, "input_data_size": 15322002, "oldest_snapshot_seqno": -1}
Jan 31 04:19:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:19:58.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 126] Generated table #198: 11305 keys, 11486161 bytes, temperature: kUnknown
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198271220, "cf_name": "default", "job": 126, "event": "table_file_creation", "file_number": 198, "file_size": 11486161, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11417577, "index_size": 39223, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28293, "raw_key_size": 299027, "raw_average_key_size": 26, "raw_value_size": 11224695, "raw_average_value_size": 992, "num_data_blocks": 1473, "num_entries": 11305, "num_filter_entries": 11305, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851198, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 198, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.271569) [db/compaction/compaction_job.cc:1663] [default] [JOB 126] Compacted 1@0 + 1@6 files to L6 => 11486161 bytes
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.273403) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.6 rd, 133.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 14.4 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(143.2) write-amplify(61.3) OK, records in: 11816, records dropped: 511 output_compression: NoCompression
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.273434) EVENT_LOG_v1 {"time_micros": 1769851198273420, "job": 126, "event": "compaction_finished", "compaction_time_micros": 86260, "compaction_time_cpu_micros": 31692, "output_level": 6, "num_output_files": 1, "total_output_size": 11486161, "num_input_records": 11816, "num_output_records": 11305, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000197.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198273593, "job": 126, "event": "table_file_deletion", "file_number": 197}
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000195.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851198275840, "job": 126, "event": "table_file_deletion", "file_number": 195}
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.184933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.276067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.276076) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.276079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.276081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:19:58.276083) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:19:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:19:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:19:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:19:58.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:00.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:00.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 04:20:01 np0005603622 nova_compute[221301]: 2026-01-31 09:20:01.030 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:02 np0005603622 nova_compute[221301]: 2026-01-31 09:20:02.045 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:02.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:02.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:04.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:04.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:06 np0005603622 nova_compute[221301]: 2026-01-31 09:20:06.031 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:06.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:06.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:07 np0005603622 nova_compute[221301]: 2026-01-31 09:20:07.048 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:07 np0005603622 systemd-logind[817]: New session 60 of user zuul.
Jan 31 04:20:07 np0005603622 systemd[1]: Started Session 60 of User zuul.
Jan 31 04:20:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:08.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:08.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 31 04:20:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:10.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 31 04:20:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:10.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:11 np0005603622 nova_compute[221301]: 2026-01-31 09:20:11.034 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 31 04:20:11 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3356599047' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 31 04:20:12 np0005603622 nova_compute[221301]: 2026-01-31 09:20:12.051 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:12.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:13 np0005603622 nova_compute[221301]: 2026-01-31 09:20:13.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:14.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:14.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:15 np0005603622 ovs-vsctl[311829]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 31 04:20:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:15 np0005603622 virtqemud[220925]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 31 04:20:15 np0005603622 virtqemud[220925]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 31 04:20:15 np0005603622 virtqemud[220925]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 04:20:16 np0005603622 nova_compute[221301]: 2026-01-31 09:20:16.036 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:16.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:16 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: cache status {prefix=cache status} (starting...)
Jan 31 04:20:16 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:16 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: client ls {prefix=client ls} (starting...)
Jan 31 04:20:16 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:16 np0005603622 lvm[312144]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 04:20:16 np0005603622 lvm[312144]: VG ceph_vg0 finished
Jan 31 04:20:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:16.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:17 np0005603622 nova_compute[221301]: 2026-01-31 09:20:17.055 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: damage ls {prefix=damage ls} (starting...)
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 31 04:20:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2151453367' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump loads {prefix=dump loads} (starting...)
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:17 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 31 04:20:17 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1625226748' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 31 04:20:17 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 31 04:20:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3723937098' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 31 04:20:18 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 31 04:20:18 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:18.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:18 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 31 04:20:18 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:18 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: ops {prefix=ops} (starting...)
Jan 31 04:20:18 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:18.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 31 04:20:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/885574374' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 31 04:20:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 31 04:20:18 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3020956944' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 31 04:20:19 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: session ls {prefix=session ls} (starting...)
Jan 31 04:20:19 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:20:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:20:19 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1921483486' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:20:19 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: status {prefix=status} (starting...)
Jan 31 04:20:19 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:20:19 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1338677843' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2446859050' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2659623158' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:20:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:20.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3844870915' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2289310724' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 31 04:20:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:20.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:21 np0005603622 nova_compute[221301]: 2026-01-31 09:20:21.039 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 31 04:20:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1638361948' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 31 04:20:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:20:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3644117102' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:20:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 31 04:20:21 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1130816754' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 31 04:20:22 np0005603622 nova_compute[221301]: 2026-01-31 09:20:22.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 31 04:20:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4167957998' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 31 04:20:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:22.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2c1b000/0x0/0x1bfc00000, data 0x355ae89/0x3743000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27993c00 session 0x55bb295ba3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4358445 data_alloc: 218103808 data_used: 24469504
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2b8dcc00 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2c1b000/0x0/0x1bfc00000, data 0x355ae89/0x3743000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d400 session 0x55bb29305e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb2a23f860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.275858879s of 11.995030403s, submitted: 46
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27675400 session 0x55bb296bb680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4357853 data_alloc: 218103808 data_used: 24469504
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2c1b000/0x0/0x1bfc00000, data 0x355ae89/0x3743000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb2850a960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27902400 session 0x55bb29350d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d400 session 0x55bb2968e960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb29e68d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4357853 data_alloc: 218103808 data_used: 24469504
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415481856 unmapped: 78938112 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2c19000/0x0/0x1bfc00000, data 0x355aec2/0x3745000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb292d9860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415907840 unmapped: 78512128 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27993c00 session 0x55bb29577c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb295d9000 session 0x55bb295b43c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d400 session 0x55bb295ec3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb29304000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27675400 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4328688 data_alloc: 218103808 data_used: 23715840
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a30bf000/0x0/0x1bfc00000, data 0x30b4efb/0x329f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4328688 data_alloc: 218103808 data_used: 23715840
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a30bf000/0x0/0x1bfc00000, data 0x30b4efb/0x329f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.698860168s of 18.062194824s, submitted: 134
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb2a066f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415809536 unmapped: 78610432 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415965184 unmapped: 78454784 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27993c00 session 0x55bb295ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415973376 unmapped: 78446592 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4333463 data_alloc: 218103808 data_used: 23724032
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a309a000/0x0/0x1bfc00000, data 0x30d8f5d/0x32c4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb295ba3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415973376 unmapped: 78446592 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415973376 unmapped: 78446592 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415973376 unmapped: 78446592 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a309a000/0x0/0x1bfc00000, data 0x30d8f5d/0x32c4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415973376 unmapped: 78446592 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb27a17860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 415989760 unmapped: 78430208 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4372989 data_alloc: 234881024 data_used: 29011968
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 417046528 unmapped: 77373440 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 417120256 unmapped: 77299712 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.661831856s of 10.058757782s, submitted: 158
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a3099000/0x0/0x1bfc00000, data 0x30d8fbf/0x32c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1989f9c6), peers [0,2] op hist [0,0,1,0,0,2])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 417169408 unmapped: 77250560 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 417202176 unmapped: 77217792 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2c89000/0x0/0x1bfc00000, data 0x30d8fbf/0x32c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 417202176 unmapped: 77217792 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4373293 data_alloc: 234881024 data_used: 29024256
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 417906688 unmapped: 76513280 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb2747e1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2c7ab000 session 0x55bb27473860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2ccf1c00 session 0x55bb297343c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb2747e000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421093376 unmapped: 73326592 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb28062b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2e506000 session 0x55bb296af2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421789696 unmapped: 72630272 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421789696 unmapped: 72630272 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb29274b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421789696 unmapped: 72630272 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4515256 data_alloc: 234881024 data_used: 30412800
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c45000/0x0/0x1bfc00000, data 0x411cfbf/0x4309000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2c7ab000 session 0x55bb294f9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb2a066d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb2a1610e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c46000/0x0/0x1bfc00000, data 0x411cf5d/0x4308000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4514544 data_alloc: 234881024 data_used: 30412800
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c46000/0x0/0x1bfc00000, data 0x411cf5d/0x4308000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421675008 unmapped: 72744960 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421675008 unmapped: 72744960 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb29616780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.595837593s of 16.267459869s, submitted: 259
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2e506000 session 0x55bb27a145a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421691392 unmapped: 72728576 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4520000 data_alloc: 234881024 data_used: 30416896
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421355520 unmapped: 73064448 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c44000/0x0/0x1bfc00000, data 0x411cf90/0x430a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c44000/0x0/0x1bfc00000, data 0x411cf90/0x430a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c44000/0x0/0x1bfc00000, data 0x411cf90/0x430a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4571360 data_alloc: 234881024 data_used: 37490688
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c44000/0x0/0x1bfc00000, data 0x411cf90/0x430a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1c44000/0x0/0x1bfc00000, data 0x411cf90/0x430a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19caf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4571360 data_alloc: 234881024 data_used: 37490688
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 421158912 unmapped: 73261056 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.997329712s of 13.038918495s, submitted: 11
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 424312832 unmapped: 70107136 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 424804352 unmapped: 69615616 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a24f3000/0x0/0x1bfc00000, data 0x489ff90/0x4a8d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a246b000/0x0/0x1bfc00000, data 0x492df90/0x4b1b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425287680 unmapped: 69132288 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425287680 unmapped: 69132288 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4640718 data_alloc: 234881024 data_used: 38727680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425287680 unmapped: 69132288 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425287680 unmapped: 69132288 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425361408 unmapped: 69058560 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a246b000/0x0/0x1bfc00000, data 0x492df90/0x4b1b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425295872 unmapped: 69124096 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425295872 unmapped: 69124096 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4635854 data_alloc: 234881024 data_used: 38731776
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2452000/0x0/0x1bfc00000, data 0x494df90/0x4b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425295872 unmapped: 69124096 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425295872 unmapped: 69124096 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.621104240s of 11.092043877s, submitted: 123
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2452000/0x0/0x1bfc00000, data 0x494df90/0x4b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425312256 unmapped: 69107712 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425312256 unmapped: 69107712 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27882c00 session 0x55bb2a1605a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb279d3000 session 0x55bb27a15c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425312256 unmapped: 69107712 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4640818 data_alloc: 234881024 data_used: 38727680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425312256 unmapped: 69107712 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425312256 unmapped: 69107712 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a243f000/0x0/0x1bfc00000, data 0x4961f90/0x4b4f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425320448 unmapped: 69099520 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425320448 unmapped: 69099520 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425320448 unmapped: 69099520 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4640818 data_alloc: 234881024 data_used: 38727680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb2a0661e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb293510e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb2861a5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2e506000 session 0x55bb29614b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425336832 unmapped: 69083136 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2090000/0x0/0x1bfc00000, data 0x4d10f90/0x4efe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [0,0,0,0,2])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb27a141e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb2a23e5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb279d3000 session 0x55bb296bb4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425771008 unmapped: 68648960 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb2970c1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2e506800 session 0x55bb295bb2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425836544 unmapped: 68583424 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425836544 unmapped: 68583424 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2090000/0x0/0x1bfc00000, data 0x4d10f90/0x4efe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425844736 unmapped: 68575232 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4677755 data_alloc: 234881024 data_used: 38727680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425844736 unmapped: 68575232 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425844736 unmapped: 68575232 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2090000/0x0/0x1bfc00000, data 0x4d10f90/0x4efe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425852928 unmapped: 68567040 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb295ec960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425869312 unmapped: 68550656 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 425869312 unmapped: 68550656 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.302047729s of 17.631980896s, submitted: 56
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb295ba780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb29b7a000 session 0x55bb2a160f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27992000 session 0x55bb29e69c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2cd66c00 session 0x55bb294de960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4678207 data_alloc: 234881024 data_used: 38735872
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a2090000/0x0/0x1bfc00000, data 0x4d10f90/0x4efe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb296ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27992000 session 0x55bb296163c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb29b7a000 session 0x55bb294db4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb2968fa40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb29e5a800 session 0x55bb294ded20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4708361 data_alloc: 234881024 data_used: 38936576
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb29e5a800 session 0x55bb282caf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1dbc000/0x0/0x1bfc00000, data 0x4fe4f90/0x51d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb2780d800 session 0x55bb29377680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1dbc000/0x0/0x1bfc00000, data 0x4fe4f90/0x51d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27992000 session 0x55bb2a067a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb29b7a000 session 0x55bb29536f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426016768 unmapped: 68403200 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426188800 unmapped: 68231168 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4736347 data_alloc: 234881024 data_used: 42512384
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 426196992 unmapped: 68222976 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.856282234s of 10.944790840s, submitted: 11
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427253760 unmapped: 67166208 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1dbb000/0x0/0x1bfc00000, data 0x4fe4fa0/0x51d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427261952 unmapped: 67158016 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427261952 unmapped: 67158016 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb29b7fc00 session 0x55bb28062b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427261952 unmapped: 67158016 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4754659 data_alloc: 234881024 data_used: 42987520
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427261952 unmapped: 67158016 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427917312 unmapped: 66502656 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1dbb000/0x0/0x1bfc00000, data 0x4fe4fa0/0x51d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427917312 unmapped: 66502656 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427917312 unmapped: 66502656 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 heartbeat osd_stat(store_statfs(0x1a1dbb000/0x0/0x1bfc00000, data 0x4fe4fa0/0x51d3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb27883800 session 0x55bb292d9a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb279d3000 session 0x55bb29616780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 427917312 unmapped: 66502656 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4809093 data_alloc: 251658240 data_used: 45756416
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.272718430s of 10.019602776s, submitted: 108
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 434372608 unmapped: 60047360 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 434970624 unmapped: 59449344 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 ms_handle_reset con 0x55bb29b7a000 session 0x55bb2a1605a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435593216 unmapped: 58826752 heap: 494419968 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435675136 unmapped: 67141632 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 handle_osd_map epochs [373,374], i have 373, src has [1,374]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 373 handle_osd_map epochs [374,374], i have 374, src has [1,374]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7fc00 session 0x55bb29e692c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436469760 unmapped: 66347008 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4760697 data_alloc: 234881024 data_used: 38866944
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a1684000/0x0/0x1bfc00000, data 0x571cb64/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436469760 unmapped: 66347008 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a1684000/0x0/0x1bfc00000, data 0x571cb64/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438853632 unmapped: 63963136 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5a800 session 0x55bb27a154a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27883800 session 0x55bb27a1a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438853632 unmapped: 63963136 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb279d3000 session 0x55bb295ec3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437919744 unmapped: 64897024 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7a000 session 0x55bb295b43c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438067200 unmapped: 64749568 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7fc00 session 0x55bb29304000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4855179 data_alloc: 234881024 data_used: 38580224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb292e6400 session 0x55bb27a22f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.569067001s of 10.043709755s, submitted: 220
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438067200 unmapped: 64749568 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27883800 session 0x55bb282770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a0be7000/0x0/0x1bfc00000, data 0x61b7be6/0x63a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438067200 unmapped: 64749568 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a0be7000/0x0/0x1bfc00000, data 0x61b7be6/0x63a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438075392 unmapped: 64741376 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438075392 unmapped: 64741376 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438075392 unmapped: 64741376 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4857387 data_alloc: 234881024 data_used: 38580224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a0bc1000/0x0/0x1bfc00000, data 0x61ddbe6/0x63cd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 438091776 unmapped: 64724992 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439140352 unmapped: 63676416 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7a000 session 0x55bb27a1a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7fc00 session 0x55bb2a1605a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 443514880 unmapped: 59301888 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a0bb4000/0x0/0x1bfc00000, data 0x61eabe6/0x63da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 443514880 unmapped: 59301888 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb292f2960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b800 session 0x55bb2a23f680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb279d3000 session 0x55bb293045a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb297343c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb2a1603c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27883800 session 0x55bb294daf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436510720 unmapped: 66306048 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4722289 data_alloc: 234881024 data_used: 36401152
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436510720 unmapped: 66306048 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.117231369s of 10.290706635s, submitted: 60
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7a000 session 0x55bb293774a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb296ae960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439001088 unmapped: 63815680 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27883800 session 0x55bb295bb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436961280 unmapped: 65855488 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a1922000/0x0/0x1bfc00000, data 0x547cbe6/0x566c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436961280 unmapped: 65855488 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436961280 unmapped: 65855488 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4722289 data_alloc: 234881024 data_used: 36401152
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a1922000/0x0/0x1bfc00000, data 0x547cbe6/0x566c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [0,0,0,0,2])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb292f2d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436961280 unmapped: 65855488 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb279d3000 session 0x55bb2a0661e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7fc00 session 0x55bb29e683c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb29e69e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 447643648 unmapped: 55173120 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27883800 session 0x55bb292741e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb292f3860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439517184 unmapped: 63299584 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439517184 unmapped: 63299584 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a0c28000/0x0/0x1bfc00000, data 0x6174c58/0x6366000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb279d3000 session 0x55bb29e68780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2a1edc00 session 0x55bb27472d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27675400 session 0x55bb2747dc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d400 session 0x55bb294db0e0
Jan 31 04:20:22 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2057637489' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb29304780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439541760 unmapped: 63275008 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb279d3000 session 0x55bb29615a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4657132 data_alloc: 234881024 data_used: 29708288
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27883800 session 0x55bb27a1a1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb27a1a5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436420608 unmapped: 66396160 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.196194649s of 10.311293602s, submitted: 121
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436273152 unmapped: 66543616 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a0bd4000/0x0/0x1bfc00000, data 0x61c8c48/0x63b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436273152 unmapped: 66543616 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436273152 unmapped: 66543616 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb2850d860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a0bd4000/0x0/0x1bfc00000, data 0x61c8c48/0x63b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436207616 unmapped: 66609152 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4831109 data_alloc: 234881024 data_used: 29872128
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb279d3000 session 0x55bb294de1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b69400 session 0x55bb29275860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436207616 unmapped: 66609152 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436215808 unmapped: 66600960 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a05af000/0x0/0x1bfc00000, data 0x67efbe6/0x69df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436215808 unmapped: 66600960 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x1a05af000/0x0/0x1bfc00000, data 0x67efbe6/0x69df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x18c6f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436215808 unmapped: 66600960 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436215808 unmapped: 66600960 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4827484 data_alloc: 234881024 data_used: 29872128
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436215808 unmapped: 66600960 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.057785988s of 10.570034981s, submitted: 38
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436224000 unmapped: 66592768 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19f40f000/0x0/0x1bfc00000, data 0x67efbe6/0x69df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436240384 unmapped: 66576384 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436240384 unmapped: 66576384 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436248576 unmapped: 66568192 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4850728 data_alloc: 234881024 data_used: 31330304
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b69400 session 0x55bb295bbe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436248576 unmapped: 66568192 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d400 session 0x55bb293774a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436248576 unmapped: 66568192 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19f40f000/0x0/0x1bfc00000, data 0x67efbe6/0x69df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436248576 unmapped: 66568192 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb282770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb27a1a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436248576 unmapped: 66568192 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2b96a000 session 0x55bb29e69e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436248576 unmapped: 66568192 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb2a1610e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4859503 data_alloc: 234881024 data_used: 32546816
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d400 session 0x55bb29617e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436256768 unmapped: 66560000 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb292f3860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb29304780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19f40e000/0x0/0x1bfc00000, data 0x67efbf6/0x69e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb294dc000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb293054a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb2850c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb295770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.968399048s of 10.076539040s, submitted: 47
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436273152 unmapped: 66543616 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb295ed4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d8c00 session 0x55bb295b4f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb29614f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb2a160d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb2850a1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436281344 unmapped: 66535424 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27675400 session 0x55bb29351c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441597952 unmapped: 61218816 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19eee2000/0x0/0x1bfc00000, data 0x6d1ac06/0x6f0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441597952 unmapped: 61218816 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5093354 data_alloc: 251658240 data_used: 54112256
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441597952 unmapped: 61218816 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441597952 unmapped: 61218816 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441606144 unmapped: 61210624 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441614336 unmapped: 61202432 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19eee2000/0x0/0x1bfc00000, data 0x6d1ac06/0x6f0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441614336 unmapped: 61202432 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5093354 data_alloc: 251658240 data_used: 54112256
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb29537680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb2850a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441614336 unmapped: 61202432 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb295bb0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb29e68d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 443940864 unmapped: 58875904 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.778660774s of 10.187301636s, submitted: 54
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19e5d4000/0x0/0x1bfc00000, data 0x7620c06/0x7812000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2b8da400 session 0x55bb27473c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 443949056 unmapped: 58867712 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2b8da400 session 0x55bb27472d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 448217088 unmapped: 54599680 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 450715648 unmapped: 52101120 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5345354 data_alloc: 268435456 data_used: 61140992
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 48848896 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455278592 unmapped: 47538176 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19d6ce000/0x0/0x1bfc00000, data 0x852dc06/0x871f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455278592 unmapped: 47538176 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455286784 unmapped: 47529984 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455155712 unmapped: 47661056 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5377630 data_alloc: 268435456 data_used: 66424832
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455155712 unmapped: 47661056 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455155712 unmapped: 47661056 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19d6ab000/0x0/0x1bfc00000, data 0x8551c06/0x8743000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455155712 unmapped: 47661056 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.962745667s of 11.495304108s, submitted: 152
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19d6ab000/0x0/0x1bfc00000, data 0x8551c06/0x8743000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455188480 unmapped: 47628288 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456613888 unmapped: 46202880 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5453662 data_alloc: 268435456 data_used: 66465792
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 458620928 unmapped: 44195840 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b72d000/0x0/0x1bfc00000, data 0x932fc06/0x9521000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b72d000/0x0/0x1bfc00000, data 0x932fc06/0x9521000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460242944 unmapped: 42573824 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460955648 unmapped: 41861120 heap: 502816768 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb294da960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ab64800 session 0x55bb2a23f0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2b96ac00 session 0x55bb27a14960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469336064 unmapped: 40837120 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb282774a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ab64800 session 0x55bb27a1a1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2b8da400 session 0x55bb2a23f680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb29536780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb292e5400 session 0x55bb2968fe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19a3f7000/0x0/0x1bfc00000, data 0xa65dc06/0xa84f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461152256 unmapped: 49020928 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5666580 data_alloc: 268435456 data_used: 69623808
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461275136 unmapped: 48898048 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461283328 unmapped: 48889856 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29734b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b80000 session 0x55bb27a15e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb27a16000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461291520 unmapped: 48881664 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19a396000/0x0/0x1bfc00000, data 0xa6c6c06/0xa8b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,2])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.123270988s of 10.113728523s, submitted: 222
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461291520 unmapped: 48881664 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb2850a000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462503936 unmapped: 47669248 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b704000/0x0/0x1bfc00000, data 0x9359bf6/0x954a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5460011 data_alloc: 268435456 data_used: 62349312
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ab64800 session 0x55bb29616f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b704000/0x0/0x1bfc00000, data 0x9359bf6/0x954a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462520320 unmapped: 47652864 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb295b4960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb295ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462528512 unmapped: 47644672 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb2850da40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb29e681e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b80000 session 0x55bb295bb0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462528512 unmapped: 47644672 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b700000/0x0/0x1bfc00000, data 0x935cbe6/0x954c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462528512 unmapped: 47644672 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb297350e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb2a066d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb27a17c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b700000/0x0/0x1bfc00000, data 0x935cbe6/0x954c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462528512 unmapped: 47644672 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5459683 data_alloc: 268435456 data_used: 62353408
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462528512 unmapped: 47644672 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468680704 unmapped: 41492480 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb274730e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468680704 unmapped: 41492480 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468680704 unmapped: 41492480 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b6f8000/0x0/0x1bfc00000, data 0x9366be6/0x9556000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b6f8000/0x0/0x1bfc00000, data 0x9366be6/0x9556000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468680704 unmapped: 41492480 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.276803017s of 11.630755424s, submitted: 48
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5541491 data_alloc: 285212672 data_used: 73699328
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468680704 unmapped: 41492480 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468680704 unmapped: 41492480 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b6f5000/0x0/0x1bfc00000, data 0x9369be6/0x9559000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 41459712 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b6f5000/0x0/0x1bfc00000, data 0x9369be6/0x9559000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 41459712 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 41459712 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5541919 data_alloc: 285212672 data_used: 73699328
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 41459712 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ab3c000/0x0/0x1bfc00000, data 0x9f22be6/0xa112000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472719360 unmapped: 37453824 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2b8da400 session 0x55bb295eda40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb295b5680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473071616 unmapped: 37101568 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb2850d0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470056960 unmapped: 40116224 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470056960 unmapped: 40116224 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5479255 data_alloc: 268435456 data_used: 68755456
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470056960 unmapped: 40116224 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b87a000/0x0/0x1bfc00000, data 0x91e5bd6/0x93d4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470056960 unmapped: 40116224 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470056960 unmapped: 40116224 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470065152 unmapped: 40108032 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.251715660s of 14.585317612s, submitted: 120
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470065152 unmapped: 40108032 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5479959 data_alloc: 268435456 data_used: 68755456
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2a161a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb2a066b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27873c00 session 0x55bb297345a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb295b5a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2a066960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471277568 unmapped: 38895616 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad58000/0x0/0x1bfc00000, data 0x9d07bd6/0x9ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad58000/0x0/0x1bfc00000, data 0x9d07bd6/0x9ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471277568 unmapped: 38895616 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471277568 unmapped: 38895616 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471277568 unmapped: 38895616 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471277568 unmapped: 38895616 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5566221 data_alloc: 268435456 data_used: 68755456
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471285760 unmapped: 38887424 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad58000/0x0/0x1bfc00000, data 0x9d07bd6/0x9ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471285760 unmapped: 38887424 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471285760 unmapped: 38887424 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471285760 unmapped: 38887424 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad58000/0x0/0x1bfc00000, data 0x9d07bd6/0x9ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.870006561s of 10.001375198s, submitted: 23
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb2968e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471285760 unmapped: 38887424 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5566818 data_alloc: 268435456 data_used: 68755456
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad57000/0x0/0x1bfc00000, data 0x9d07bf9/0x9ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471285760 unmapped: 38887424 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471293952 unmapped: 38879232 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad57000/0x0/0x1bfc00000, data 0x9d07bf9/0x9ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471293952 unmapped: 38879232 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad57000/0x0/0x1bfc00000, data 0x9d07bf9/0x9ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5648898 data_alloc: 285212672 data_used: 80330752
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad57000/0x0/0x1bfc00000, data 0x9d07bf9/0x9ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad57000/0x0/0x1bfc00000, data 0x9d07bf9/0x9ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5648898 data_alloc: 285212672 data_used: 80330752
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ad57000/0x0/0x1bfc00000, data 0x9d07bf9/0x9ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476594176 unmapped: 33579008 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.434972763s of 13.434974670s, submitted: 0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479027200 unmapped: 31145984 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19a77e000/0x0/0x1bfc00000, data 0xa2e0bf9/0xa4d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479559680 unmapped: 30613504 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19a33e000/0x0/0x1bfc00000, data 0xa310bf9/0xa500000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5714081 data_alloc: 285212672 data_used: 81842176
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb296aeb40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479592448 unmapped: 30580736 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479592448 unmapped: 30580736 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb28034c00 session 0x55bb2a23f860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 29532160 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19a13d000/0x0/0x1bfc00000, data 0xa511bf9/0xa701000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480673792 unmapped: 29499392 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb295ec000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480673792 unmapped: 29499392 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5721933 data_alloc: 285212672 data_used: 81842176
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb27a23c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb29351c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480714752 unmapped: 29458432 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb2747c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb28034c00 session 0x55bb293043c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19a12f000/0x0/0x1bfc00000, data 0xa51fd97/0xa70f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb29351c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476971008 unmapped: 33202176 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476971008 unmapped: 33202176 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.844069481s of 10.619462967s, submitted: 142
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476979200 unmapped: 33193984 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476979200 unmapped: 33193984 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b9da000/0x0/0x1bfc00000, data 0x8c76b87/0x8e64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5457287 data_alloc: 268435456 data_used: 69111808
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476979200 unmapped: 33193984 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb2968e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476987392 unmapped: 33185792 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476995584 unmapped: 33177600 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476995584 unmapped: 33177600 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b9d9000/0x0/0x1bfc00000, data 0x8c76be9/0x8e65000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476995584 unmapped: 33177600 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5458541 data_alloc: 268435456 data_used: 69111808
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476995584 unmapped: 33177600 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb295b5a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477011968 unmapped: 33161216 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477020160 unmapped: 33153024 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b9d7000/0x0/0x1bfc00000, data 0x8c76c5b/0x8e67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477020160 unmapped: 33153024 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.279641151s of 11.006599426s, submitted: 10
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477020160 unmapped: 33153024 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5463229 data_alloc: 268435456 data_used: 69111808
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477028352 unmapped: 33144832 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477028352 unmapped: 33144832 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477028352 unmapped: 33144832 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b9d7000/0x0/0x1bfc00000, data 0x8c76c5b/0x8e67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477028352 unmapped: 33144832 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477028352 unmapped: 33144832 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5463521 data_alloc: 268435456 data_used: 69115904
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 33136640 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb295eda40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 33136640 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2cf43000 session 0x55bb295bb0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb29e681e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb2850da40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb295ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477044736 unmapped: 33128448 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb295b4960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb2968fe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb295b50e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb27473860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2a21f000 session 0x55bb294dc000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb296bbc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb295d7400 session 0x55bb295bbe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468058112 unmapped: 42115072 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb295edc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b501000/0x0/0x1bfc00000, data 0x914bc6b/0x933d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468058112 unmapped: 42115072 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5288852 data_alloc: 268435456 data_used: 56049664
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19c618000/0x0/0x1bfc00000, data 0x8034c48/0x8225000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19c618000/0x0/0x1bfc00000, data 0x8034c48/0x8225000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468058112 unmapped: 42115072 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468058112 unmapped: 42115072 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468058112 unmapped: 42115072 heap: 510173184 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.426679611s of 14.143112183s, submitted: 47
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476372992 unmapped: 37478400 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb27a22780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2a21f000 session 0x55bb294df2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb2a1614a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2a23e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29e5b000 session 0x55bb293054a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468246528 unmapped: 45604864 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5349666 data_alloc: 268435456 data_used: 56053760
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468246528 unmapped: 45604864 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb2747c000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19be35000/0x0/0x1bfc00000, data 0x8818c48/0x8a09000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468377600 unmapped: 45473792 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2a21f000 session 0x55bb27a1a5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb296152c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb295b41e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb2850b4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468377600 unmapped: 45473792 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb295ba3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb29614f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ccf0400 session 0x55bb296afc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468402176 unmapped: 45449216 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb295b5e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468426752 unmapped: 45424640 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ccf0400 session 0x55bb295bb4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb294dbe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5446797 data_alloc: 268435456 data_used: 61054976
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb294da000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ab64c00 session 0x55bb29274960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb2a161680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb26cb8f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ccf0400 session 0x55bb27a16b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468426752 unmapped: 45424640 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2d87c000 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb29e683c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb29351680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb27a223c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29bf5000 session 0x55bb2747c000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb29274b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb294da3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb292d92c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19b61c000/0x0/0x1bfc00000, data 0x9031d97/0x9221000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 45268992 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb2a066b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19adf4000/0x0/0x1bfc00000, data 0x9857db7/0x9a49000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468803584 unmapped: 45047808 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb2a23e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb2a1614a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.050930977s of 10.061657906s, submitted: 129
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2d87c000 session 0x55bb27473860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 44834816 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 44834816 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb295eda40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5549801 data_alloc: 268435456 data_used: 66043904
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19adf5000/0x0/0x1bfc00000, data 0x9857db7/0x9a49000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ccf0400 session 0x55bb295edc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb2968e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469164032 unmapped: 44687360 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469164032 unmapped: 44687360 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469164032 unmapped: 44687360 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469622784 unmapped: 44228608 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473669632 unmapped: 40181760 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb36b66c00 session 0x55bb26cb83c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5673791 data_alloc: 268435456 data_used: 72937472
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19a59c000/0x0/0x1bfc00000, data 0xa0b1bb7/0xa2a2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2d87c000 session 0x55bb27a161e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2fd81400 session 0x55bb29350000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473751552 unmapped: 40099840 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476610560 unmapped: 37240832 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2780d800 session 0x55bb27a152c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475250688 unmapped: 38600704 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19ab70000/0x0/0x1bfc00000, data 0x9adcbda/0x9cce000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.996892929s of 10.106891632s, submitted: 110
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475496448 unmapped: 38354944 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 heartbeat osd_stat(store_statfs(0x19aae7000/0x0/0x1bfc00000, data 0x9b65bda/0x9d57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475521024 unmapped: 38330368 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5660308 data_alloc: 285212672 data_used: 74424320
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb27992000 session 0x55bb2a161c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29bf5000 session 0x55bb2747fc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475529216 unmapped: 38322176 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb2ccf0400 session 0x55bb2850cb40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 ms_handle_reset con 0x55bb29b7c800 session 0x55bb274734a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472465408 unmapped: 41385984 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 374 handle_osd_map epochs [374,375], i have 374, src has [1,375]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 375 ms_handle_reset con 0x55bb2780d800 session 0x55bb294df2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 41369600 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 375 heartbeat osd_stat(store_statfs(0x19b27c000/0x0/0x1bfc00000, data 0x92b5885/0x94a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 375 ms_handle_reset con 0x55bb27992000 session 0x55bb27a1b860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 41369600 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 375 handle_osd_map epochs [375,376], i have 375, src has [1,376]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb29bf5000 session 0x55bb2747f2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb29e5b000 session 0x55bb296170e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb2a21f000 session 0x55bb282cab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 heartbeat osd_stat(store_statfs(0x19b392000/0x0/0x1bfc00000, data 0x92b7532/0x94ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 41361408 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5396438 data_alloc: 268435456 data_used: 60923904
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb2780d800 session 0x55bb27a1a960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 467853312 unmapped: 45998080 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb27992000 session 0x55bb294df2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469377024 unmapped: 44474368 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb295d9800 session 0x55bb296174a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb2e506c00 session 0x55bb293505a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb2780d800 session 0x55bb27a152c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469385216 unmapped: 44466176 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 heartbeat osd_stat(store_statfs(0x19bdf9000/0x0/0x1bfc00000, data 0x885449d/0x8a45000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469385216 unmapped: 44466176 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469385216 unmapped: 44466176 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5435218 data_alloc: 268435456 data_used: 61128704
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469385216 unmapped: 44466176 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 heartbeat osd_stat(store_statfs(0x19bdef000/0x0/0x1bfc00000, data 0x885e49d/0x8a4f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.795957565s of 13.373539925s, submitted: 146
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb36b69400 session 0x55bb2968e960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb2780d400 session 0x55bb295ed680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469385216 unmapped: 44466176 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469196800 unmapped: 44654592 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 ms_handle_reset con 0x55bb27992000 session 0x55bb2968e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469196800 unmapped: 44654592 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469196800 unmapped: 44654592 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 handle_osd_map epochs [376,377], i have 376, src has [1,377]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 376 handle_osd_map epochs [377,377], i have 377, src has [1,377]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5436028 data_alloc: 268435456 data_used: 61136896
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 heartbeat osd_stat(store_statfs(0x19bdef000/0x0/0x1bfc00000, data 0x885e49d/0x8a4f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469204992 unmapped: 44646400 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469204992 unmapped: 44646400 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2968d400 session 0x55bb28063a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb29275a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 heartbeat osd_stat(store_statfs(0x19bdeb000/0x0/0x1bfc00000, data 0x885ffdc/0x8a52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb279d3000 session 0x55bb2a0661e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2b96a000 session 0x55bb2a067860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2780d800 session 0x55bb2850da40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5278493 data_alloc: 251658240 data_used: 52523008
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 heartbeat osd_stat(store_statfs(0x19cac4000/0x0/0x1bfc00000, data 0x7b89f99/0x7d79000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 heartbeat osd_stat(store_statfs(0x19cac4000/0x0/0x1bfc00000, data 0x7b89f99/0x7d79000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5278493 data_alloc: 251658240 data_used: 52523008
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.294272423s of 13.430481911s, submitted: 78
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a165a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 51208192 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 heartbeat osd_stat(store_statfs(0x19cac5000/0x0/0x1bfc00000, data 0x7b89f99/0x7d79000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461127680 unmapped: 52723712 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2780d800 session 0x55bb27a16000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461127680 unmapped: 52723712 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb279d3000 session 0x55bb27472780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461127680 unmapped: 52723712 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb26cb9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 446881792 unmapped: 66969600 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4739695 data_alloc: 234881024 data_used: 30351360
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 heartbeat osd_stat(store_statfs(0x19fdb6000/0x0/0x1bfc00000, data 0x4899f89/0x4a88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 446889984 unmapped: 66961408 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 ms_handle_reset con 0x55bb2b96a000 session 0x55bb29735860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 74170368 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 74170368 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 74170368 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 74170368 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4480579 data_alloc: 218103808 data_used: 17907712
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 heartbeat osd_stat(store_statfs(0x1a13c8000/0x0/0x1bfc00000, data 0x3285f27/0x3473000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 74170368 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.249233246s of 10.791379929s, submitted: 80
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 377 handle_osd_map epochs [377,378], i have 377, src has [1,378]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a0670e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436011008 unmapped: 77840384 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2780d800 session 0x55bb296af2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb279d3000 session 0x55bb292f23c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb293241e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb27992000 session 0x55bb27472d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2780d400 session 0x55bb296bbc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2780d800 session 0x55bb295ec3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb279d3000 session 0x55bb2850b4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb2850d4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2968d400 session 0x55bb2850cd20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436142080 unmapped: 77709312 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 heartbeat osd_stat(store_statfs(0x1a15cf000/0x0/0x1bfc00000, data 0x307fbd4/0x326e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436142080 unmapped: 77709312 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436142080 unmapped: 77709312 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4455333 data_alloc: 218103808 data_used: 14311424
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436142080 unmapped: 77709312 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436142080 unmapped: 77709312 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436142080 unmapped: 77709312 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2780d400 session 0x55bb292752c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 heartbeat osd_stat(store_statfs(0x1a15cf000/0x0/0x1bfc00000, data 0x307fbd4/0x326e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436142080 unmapped: 77709312 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 ms_handle_reset con 0x55bb2780d800 session 0x55bb292745a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 378 handle_osd_map epochs [378,379], i have 378, src has [1,379]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436076544 unmapped: 77774848 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb279d3000 session 0x55bb292750e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4461751 data_alloc: 218103808 data_used: 14311424
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb274734a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436387840 unmapped: 77463552 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a15a7000/0x0/0x1bfc00000, data 0x30a5746/0x3297000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436387840 unmapped: 77463552 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436387840 unmapped: 77463552 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a15a7000/0x0/0x1bfc00000, data 0x30a5746/0x3297000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436387840 unmapped: 77463552 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436387840 unmapped: 77463552 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4463834 data_alloc: 218103808 data_used: 14311424
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436387840 unmapped: 77463552 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a15a7000/0x0/0x1bfc00000, data 0x30a5746/0x3297000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436387840 unmapped: 77463552 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436305920 unmapped: 77545472 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436305920 unmapped: 77545472 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436305920 unmapped: 77545472 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4507994 data_alloc: 234881024 data_used: 20484096
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a15a7000/0x0/0x1bfc00000, data 0x30a5746/0x3297000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2a21f000 session 0x55bb29577a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a14960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d800 session 0x55bb29304b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436305920 unmapped: 77545472 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb279d3000 session 0x55bb27a14f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.592803955s of 20.105058670s, submitted: 88
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb36b69400 session 0x55bb27473a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb295d9800 session 0x55bb296150e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441475072 unmapped: 72376320 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 441475072 unmapped: 72376320 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dd000/0x0/0x1bfc00000, data 0x356e756/0x3761000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,18])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb29537e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850da40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d800 session 0x55bb2a067860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb279d3000 session 0x55bb2a0661e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb36b69400 session 0x55bb28063a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dd000/0x0/0x1bfc00000, data 0x356e756/0x3761000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4557976 data_alloc: 234881024 data_used: 20492288
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dd000/0x0/0x1bfc00000, data 0x356e756/0x3761000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dd000/0x0/0x1bfc00000, data 0x356e756/0x3761000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4558136 data_alloc: 234881024 data_used: 20496384
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb279d3000 session 0x55bb293254a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dd000/0x0/0x1bfc00000, data 0x356e756/0x3761000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb294dda40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d400 session 0x55bb295ed680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d800 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb29b7c800 session 0x55bb27a152c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.745285034s of 11.993100166s, submitted: 20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d400 session 0x55bb293505a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dc000/0x0/0x1bfc00000, data 0x356e766/0x3762000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 437256192 unmapped: 76595200 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4560179 data_alloc: 234881024 data_used: 20500480
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436920320 unmapped: 76931072 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436920320 unmapped: 76931072 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436920320 unmapped: 76931072 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436920320 unmapped: 76931072 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dc000/0x0/0x1bfc00000, data 0x356e766/0x3762000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436920320 unmapped: 76931072 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4587059 data_alloc: 234881024 data_used: 24100864
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436920320 unmapped: 76931072 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436920320 unmapped: 76931072 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dc000/0x0/0x1bfc00000, data 0x356e766/0x3762000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb27a1b2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb29bf5000 session 0x55bb282cab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435347456 unmapped: 78503936 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2fd81400 session 0x55bb295bb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb36b66c00 session 0x55bb292f3a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d400 session 0x55bb292f2960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.957603455s of 10.124045372s, submitted: 19
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb29bf5000 session 0x55bb29304000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2fd81400 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a10dc000/0x0/0x1bfc00000, data 0x356e766/0x3762000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb282cb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 433922048 unmapped: 79929344 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 433922048 unmapped: 79929344 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a0e3f000/0x0/0x1bfc00000, data 0x380c733/0x39fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4568583 data_alloc: 218103808 data_used: 17920000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 433627136 unmapped: 80224256 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435322880 unmapped: 78528512 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a073d000/0x0/0x1bfc00000, data 0x3f08733/0x40fa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435257344 unmapped: 78594048 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a0738000/0x0/0x1bfc00000, data 0x3f0c733/0x40fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,8])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435937280 unmapped: 77914112 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435552256 unmapped: 78299136 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4633519 data_alloc: 218103808 data_used: 19447808
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435552256 unmapped: 78299136 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb29b7a400 session 0x55bb296ba5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a06b0000/0x0/0x1bfc00000, data 0x3f9c733/0x418e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435552256 unmapped: 78299136 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2780d400 session 0x55bb296bb4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 heartbeat osd_stat(store_statfs(0x1a06b0000/0x0/0x1bfc00000, data 0x3f9c733/0x418e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435552256 unmapped: 78299136 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.377690315s of 10.624553680s, submitted: 145
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb29bf5000 session 0x55bb27a1b4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435511296 unmapped: 78340096 heap: 513851392 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb2850a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 379 handle_osd_map epochs [379,380], i have 379, src has [1,380]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x1a06b0000/0x0/0x1bfc00000, data 0x3f9c733/0x418e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29b7a400 session 0x55bb296ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435822592 unmapped: 86425600 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4699532 data_alloc: 218103808 data_used: 19456000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 435830784 unmapped: 86417408 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19fe66000/0x0/0x1bfc00000, data 0x47e339c/0x49d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436494336 unmapped: 85753856 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19fe66000/0x0/0x1bfc00000, data 0x47e339c/0x49d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436494336 unmapped: 85753856 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436494336 unmapped: 85753856 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436494336 unmapped: 85753856 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4764652 data_alloc: 234881024 data_used: 28524544
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436494336 unmapped: 85753856 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19fe66000/0x0/0x1bfc00000, data 0x47e339c/0x49d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436502528 unmapped: 85745664 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19fe5e000/0x0/0x1bfc00000, data 0x47ec39c/0x49e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19fe5e000/0x0/0x1bfc00000, data 0x47ec39c/0x49e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 436502528 unmapped: 85745664 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2b8dc000 session 0x55bb29e69860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e68780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 443252736 unmapped: 78995456 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19fe5e000/0x0/0x1bfc00000, data 0x47ec39c/0x49e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 443252736 unmapped: 78995456 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4784640 data_alloc: 234881024 data_used: 35868672
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.224851608s of 12.208698273s, submitted: 16
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 443252736 unmapped: 78995456 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462790656 unmapped: 59457536 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19f36c000/0x0/0x1bfc00000, data 0x52de39c/0x54d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,5,0,60])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19eab4000/0x0/0x1bfc00000, data 0x5b9639c/0x5d8a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1,0,0,31])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 448151552 unmapped: 74096640 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29b7a400 session 0x55bb29e69c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29bf5000 session 0x55bb2a160f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb295ba780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 452829184 unmapped: 69419008 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449503232 unmapped: 72744960 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb27872c00 session 0x55bb2a160d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2780d400 session 0x55bb295b5a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4966165 data_alloc: 234881024 data_used: 35880960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 448839680 unmapped: 73408512 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19e5c2000/0x0/0x1bfc00000, data 0x608839c/0x627c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29b7a400 session 0x55bb29351680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29bf5000 session 0x55bb2a0672c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb295edc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 448839680 unmapped: 73408512 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2bbd6c00 session 0x55bb27a161e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29b7a400 session 0x55bb2850d4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460365824 unmapped: 61882368 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19dcbd000/0x0/0x1bfc00000, data 0x698c3ab/0x6b81000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,6])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457277440 unmapped: 64970752 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 447569920 unmapped: 74678272 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082540 data_alloc: 234881024 data_used: 36605952
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2780d400 session 0x55bb296af680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29bf5c00 session 0x55bb2751ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb279d2c00 session 0x55bb297350e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb30ef4400 session 0x55bb2a1610e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e68b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 447635456 unmapped: 74612736 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449044480 unmapped: 73203712 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 3.390561819s of 11.727977753s, submitted: 127
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449069056 unmapped: 73179136 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449069056 unmapped: 73179136 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb279d2c00 session 0x55bb295ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19d8fb000/0x0/0x1bfc00000, data 0x6d4d3ce/0x6f43000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449069056 unmapped: 73179136 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5164193 data_alloc: 251658240 data_used: 45740032
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb29b7a400 session 0x55bb293503c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449069056 unmapped: 73179136 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 453615616 unmapped: 68632576 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 65945600 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 heartbeat osd_stat(store_statfs(0x19d8fb000/0x0/0x1bfc00000, data 0x6d4d3ce/0x6f43000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 65945600 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb2fd81400 session 0x55bb295770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb28729c00 session 0x55bb27a1a1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 65945600 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5258465 data_alloc: 268435456 data_used: 58949632
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 65945600 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 ms_handle_reset con 0x55bb279d2c00 session 0x55bb2a099e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460939264 unmapped: 61308928 heap: 522248192 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 handle_osd_map epochs [380,381], i have 380, src has [1,381]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 380 handle_osd_map epochs [381,381], i have 381, src has [1,381]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 381 ms_handle_reset con 0x55bb29b7a400 session 0x55bb27a22b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 381 ms_handle_reset con 0x55bb2fb88800 session 0x55bb292752c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 381 ms_handle_reset con 0x55bb2fd81400 session 0x55bb27a154a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.980182171s of 10.046480179s, submitted: 88
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 381 ms_handle_reset con 0x55bb2a1ec000 session 0x55bb294dcd20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471572480 unmapped: 65208320 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 381 handle_osd_map epochs [381,382], i have 381, src has [1,382]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 381 handle_osd_map epochs [382,382], i have 382, src has [1,382]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 382 ms_handle_reset con 0x55bb2a1ec000 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471613440 unmapped: 65167360 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 382 handle_osd_map epochs [382,383], i have 382, src has [1,383]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 382 handle_osd_map epochs [383,383], i have 383, src has [1,383]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 383 heartbeat osd_stat(store_statfs(0x19bbf3000/0x0/0x1bfc00000, data 0x8a48ce4/0x8c42000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 383 ms_handle_reset con 0x55bb279d2c00 session 0x55bb295ecf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 65126400 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5555236 data_alloc: 268435456 data_used: 72839168
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 65126400 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 383 handle_osd_map epochs [383,384], i have 383, src has [1,384]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 383 ms_handle_reset con 0x55bb29b7a400 session 0x55bb29614b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 384 handle_osd_map epochs [384,384], i have 384, src has [1,384]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 384 ms_handle_reset con 0x55bb2fb88800 session 0x55bb2747f680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474832896 unmapped: 61947904 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 384 ms_handle_reset con 0x55bb2fd81400 session 0x55bb282ca000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 384 heartbeat osd_stat(store_statfs(0x19b0b1000/0x0/0x1bfc00000, data 0x95885ce/0x9784000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474660864 unmapped: 62119936 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474759168 unmapped: 62021632 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 384 handle_osd_map epochs [384,385], i have 384, src has [1,385]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 62595072 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5637998 data_alloc: 285212672 data_used: 74248192
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 62595072 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19b0a7000/0x0/0x1bfc00000, data 0x959910d/0x9796000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 62595072 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 62595072 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19b0a7000/0x0/0x1bfc00000, data 0x959910d/0x9796000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 62595072 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.160793304s of 11.584527969s, submitted: 182
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19b0a7000/0x0/0x1bfc00000, data 0x959910d/0x9796000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 62595072 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5638174 data_alloc: 285212672 data_used: 74248192
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb279d2c00 session 0x55bb280621e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468819968 unmapped: 67960832 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29bf5c00 session 0x55bb296174a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29b51800 session 0x55bb295ed4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19c76a000/0x0/0x1bfc00000, data 0x7ed710d/0x80d4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29b7a400 session 0x55bb26cb8f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19df75000/0x0/0x1bfc00000, data 0x66cd0db/0x68c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19df75000/0x0/0x1bfc00000, data 0x66cd0db/0x68c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5081715 data_alloc: 234881024 data_used: 35815424
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19df75000/0x0/0x1bfc00000, data 0x66cd0db/0x68c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5081715 data_alloc: 234881024 data_used: 35815424
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19df75000/0x0/0x1bfc00000, data 0x66cd0db/0x68c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461176832 unmapped: 75603968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb2780d800 session 0x55bb2747f2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.727203369s of 14.108214378s, submitted: 92
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb279d3000 session 0x55bb297345a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb279d2c00 session 0x55bb27a170e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456548352 unmapped: 80232448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456548352 unmapped: 80232448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4948454 data_alloc: 234881024 data_used: 31674368
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456548352 unmapped: 80232448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19ebfb000/0x0/0x1bfc00000, data 0x5a4a0bb/0x5c43000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19ebfb000/0x0/0x1bfc00000, data 0x5a4a0bb/0x5c43000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456548352 unmapped: 80232448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456548352 unmapped: 80232448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456548352 unmapped: 80232448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 heartbeat osd_stat(store_statfs(0x19ebfb000/0x0/0x1bfc00000, data 0x5a4a0bb/0x5c43000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29b51800 session 0x55bb297350e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29bf5c00 session 0x55bb27a161e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29b7a400 session 0x55bb2850d4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb279d2c00 session 0x55bb29351680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb279d3000 session 0x55bb295ba780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29b51800 session 0x55bb29e68780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456548352 unmapped: 80232448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb29bf5c00 session 0x55bb2850a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040640 data_alloc: 251658240 data_used: 53006336
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb2a1ec000 session 0x55bb2751ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb2a1ec000 session 0x55bb294dda40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468869120 unmapped: 67911680 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb279d2c00 session 0x55bb294daf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 ms_handle_reset con 0x55bb279d3000 session 0x55bb292745a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468869120 unmapped: 67911680 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 385 handle_osd_map epochs [385,386], i have 385, src has [1,386]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 386 ms_handle_reset con 0x55bb29b51800 session 0x55bb2a067680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 386 ms_handle_reset con 0x55bb29bf5c00 session 0x55bb2a160d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 386 heartbeat osd_stat(store_statfs(0x19ddf3000/0x0/0x1bfc00000, data 0x684fd68/0x6a4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469041152 unmapped: 67739648 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 386 heartbeat osd_stat(store_statfs(0x19ddf3000/0x0/0x1bfc00000, data 0x684fd68/0x6a4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.397292137s of 10.868457794s, submitted: 89
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469073920 unmapped: 67706880 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 386 handle_osd_map epochs [386,387], i have 386, src has [1,387]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 ms_handle_reset con 0x55bb29b51800 session 0x55bb2a23e000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461553664 unmapped: 75227136 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4954987 data_alloc: 234881024 data_used: 35934208
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461553664 unmapped: 75227136 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461553664 unmapped: 75227136 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 heartbeat osd_stat(store_statfs(0x19ee7b000/0x0/0x1bfc00000, data 0x53b6a44/0x55b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 ms_handle_reset con 0x55bb29bf5000 session 0x55bb282ca5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb2a161e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 ms_handle_reset con 0x55bb2780d400 session 0x55bb294dba40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 heartbeat osd_stat(store_statfs(0x19ee7b000/0x0/0x1bfc00000, data 0x53b6a44/0x55b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461553664 unmapped: 75227136 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 ms_handle_reset con 0x55bb2a1ec000 session 0x55bb29e69e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461553664 unmapped: 75227136 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 heartbeat osd_stat(store_statfs(0x19ee7c000/0x0/0x1bfc00000, data 0x53b6a44/0x55b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 387 handle_osd_map epochs [387,388], i have 387, src has [1,388]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461561856 unmapped: 75218944 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4959309 data_alloc: 234881024 data_used: 36061184
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19ee78000/0x0/0x1bfc00000, data 0x53b85bb/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.128215790s of 10.335646629s, submitted: 80
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4970637 data_alloc: 234881024 data_used: 36892672
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19ee79000/0x0/0x1bfc00000, data 0x53b85bb/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461594624 unmapped: 75186176 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 74137600 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4971149 data_alloc: 234881024 data_used: 36892672
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462643200 unmapped: 74137600 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19ee79000/0x0/0x1bfc00000, data 0x53b85bb/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462659584 unmapped: 74121216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462659584 unmapped: 74121216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462659584 unmapped: 74121216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462659584 unmapped: 74121216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.817487717s of 10.957990646s, submitted: 25
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4988635 data_alloc: 251658240 data_used: 39628800
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19ee79000/0x0/0x1bfc00000, data 0x53b85bb/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462659584 unmapped: 74121216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d2c00 session 0x55bb29576960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d3000 session 0x55bb295b5e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462659584 unmapped: 74121216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb2747e1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457277440 unmapped: 79503360 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2fb88800 session 0x55bb274734a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457302016 unmapped: 79478784 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19fca1000/0x0/0x1bfc00000, data 0x4590598/0x478c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457310208 unmapped: 79470592 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4832118 data_alloc: 234881024 data_used: 33005568
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457310208 unmapped: 79470592 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457310208 unmapped: 79470592 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 79454208 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb29b7b800 session 0x55bb29615a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d2c00 session 0x55bb2747da40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d3000 session 0x55bb295ba3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb2a1605a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2fb88800 session 0x55bb2747da40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 79290368 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 79290368 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23f000/0x0/0x1bfc00000, data 0x4ff3598/0x51ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4917248 data_alloc: 234881024 data_used: 33005568
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 79290368 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23f000/0x0/0x1bfc00000, data 0x4ff3598/0x51ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23f000/0x0/0x1bfc00000, data 0x4ff3598/0x51ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 79290368 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 79290368 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 79290368 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 79290368 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4917248 data_alloc: 234881024 data_used: 33005568
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.165901184s of 15.479522705s, submitted: 70
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 79372288 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2a1ed400 session 0x55bb2747e1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 79372288 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d2c00 session 0x55bb295b5e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23f000/0x0/0x1bfc00000, data 0x4ff3598/0x51ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 79372288 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23f000/0x0/0x1bfc00000, data 0x4ff3598/0x51ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d3000 session 0x55bb29576960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457424896 unmapped: 79355904 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb29e69e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457424896 unmapped: 79355904 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4922478 data_alloc: 234881024 data_used: 33746944
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457424896 unmapped: 79355904 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23e000/0x0/0x1bfc00000, data 0x4ff35a8/0x51f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 457424896 unmapped: 79355904 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb29b51c00 session 0x55bb28062d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2e506400 session 0x55bb292d9860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2d87c800 session 0x55bb293050e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455393280 unmapped: 81387520 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455393280 unmapped: 81387520 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23e000/0x0/0x1bfc00000, data 0x4ff35a8/0x51f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455393280 unmapped: 81387520 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4980674 data_alloc: 251658240 data_used: 45002752
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 454049792 unmapped: 82731008 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.944196701s of 11.008917809s, submitted: 6
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 454049792 unmapped: 82731008 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb29b51c00 session 0x55bb295ed4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 454049792 unmapped: 82731008 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f23c000/0x0/0x1bfc00000, data 0x4ff361a/0x51f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 454049792 unmapped: 82731008 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 454049792 unmapped: 82731008 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4985406 data_alloc: 251658240 data_used: 45002752
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 454049792 unmapped: 82731008 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455999488 unmapped: 80781312 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19ea20000/0x0/0x1bfc00000, data 0x580f61a/0x5a0e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455999488 unmapped: 80781312 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d2c00 session 0x55bb29304f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d3000 session 0x55bb295ec000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2a1ec400 session 0x55bb29617860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456015872 unmapped: 80764928 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d2c00 session 0x55bb282ca000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d3000 session 0x55bb28063680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 80732160 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5097323 data_alloc: 251658240 data_used: 45273088
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb29b51c00 session 0x55bb27a23860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456220672 unmapped: 80560128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2d87c800 session 0x55bb2747c960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456220672 unmapped: 80560128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19e4b3000/0x0/0x1bfc00000, data 0x5d7c61a/0x5f7b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456220672 unmapped: 80560128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456220672 unmapped: 80560128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb29bf5000 session 0x55bb2a1610e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456220672 unmapped: 80560128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5095860 data_alloc: 251658240 data_used: 45277184
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19e4b3000/0x0/0x1bfc00000, data 0x5d7c61a/0x5f7b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456220672 unmapped: 80560128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.754028320s of 15.305920601s, submitted: 141
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d2c00 session 0x55bb2a1601e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456220672 unmapped: 80560128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb29b51c00 session 0x55bb2a161680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d3000 session 0x55bb27a1ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456228864 unmapped: 80551936 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2780d400 session 0x55bb28062b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb29b51800 session 0x55bb2a23eb40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 84189184 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 452526080 unmapped: 84254720 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4853740 data_alloc: 234881024 data_used: 31211520
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a141e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 452526080 unmapped: 84254720 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 heartbeat osd_stat(store_statfs(0x19f7aa000/0x0/0x1bfc00000, data 0x4a8561a/0x4c84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 452534272 unmapped: 84246528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 452534272 unmapped: 84246528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 452534272 unmapped: 84246528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d2c00 session 0x55bb27a1b860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 ms_handle_reset con 0x55bb279d3000 session 0x55bb294def00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455950336 unmapped: 80830464 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4864620 data_alloc: 234881024 data_used: 34750464
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 388 handle_osd_map epochs [388,389], i have 388, src has [1,389]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455950336 unmapped: 80830464 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 389 ms_handle_reset con 0x55bb29b51c00 session 0x55bb296161e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449896448 unmapped: 86884352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.413146019s of 10.642684937s, submitted: 77
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 389 heartbeat osd_stat(store_statfs(0x19ffa8000/0x0/0x1bfc00000, data 0x42872b7/0x4486000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 389 ms_handle_reset con 0x55bb29e5a400 session 0x55bb296ba000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 449896448 unmapped: 86884352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 389 ms_handle_reset con 0x55bb2780d400 session 0x55bb29735860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 389 ms_handle_reset con 0x55bb279d2c00 session 0x55bb280623c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 389 handle_osd_map epochs [389,390], i have 389, src has [1,390]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 450945024 unmapped: 85835776 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 389 handle_osd_map epochs [390,390], i have 390, src has [1,390]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 390 ms_handle_reset con 0x55bb2d87c800 session 0x55bb297343c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 450945024 unmapped: 85835776 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 390 ms_handle_reset con 0x55bb279d3000 session 0x55bb29e68d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4834821 data_alloc: 234881024 data_used: 27422720
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 390 ms_handle_reset con 0x55bb29b51c00 session 0x55bb294da3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 450945024 unmapped: 85835776 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 450945024 unmapped: 85835776 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451231744 unmapped: 85549056 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 390 heartbeat osd_stat(store_statfs(0x19fb0f000/0x0/0x1bfc00000, data 0x471cf4f/0x491e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.1 total, 600.0 interval
Cumulative writes: 61K writes, 243K keys, 61K commit groups, 1.0 writes per commit group, ingest: 0.23 GB, 0.04 MB/s
Cumulative WAL: 61K writes, 22K syncs, 2.75 writes per sync, written: 0.23 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 7872 writes, 30K keys, 7872 commit groups, 1.0 writes per commit group, ingest: 29.88 MB, 0.05 MB/s
Interval WAL: 7871 writes, 3051 syncs, 2.58 writes per sync, written: 0.03 GB, 0.05 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bb25ed5610#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction,
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 390 ms_handle_reset con 0x55bb279d3000 session 0x55bb282ca960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451264512 unmapped: 85516288 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 390 handle_osd_map epochs [390,391], i have 390, src has [1,391]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451264512 unmapped: 85516288 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4877911 data_alloc: 234881024 data_used: 32944128
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 ms_handle_reset con 0x55bb2d87c800 session 0x55bb2751ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451739648 unmapped: 85041152 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 heartbeat osd_stat(store_statfs(0x19ef21000/0x0/0x1bfc00000, data 0x530aa8e/0x550d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451739648 unmapped: 85041152 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 heartbeat osd_stat(store_statfs(0x19ef21000/0x0/0x1bfc00000, data 0x530aa8e/0x550d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b7cf9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451739648 unmapped: 85041152 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 ms_handle_reset con 0x55bb2b55a400 session 0x55bb2968fe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 ms_handle_reset con 0x55bb29b52800 session 0x55bb294de000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 ms_handle_reset con 0x55bb27883000 session 0x55bb293505a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451739648 unmapped: 85041152 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.172283173s of 12.346176147s, submitted: 72
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 ms_handle_reset con 0x55bb279d3000 session 0x55bb295763c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 451739648 unmapped: 85041152 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4977913 data_alloc: 234881024 data_used: 32944128
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 handle_osd_map epochs [391,392], i have 391, src has [1,392]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 handle_osd_map epochs [392,392], i have 392, src has [1,392]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 391 ms_handle_reset con 0x55bb29b52800 session 0x55bb29275e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455286784 unmapped: 81494016 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 455286784 unmapped: 81494016 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2b55a400 session 0x55bb2a1605a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 456900608 unmapped: 79880192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2d87c800 session 0x55bb29617e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19d978000/0x0/0x1bfc00000, data 0x5711749/0x5916000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb36b67800 session 0x55bb29616b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459276288 unmapped: 77504512 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d3000 session 0x55bb27a17c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b52800 session 0x55bb28062960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2b55a400 session 0x55bb27a1be00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2d87c800 session 0x55bb29e681e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459292672 unmapped: 77488128 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5129270 data_alloc: 251658240 data_used: 38141952
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459374592 unmapped: 77406208 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460611584 unmapped: 76169216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19cd48000/0x0/0x1bfc00000, data 0x633f7bb/0x6546000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460611584 unmapped: 76169216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460611584 unmapped: 76169216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460611584 unmapped: 76169216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5137238 data_alloc: 251658240 data_used: 38412288
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.766183853s of 11.144976616s, submitted: 142
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460619776 unmapped: 76161024 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460619776 unmapped: 76161024 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460627968 unmapped: 76152832 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19cd48000/0x0/0x1bfc00000, data 0x633f7bb/0x6546000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279cac00 session 0x55bb294f9860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460627968 unmapped: 76152832 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460627968 unmapped: 76152832 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5136822 data_alloc: 251658240 data_used: 38408192
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460627968 unmapped: 76152832 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d3000 session 0x55bb295bbe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19cd48000/0x0/0x1bfc00000, data 0x633f7bb/0x6546000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460627968 unmapped: 76152832 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b52800 session 0x55bb29e68960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2780d400 session 0x55bb27472d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d2c00 session 0x55bb2968e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460636160 unmapped: 76144640 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2b55a400 session 0x55bb295372c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460636160 unmapped: 76144640 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460636160 unmapped: 76144640 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5138769 data_alloc: 251658240 data_used: 38408192
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.692634583s of 10.237747192s, submitted: 31
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460644352 unmapped: 76136448 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b52800 session 0x55bb293512c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d3000 session 0x55bb27a16960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462200832 unmapped: 74579968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19cd46000/0x0/0x1bfc00000, data 0x633f7ee/0x6548000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2d87c800 session 0x55bb29e68780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2978b800 session 0x55bb295b5680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b81800 session 0x55bb292743c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d3000 session 0x55bb2a23f2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 464986112 unmapped: 71794688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465256448 unmapped: 71524352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2978b800 session 0x55bb27a150e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b52800 session 0x55bb29614b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2d87c800 session 0x55bb29274780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465256448 unmapped: 71524352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5301068 data_alloc: 251658240 data_used: 49684480
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2cefa800 session 0x55bb29617c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d3000 session 0x55bb295bb2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465256448 unmapped: 71524352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19c364000/0x0/0x1bfc00000, data 0x6d207fe/0x6f2a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465256448 unmapped: 71524352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465256448 unmapped: 71524352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465256448 unmapped: 71524352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19c364000/0x0/0x1bfc00000, data 0x6d207fe/0x6f2a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2978b800 session 0x55bb296ba5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465264640 unmapped: 71516160 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5293282 data_alloc: 251658240 data_used: 49577984
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b52800 session 0x55bb29735680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465264640 unmapped: 71516160 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19c387000/0x0/0x1bfc00000, data 0x6cfc80d/0x6f07000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.412647247s of 10.589317322s, submitted: 34
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2d87c800 session 0x55bb2a067860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19c363000/0x0/0x1bfc00000, data 0x6d2080d/0x6f2b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 465264640 unmapped: 71516160 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470392832 unmapped: 66387968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb297a1400 session 0x55bb29617e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19c1f5000/0x0/0x1bfc00000, data 0x6e8e80d/0x7099000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d3000 session 0x55bb294f8000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470925312 unmapped: 65855488 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475275264 unmapped: 61505536 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5460866 data_alloc: 268435456 data_used: 57360384
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2978b800 session 0x55bb29537e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b52800 session 0x55bb29576b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475480064 unmapped: 61300736 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475496448 unmapped: 61284352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475512832 unmapped: 61267968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19a98b000/0x0/0x1bfc00000, data 0x754f830/0x775b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475512832 unmapped: 61267968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480034816 unmapped: 56745984 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2ab65800 session 0x55bb28063c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279cb800 session 0x55bb295ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5532320 data_alloc: 268435456 data_used: 67592192
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19a98b000/0x0/0x1bfc00000, data 0x754f830/0x775b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279cb800 session 0x55bb2850c5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19a996000/0x0/0x1bfc00000, data 0x754c830/0x7758000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5530164 data_alloc: 268435456 data_used: 67588096
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb279d3000 session 0x55bb27a22f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.147744179s of 14.091394424s, submitted: 132
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb2978b800 session 0x55bb27a1ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19a996000/0x0/0x1bfc00000, data 0x754c830/0x7758000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481771520 unmapped: 55009280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5550290 data_alloc: 268435456 data_used: 67596288
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482746368 unmapped: 54034432 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 heartbeat osd_stat(store_statfs(0x19a719000/0x0/0x1bfc00000, data 0x77c9830/0x79d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 ms_handle_reset con 0x55bb29b52800 session 0x55bb294f9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484065280 unmapped: 52715520 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 handle_osd_map epochs [392,393], i have 392, src has [1,393]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 392 handle_osd_map epochs [393,393], i have 393, src has [1,393]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 51224576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 ms_handle_reset con 0x55bb2ab65800 session 0x55bb2a1605a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 heartbeat osd_stat(store_statfs(0x19a556000/0x0/0x1bfc00000, data 0x798347b/0x7b8f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 51224576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 51224576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0571429
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1174405120 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1107296256 meta_used: 5577368 data_alloc: 268435456 data_used: 68186112
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 ms_handle_reset con 0x55bb279d3000 session 0x55bb295774a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.472976685s of 10.119526863s, submitted: 86
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 ms_handle_reset con 0x55bb279cb800 session 0x55bb27a150e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480747520 unmapped: 56033280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 ms_handle_reset con 0x55bb2978b800 session 0x55bb292743c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480747520 unmapped: 56033280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480747520 unmapped: 56033280 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 ms_handle_reset con 0x55bb29b52800 session 0x55bb296161e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 ms_handle_reset con 0x55bb27993400 session 0x55bb28063680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 heartbeat osd_stat(store_statfs(0x19b14b000/0x0/0x1bfc00000, data 0x6d9747b/0x6fa3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 393 handle_osd_map epochs [393,394], i have 393, src has [1,394]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481058816 unmapped: 55721984 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481058816 unmapped: 55721984 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5408705 data_alloc: 268435456 data_used: 57008128
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2978b800 session 0x55bb2747fc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19b104000/0x0/0x1bfc00000, data 0x6ddafdd/0x6fe9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481075200 unmapped: 55705600 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb29b52800 session 0x55bb2a23fa40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 55697408 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 55697408 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 55697408 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19b104000/0x0/0x1bfc00000, data 0x6ddafdd/0x6fe9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 55697408 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5408845 data_alloc: 268435456 data_used: 57716736
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19b0fb000/0x0/0x1bfc00000, data 0x6de4fdd/0x6ff3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 55697408 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.377502441s of 10.793389320s, submitted: 41
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb27883c00 session 0x55bb296aef00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 55697408 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb295b7c00 session 0x55bb294db680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb29b80400 session 0x55bb29e683c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 55697408 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb27883c00 session 0x55bb295bb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19bf37000/0x0/0x1bfc00000, data 0x5fa9faa/0x61b6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5224681 data_alloc: 251658240 data_used: 46977024
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5245749 data_alloc: 251658240 data_used: 48820224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb295b7c00 session 0x55bb2850dc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19bf38000/0x0/0x1bfc00000, data 0x5fa9faa/0x61b6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19bf32000/0x0/0x1bfc00000, data 0x5faffaa/0x61bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.392992020s of 10.005044937s, submitted: 98
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2978b800 session 0x55bb296bba40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb29b52800 session 0x55bb292f2d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19b4b2000/0x0/0x1bfc00000, data 0x6a2ffaa/0x6c3c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5334620 data_alloc: 251658240 data_used: 48812032
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2fb88800 session 0x55bb282ca5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb29bf4000 session 0x55bb2a23e000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478560256 unmapped: 58220544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2780d400 session 0x55bb294def00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb279d2c00 session 0x55bb280623c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2d87c800 session 0x55bb29e692c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478568448 unmapped: 58212352 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474300416 unmapped: 62480384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb279cb800 session 0x55bb29350f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb279d3000 session 0x55bb29351e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb27883c00 session 0x55bb294dba40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19c739000/0x0/0x1bfc00000, data 0x57a9f9a/0x59b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850dc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474349568 unmapped: 62431232 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5118365 data_alloc: 251658240 data_used: 43204608
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474398720 unmapped: 62382080 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.672562599s of 10.000566483s, submitted: 260
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474456064 unmapped: 62324736 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb295b7c00 session 0x55bb29735680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474464256 unmapped: 62316544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470032384 unmapped: 66748416 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2978b800 session 0x55bb292745a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19d1f7000/0x0/0x1bfc00000, data 0x4cedf05/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470032384 unmapped: 66748416 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4968807 data_alloc: 234881024 data_used: 34607104
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470040576 unmapped: 66740224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470040576 unmapped: 66740224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19d1f7000/0x0/0x1bfc00000, data 0x4cedf05/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470048768 unmapped: 66732032 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19d1f7000/0x0/0x1bfc00000, data 0x4cedf05/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470048768 unmapped: 66732032 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470048768 unmapped: 66732032 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4969447 data_alloc: 234881024 data_used: 34623488
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471113728 unmapped: 65667072 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471097344 unmapped: 65683456 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471097344 unmapped: 65683456 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471097344 unmapped: 65683456 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a067a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19d1f8000/0x0/0x1bfc00000, data 0x4cedf05/0x4ef6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb27883c00 session 0x55bb2a160780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471097344 unmapped: 65683456 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4980847 data_alloc: 234881024 data_used: 35708928
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb279d3000 session 0x55bb294dc000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.569890976s of 14.646986008s, submitted: 127
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471105536 unmapped: 65675264 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb295b7c00 session 0x55bb295b4780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471064576 unmapped: 65716224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471228416 unmapped: 65552384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19d1f7000/0x0/0x1bfc00000, data 0x4cedf15/0x4ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5057633 data_alloc: 251658240 data_used: 46551040
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19d1f7000/0x0/0x1bfc00000, data 0x4cedf15/0x4ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5057105 data_alloc: 251658240 data_used: 46551040
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19d1f7000/0x0/0x1bfc00000, data 0x4cedf15/0x4ef7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471236608 unmapped: 65544192 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.735871315s of 11.856009483s, submitted: 8
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474243072 unmapped: 62537728 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19cee7000/0x0/0x1bfc00000, data 0x4ffdf15/0x5207000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 59744256 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477044736 unmapped: 59736064 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5161677 data_alloc: 251658240 data_used: 46542848
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 60497920 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb29bf4000 session 0x55bb2a160d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 60497920 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 60489728 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a161a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477421568 unmapped: 59359232 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19c023000/0x0/0x1bfc00000, data 0x5aaff15/0x5cb9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477429760 unmapped: 59351040 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5166495 data_alloc: 251658240 data_used: 47095808
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477437952 unmapped: 59342848 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477388800 unmapped: 59392000 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb27883c00 session 0x55bb29616d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477388800 unmapped: 59392000 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.980478287s of 11.006450653s, submitted: 147
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19c00e000/0x0/0x1bfc00000, data 0x5ac7f06/0x5cd0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477388800 unmapped: 59392000 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb279d2c00 session 0x55bb295b50e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 ms_handle_reset con 0x55bb2d87c800 session 0x55bb2a23fa40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477396992 unmapped: 59383808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5172517 data_alloc: 251658240 data_used: 47411200
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477405184 unmapped: 59375616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477405184 unmapped: 59375616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 heartbeat osd_stat(store_statfs(0x19c00f000/0x0/0x1bfc00000, data 0x5ac7ef6/0x5ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474759168 unmapped: 62021632 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474931200 unmapped: 61849600 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 394 handle_osd_map epochs [394,395], i have 394, src has [1,395]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 ms_handle_reset con 0x55bb29b50800 session 0x55bb280630e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 ms_handle_reset con 0x55bb29b50800 session 0x55bb2a161860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 ms_handle_reset con 0x55bb279d3000 session 0x55bb297343c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 ms_handle_reset con 0x55bb295b7c00 session 0x55bb295eda40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 ms_handle_reset con 0x55bb2780d400 session 0x55bb292d94a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 ms_handle_reset con 0x55bb27883c00 session 0x55bb2747dc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 ms_handle_reset con 0x55bb2780d400 session 0x55bb2968fa40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474947584 unmapped: 61833216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 heartbeat osd_stat(store_statfs(0x19c72c000/0x0/0x1bfc00000, data 0x53a6b7e/0x55ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5031519 data_alloc: 234881024 data_used: 34050048
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474947584 unmapped: 61833216 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474669056 unmapped: 62111744 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 395 handle_osd_map epochs [395,396], i have 395, src has [1,396]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 396 ms_handle_reset con 0x55bb279d3000 session 0x55bb29e69e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474677248 unmapped: 62103552 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474685440 unmapped: 62095360 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474685440 unmapped: 62095360 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5034254 data_alloc: 234881024 data_used: 33599488
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 396 heartbeat osd_stat(store_statfs(0x19c72b000/0x0/0x1bfc00000, data 0x53a87f5/0x55b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474693632 unmapped: 62087168 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.442606926s of 13.067510605s, submitted: 99
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474693632 unmapped: 62087168 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 396 handle_osd_map epochs [396,397], i have 396, src has [1,397]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 62046208 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 397 ms_handle_reset con 0x55bb295b7c00 session 0x55bb2a161c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 397 heartbeat osd_stat(store_statfs(0x19c728000/0x0/0x1bfc00000, data 0x53aa4a2/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468918272 unmapped: 67862528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 397 ms_handle_reset con 0x55bb29b50800 session 0x55bb294da000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468918272 unmapped: 67862528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4995036 data_alloc: 234881024 data_used: 28954624
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 397 ms_handle_reset con 0x55bb279d2c00 session 0x55bb29e694a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468918272 unmapped: 67862528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 397 ms_handle_reset con 0x55bb2780d400 session 0x55bb29274b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 397 ms_handle_reset con 0x55bb279d2c00 session 0x55bb2747cb40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468918272 unmapped: 67862528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468918272 unmapped: 67862528 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 397 handle_osd_map epochs [397,398], i have 397, src has [1,398]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19cb94000/0x0/0x1bfc00000, data 0x4f3bff1/0x5149000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5084621 data_alloc: 251658240 data_used: 40611840
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5084621 data_alloc: 251658240 data_used: 40611840
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.443560600s of 13.701803207s, submitted: 71
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19cb94000/0x0/0x1bfc00000, data 0x4f3bff1/0x5149000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19cb94000/0x0/0x1bfc00000, data 0x4f3bff1/0x5149000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb279d3000 session 0x55bb292f34a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb295b7c00 session 0x55bb2a23e000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470368256 unmapped: 66412544 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460513280 unmapped: 76267520 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2b7b7400 session 0x55bb26cb9a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2d87c800 session 0x55bb27a145a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb29b50800 session 0x55bb292f34a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460521472 unmapped: 76259328 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19d7db000/0x0/0x1bfc00000, data 0x42faf7f/0x4503000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4913310 data_alloc: 234881024 data_used: 28966912
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460521472 unmapped: 76259328 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460521472 unmapped: 76259328 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460521472 unmapped: 76259328 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460521472 unmapped: 76259328 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19d7da000/0x0/0x1bfc00000, data 0x4525f5c/0x4503000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460521472 unmapped: 76259328 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4931880 data_alloc: 234881024 data_used: 28966912
Jan 31 04:20:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460521472 unmapped: 76259328 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a1ab40
Jan 31 04:20:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:22.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2b7b7400 session 0x55bb2a23fa40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460480512 unmapped: 76300288 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb279d2c00 session 0x55bb27a1a3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.576138496s of 12.279341698s, submitted: 43
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459268096 unmapped: 77512704 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459276288 unmapped: 77504512 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19d7db000/0x0/0x1bfc00000, data 0x4525f5c/0x4503000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2b7b7400 session 0x55bb292d92c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459276288 unmapped: 77504512 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4936334 data_alloc: 234881024 data_used: 29081600
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459276288 unmapped: 77504512 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19d7c1000/0x0/0x1bfc00000, data 0x456cf5c/0x451d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459284480 unmapped: 77496320 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459284480 unmapped: 77496320 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2d87c800 session 0x55bb2a23e960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb279d3000 session 0x55bb26cb83c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460349440 unmapped: 76431360 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb295b7c00 session 0x55bb296bb680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460472320 unmapped: 76308480 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4976173 data_alloc: 234881024 data_used: 29081600
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19d4a3000/0x0/0x1bfc00000, data 0x488af5c/0x483b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460472320 unmapped: 76308480 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19d4a3000/0x0/0x1bfc00000, data 0x488af5c/0x483b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460472320 unmapped: 76308480 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2e506800 session 0x55bb2a23f860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2e506800 session 0x55bb295770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb279d3000 session 0x55bb2a098b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460472320 unmapped: 76308480 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb295b7c00 session 0x55bb297350e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.535288811s of 10.724942207s, submitted: 44
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2b7b7400 session 0x55bb2a160f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2d87c800 session 0x55bb2a067860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb2d87c800 session 0x55bb2a160000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb279d3000 session 0x55bb26cb8f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 ms_handle_reset con 0x55bb295b7c00 session 0x55bb295ecf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 heartbeat osd_stat(store_statfs(0x19d038000/0x0/0x1bfc00000, data 0x4cf5f5c/0x4ca6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 459997184 unmapped: 76783616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 handle_osd_map epochs [398,399], i have 398, src has [1,399]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 398 handle_osd_map epochs [399,399], i have 399, src has [1,399]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460013568 unmapped: 76767232 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5049660 data_alloc: 234881024 data_used: 30433280
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 399 handle_osd_map epochs [399,400], i have 399, src has [1,400]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 400 ms_handle_reset con 0x55bb2b7b7400 session 0x55bb29274780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 460021760 unmapped: 76759040 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 400 handle_osd_map epochs [400,401], i have 400, src has [1,401]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463708160 unmapped: 73072640 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 401 ms_handle_reset con 0x55bb2cd66800 session 0x55bb295ed680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 401 heartbeat osd_stat(store_statfs(0x19ce79000/0x0/0x1bfc00000, data 0x4eaf8d7/0x4e64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 401 ms_handle_reset con 0x55bb2e506800 session 0x55bb2747c5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 401 handle_osd_map epochs [401,402], i have 401, src has [1,402]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 401 handle_osd_map epochs [402,402], i have 402, src has [1,402]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 ms_handle_reset con 0x55bb29e5a000 session 0x55bb27473c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 ms_handle_reset con 0x55bb29b7b400 session 0x55bb295b4780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463126528 unmapped: 73654272 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463126528 unmapped: 73654272 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 heartbeat osd_stat(store_statfs(0x19c7cf000/0x0/0x1bfc00000, data 0x5557223/0x550f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 ms_handle_reset con 0x55bb2780c800 session 0x55bb294f9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 ms_handle_reset con 0x55bb2d87c400 session 0x55bb280630e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 ms_handle_reset con 0x55bb29694c00 session 0x55bb29350f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463626240 unmapped: 73154560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5129093 data_alloc: 234881024 data_used: 30834688
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 heartbeat osd_stat(store_statfs(0x19c7cc000/0x0/0x1bfc00000, data 0x55572a5/0x5512000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463626240 unmapped: 73154560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 heartbeat osd_stat(store_statfs(0x19c7cc000/0x0/0x1bfc00000, data 0x55572a5/0x5512000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 ms_handle_reset con 0x55bb29e5a000 session 0x55bb274730e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463896576 unmapped: 72884224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 heartbeat osd_stat(store_statfs(0x19c777000/0x0/0x1bfc00000, data 0x55ac2a5/0x5567000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 ms_handle_reset con 0x55bb2e506800 session 0x55bb2a1610e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462946304 unmapped: 73834496 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 402 handle_osd_map epochs [402,403], i have 402, src has [1,403]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.445144653s of 10.668481827s, submitted: 93
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c56d000/0x0/0x1bfc00000, data 0x57b62a5/0x5771000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462946304 unmapped: 73834496 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb27675c00 session 0x55bb27a16960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2cefc000 session 0x55bb2751ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c56d000/0x0/0x1bfc00000, data 0x57b62a5/0x5771000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29b7b400 session 0x55bb295763c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462946304 unmapped: 73834496 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5182918 data_alloc: 234881024 data_used: 35315712
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2d87c400 session 0x55bb27a150e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb280623c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb27675c00 session 0x55bb2968e960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c9d6000/0x0/0x1bfc00000, data 0x534cf26/0x5308000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5116563 data_alloc: 234881024 data_used: 30785536
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb294daf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb29274960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb27675c00 session 0x55bb2850c000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29b7b400 session 0x55bb294dbe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2d87c400 session 0x55bb274730e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2d87c400 session 0x55bb294f9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb27675c00 session 0x55bb27473c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a160000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5172177 data_alloc: 234881024 data_used: 30785536
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c3bb000/0x0/0x1bfc00000, data 0x5966f88/0x5923000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c3bb000/0x0/0x1bfc00000, data 0x5966f88/0x5923000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 74735616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5172177 data_alloc: 234881024 data_used: 30785536
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c3bb000/0x0/0x1bfc00000, data 0x5966f88/0x5923000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 74735616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 74735616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.317159653s of 18.586101532s, submitted: 71
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb2a098b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29e5a000 session 0x55bb2850de00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462192640 unmapped: 74588160 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780d400 session 0x55bb29616d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb27a16b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29b50800 session 0x55bb294dd2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb29350000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462200832 unmapped: 74579968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462602240 unmapped: 74178560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5214363 data_alloc: 234881024 data_used: 36044800
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29e5a000 session 0x55bb2747c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a161e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c396000/0x0/0x1bfc00000, data 0x598afab/0x5948000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462602240 unmapped: 74178560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c371000/0x0/0x1bfc00000, data 0x59aefba/0x596d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462602240 unmapped: 74178560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29b50800 session 0x55bb29536b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463028224 unmapped: 73752576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2d87c400 session 0x55bb29735e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c596000/0x0/0x1bfc00000, data 0x5789fba/0x5748000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463028224 unmapped: 73752576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2e506800 session 0x55bb27a22780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463028224 unmapped: 73752576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5214697 data_alloc: 251658240 data_used: 38395904
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 handle_osd_map epochs [403,404], i have 403, src has [1,404]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 403 handle_osd_map epochs [404,404], i have 404, src has [1,404]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2cf43c00 session 0x55bb2a160780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb29b52800 session 0x55bb292f3e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2b7b7800 session 0x55bb29577680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2780c800 session 0x55bb26cb9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 heartbeat osd_stat(store_statfs(0x19c5b1000/0x0/0x1bfc00000, data 0x5515bf5/0x572d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.468164444s of 11.910409927s, submitted: 76
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463290368 unmapped: 73490432 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5247418 data_alloc: 251658240 data_used: 38752256
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb29b50800 session 0x55bb28063c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2d87c400 session 0x55bb2968e780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461103104 unmapped: 75677696 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461103104 unmapped: 75677696 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 heartbeat osd_stat(store_statfs(0x19d960000/0x0/0x1bfc00000, data 0x4165bf5/0x437d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 69074944 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 404 handle_osd_map epochs [404,405], i have 404, src has [1,405]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 467877888 unmapped: 68902912 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 405 heartbeat osd_stat(store_statfs(0x19cf66000/0x0/0x1bfc00000, data 0x4dbebf5/0x4d78000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 405 handle_osd_map epochs [405,406], i have 405, src has [1,406]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2780c800 session 0x55bb2850a000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082544 data_alloc: 234881024 data_used: 26394624
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19cf30000/0x0/0x1bfc00000, data 0x4df13c3/0x4dac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19cf30000/0x0/0x1bfc00000, data 0x4df13c3/0x4dac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19cf30000/0x0/0x1bfc00000, data 0x4df13c3/0x4dac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.698586464s of 10.684947014s, submitted: 257
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468967424 unmapped: 67813376 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5243277 data_alloc: 234881024 data_used: 26406912
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29b50800 session 0x55bb2751b860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29b52800 session 0x55bb2a067860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2b7b7800 session 0x55bb2a1610e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2e506800 session 0x55bb296165a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a160d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469311488 unmapped: 75341824 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19bd75000/0x0/0x1bfc00000, data 0x5fad3ec/0x5f69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2780d400 session 0x55bb295b5680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29694c00 session 0x55bb28063a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19bd75000/0x0/0x1bfc00000, data 0x5fad425/0x5f69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469311488 unmapped: 75341824 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29b50800 session 0x55bb295ba960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469139456 unmapped: 75513856 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 406 handle_osd_map epochs [406,407], i have 406, src has [1,407]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 heartbeat osd_stat(store_statfs(0x19bd9a000/0x0/0x1bfc00000, data 0x5f89416/0x5f44000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469155840 unmapped: 75497472 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb29b52800 session 0x55bb2850c5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb2780c800 session 0x55bb294dc000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5209919 data_alloc: 234881024 data_used: 26370048
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb29694c00 session 0x55bb29735860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb29b50800 session 0x55bb295b50e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb2b7b7800 session 0x55bb295b4d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb27903c00 session 0x55bb292d9860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 heartbeat osd_stat(store_statfs(0x19bd71000/0x0/0x1bfc00000, data 0x5faefc7/0x5f6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 handle_osd_map epochs [407,408], i have 407, src has [1,408]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 407 handle_osd_map epochs [408,408], i have 408, src has [1,408]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 408 ms_handle_reset con 0x55bb29694c00 session 0x55bb29e683c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470220800 unmapped: 74432512 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 408 ms_handle_reset con 0x55bb29b50800 session 0x55bb274734a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.688539028s of 10.181116104s, submitted: 137
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 408 handle_osd_map epochs [408,409], i have 408, src has [1,409]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 408 handle_osd_map epochs [409,409], i have 409, src has [1,409]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb2cefc400 session 0x55bb2850a000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472137728 unmapped: 72515584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5147665 data_alloc: 251658240 data_used: 38809600
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29694400 session 0x55bb2850dc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29b53800 session 0x55bb27a22780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472145920 unmapped: 72507392 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb27903c00 session 0x55bb2a23e5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 heartbeat osd_stat(store_statfs(0x19d302000/0x0/0x1bfc00000, data 0x47c294b/0x49dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,3])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 67059712 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29694400 session 0x55bb2a161e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29694c00 session 0x55bb27a16b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473128960 unmapped: 71524352 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473128960 unmapped: 71524352 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5163434 data_alloc: 251658240 data_used: 38809600
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 heartbeat osd_stat(store_statfs(0x19ceb8000/0x0/0x1bfc00000, data 0x4c0c94b/0x4e26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19ceb8000/0x0/0x1bfc00000, data 0x4c0c94b/0x4e26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473169920 unmapped: 71483392 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.688801765s of 10.107757568s, submitted: 117
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 68362240 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5212668 data_alloc: 251658240 data_used: 38866944
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476856320 unmapped: 67796992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c837000/0x0/0x1bfc00000, data 0x52864c2/0x54a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478584832 unmapped: 66068480 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b50800 session 0x55bb2a098b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27903c00 session 0x55bb294f9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694400 session 0x55bb274730e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694c00 session 0x55bb294dbe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c683000/0x0/0x1bfc00000, data 0x54314c2/0x564c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5257436 data_alloc: 251658240 data_used: 39890944
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c671000/0x0/0x1bfc00000, data 0x54524c2/0x566d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5252592 data_alloc: 251658240 data_used: 39927808
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cefc400 session 0x55bb29274960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478478336 unmapped: 66174976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.228658676s of 13.433518410s, submitted: 62
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694400 session 0x55bb2751af00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c671000/0x0/0x1bfc00000, data 0x54524c2/0x566d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5297052 data_alloc: 251658240 data_used: 45432832
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a161a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780d400 session 0x55bb29614b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c63f000/0x0/0x1bfc00000, data 0x54844c2/0x569f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2b55bc00 session 0x55bb2850ba40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d8f0000/0x0/0x1bfc00000, data 0x3b72460/0x3d8c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4972366 data_alloc: 234881024 data_used: 27074560
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d8f0000/0x0/0x1bfc00000, data 0x3b72460/0x3d8c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478314496 unmapped: 66338816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.538283348s of 10.021756172s, submitted: 58
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dbc1000/0x0/0x1bfc00000, data 0x3efb460/0x4115000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478748672 unmapped: 65904640 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dbc1000/0x0/0x1bfc00000, data 0x3efb460/0x4115000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479059968 unmapped: 65593344 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5028850 data_alloc: 234881024 data_used: 27836416
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 64315392 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480772096 unmapped: 63881216 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480772096 unmapped: 63881216 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d7a9000/0x0/0x1bfc00000, data 0x431b460/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480772096 unmapped: 63881216 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5037736 data_alloc: 234881024 data_used: 28106752
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d769000/0x0/0x1bfc00000, data 0x435b460/0x4575000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5038588 data_alloc: 234881024 data_used: 28106752
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.746368408s of 12.203164101s, submitted: 87
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cefc400 session 0x55bb2968e960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27903c00 session 0x55bb27a141e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cefc400 session 0x55bb29351e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d74b000/0x0/0x1bfc00000, data 0x4379460/0x4593000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5039636 data_alloc: 234881024 data_used: 28098560
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480657408 unmapped: 63995904 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480657408 unmapped: 63995904 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480657408 unmapped: 63995904 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d742000/0x0/0x1bfc00000, data 0x4382460/0x459c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694c00 session 0x55bb29351680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb280623c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780c800 session 0x55bb27a223c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4981722 data_alloc: 234881024 data_used: 26042368
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4981722 data_alloc: 234881024 data_used: 26042368
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb36b66800 session 0x55bb282ca3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4981722 data_alloc: 234881024 data_used: 26042368
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb28035400 session 0x55bb29e69680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b52400 session 0x55bb29576f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 22.523767471s of 23.023344040s, submitted: 47
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb2850ba40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b7b400 session 0x55bb2a23e960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27675c00 session 0x55bb27472000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480632832 unmapped: 64020480 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2b55bc00 session 0x55bb2850c000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cd67000 session 0x55bb29537680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27675c00 session 0x55bb294f94a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb27a1be00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b7b400 session 0x55bb282cb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780d400 session 0x55bb29536780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477405184 unmapped: 67248128 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4903504 data_alloc: 234881024 data_used: 20979712
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1cb000/0x0/0x1bfc00000, data 0x38f73db/0x3b0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1cb000/0x0/0x1bfc00000, data 0x38f73db/0x3b0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4904784 data_alloc: 234881024 data_used: 21102592
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1cb000/0x0/0x1bfc00000, data 0x38f73db/0x3b0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2b55bc00 session 0x55bb29e690e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27675c00 session 0x55bb295ba3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a067a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.964398384s of 10.504033089s, submitted: 94
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb296af2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477290496 unmapped: 67362816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477290496 unmapped: 67362816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4911099 data_alloc: 234881024 data_used: 21143552
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1a9000/0x0/0x1bfc00000, data 0x391b40e/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1a9000/0x0/0x1bfc00000, data 0x391b40e/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4966203 data_alloc: 234881024 data_used: 27598848
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1a9000/0x0/0x1bfc00000, data 0x391b40e/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4966683 data_alloc: 234881024 data_used: 27611136
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.378284454s of 11.749645233s, submitted: 24
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479789056 unmapped: 64864256 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5005681 data_alloc: 234881024 data_used: 28753920
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dcf7000/0x0/0x1bfc00000, data 0x3dcd40e/0x3fe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dcf7000/0x0/0x1bfc00000, data 0x3dcd40e/0x3fe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 410 handle_osd_map epochs [410,411], i have 410, src has [1,411]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb26cb9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19dcf7000/0x0/0x1bfc00000, data 0x3dcd40e/0x3fe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29cfd400 session 0x55bb29350f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29304b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb295bb680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb294db0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb27a1a1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb27a170e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5010257 data_alloc: 234881024 data_used: 28770304
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.470820427s of 10.827239037s, submitted: 55
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb2850c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb27a170e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb27a1a1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb294db0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb29304b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19dcf3000/0x0/0x1bfc00000, data 0x3dcf067/0x3fea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d345000/0x0/0x1bfc00000, data 0x477e067/0x4999000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082493 data_alloc: 234881024 data_used: 28770304
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d345000/0x0/0x1bfc00000, data 0x477e067/0x4999000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb26cb9e00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2a067a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb295ba3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb29e690e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb29536780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb282cb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480157696 unmapped: 64495616 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480518144 unmapped: 64135168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5175453 data_alloc: 234881024 data_used: 35057664
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d045000/0x0/0x1bfc00000, data 0x4a7e067/0x4c99000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480518144 unmapped: 64135168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480518144 unmapped: 64135168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb29350b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.939766884s of 11.147185326s, submitted: 13
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb29537c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb294dbc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480534528 unmapped: 64118784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480534528 unmapped: 64118784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb292e6000 session 0x55bb295bb2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb296ba5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480821248 unmapped: 63832064 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5199454 data_alloc: 251658240 data_used: 38203392
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d045000/0x0/0x1bfc00000, data 0x4a7e067/0x4c99000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5237254 data_alloc: 251658240 data_used: 38260736
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19cb96000/0x0/0x1bfc00000, data 0x4f2d067/0x5148000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.031694412s of 10.257151604s, submitted: 56
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5237728 data_alloc: 251658240 data_used: 38326272
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 61358080 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19c1e4000/0x0/0x1bfc00000, data 0x58df067/0x5afa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5333817 data_alloc: 251658240 data_used: 40177664
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 61300736 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda7000/0x0/0x1bfc00000, data 0x590c067/0x5b27000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.384566307s of 10.649941444s, submitted: 69
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5330865 data_alloc: 251658240 data_used: 40177664
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483459072 unmapped: 61194240 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484507648 unmapped: 60145664 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda6000/0x0/0x1bfc00000, data 0x590d067/0x5b28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda6000/0x0/0x1bfc00000, data 0x590d067/0x5b28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484507648 unmapped: 60145664 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb294f94a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb296174a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb292f2d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda6000/0x0/0x1bfc00000, data 0x590d067/0x5b28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483516416 unmapped: 61136896 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5155803 data_alloc: 234881024 data_used: 30330880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483516416 unmapped: 61136896 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19cc04000/0x0/0x1bfc00000, data 0x4aaf067/0x4cca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [0,0,2])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2e506400 session 0x55bb27a22b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2850d0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb2968e3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480935936 unmapped: 63717376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb2968fe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb29305a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480952320 unmapped: 63700992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19c7a1000/0x0/0x1bfc00000, data 0x5116067/0x512d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480952320 unmapped: 63700992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.340438843s of 11.199736595s, submitted: 45
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480952320 unmapped: 63700992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b7b400 session 0x55bb27a22780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb292e7400 session 0x55bb282caf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5218242 data_alloc: 234881024 data_used: 30330880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 69935104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29577680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 69918720 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 69918720 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d21a000/0x0/0x1bfc00000, data 0x469e034/0x46b3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 69910528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 69910528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5087643 data_alloc: 234881024 data_used: 23363584
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 69910528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474808320 unmapped: 69844992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d21a000/0x0/0x1bfc00000, data 0x469e034/0x46b3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb2850a3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474808320 unmapped: 69844992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb2a066b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474816512 unmapped: 69836800 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.243156910s of 10.041037560s, submitted: 49
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb293252c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474824704 unmapped: 69828608 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d1ab000/0x0/0x1bfc00000, data 0x470d057/0x4723000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5128304 data_alloc: 234881024 data_used: 27824128
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb295770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb296ba000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb2a067c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb292e7400 session 0x55bb295b45a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d2d0000/0x0/0x1bfc00000, data 0x45e8057/0x45fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 411 handle_osd_map epochs [411,412], i have 411, src has [1,412]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb29b55000 session 0x55bb293241e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474374144 unmapped: 70279168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5107048 data_alloc: 234881024 data_used: 27828224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 heartbeat osd_stat(store_statfs(0x19d2db000/0x0/0x1bfc00000, data 0x43d7d04/0x45f3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb2cefc400 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb29694400 session 0x55bb274730e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb26ca2800 session 0x55bb295bb0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.464142799s of 10.002035141s, submitted: 91
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 heartbeat osd_stat(store_statfs(0x19cfea000/0x0/0x1bfc00000, data 0x46c8d04/0x48e4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477290496 unmapped: 67362816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5175422 data_alloc: 234881024 data_used: 28540928
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477298688 unmapped: 67354624 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477306880 unmapped: 67346432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb27675c00 session 0x55bb29e68d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb279cbc00 session 0x55bb274732c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 412 handle_osd_map epochs [412,413], i have 412, src has [1,413]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476790784 unmapped: 67862528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19e2e0000/0x0/0x1bfc00000, data 0x4411833/0x462d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476790784 unmapped: 67862528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19e2e0000/0x0/0x1bfc00000, data 0x4411833/0x462d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476798976 unmapped: 67854336 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5051174 data_alloc: 234881024 data_used: 21647360
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476798976 unmapped: 67854336 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb294dc960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29275860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4917660 data_alloc: 218103808 data_used: 16318464
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19f002000/0x0/0x1bfc00000, data 0x36f0810/0x390b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb2a098780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.090511322s of 11.899164200s, submitted: 102
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b2a8800 session 0x55bb282774a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2a066d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb29275a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb27a23a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb27a161e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471883776 unmapped: 72769536 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb27675c00 session 0x55bb294f9860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2751a5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb292f2f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb292f2960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb295ec960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb29577680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471900160 unmapped: 72753152 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4884964 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4884964 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb2a1605a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940164 data_alloc: 234881024 data_used: 20566016
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940164 data_alloc: 234881024 data_used: 20566016
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 21.335626602s of 21.673030853s, submitted: 56
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472883200 unmapped: 71770112 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473751552 unmapped: 70901760 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473628672 unmapped: 71024640 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5052360 data_alloc: 234881024 data_used: 20586496
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475676672 unmapped: 68976640 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d326000/0x0/0x1bfc00000, data 0x4224820/0x4440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474939392 unmapped: 69713920 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474947584 unmapped: 69705728 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2747c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb2a1610e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb296165a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb292e7400 session 0x55bb282ca5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb26cb9a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2747eb40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb295770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474193920 unmapped: 74661888 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb2850c5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb27903000 session 0x55bb295b52c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474243072 unmapped: 74612736 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5174273 data_alloc: 234881024 data_used: 22171648
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474243072 unmapped: 74612736 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c4b6000/0x0/0x1bfc00000, data 0x5091891/0x52af000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474243072 unmapped: 74612736 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474251264 unmapped: 74604544 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb29274780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474251264 unmapped: 74604544 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2747d0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.416330338s of 12.273886681s, submitted: 157
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474251264 unmapped: 74604544 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb2850c1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5174289 data_alloc: 234881024 data_used: 22171648
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb29351a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 74670080 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c4bd000/0x0/0x1bfc00000, data 0x50918c4/0x52b1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476332032 unmapped: 72523776 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478511104 unmapped: 70344704 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478511104 unmapped: 70344704 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb295372c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb282cb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478519296 unmapped: 70336512 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026371 data_alloc: 234881024 data_used: 27156480
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb29304b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19dcea000/0x0/0x1bfc00000, data 0x38658b4/0x3a84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026371 data_alloc: 234881024 data_used: 27156480
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19dcea000/0x0/0x1bfc00000, data 0x38658b4/0x3a84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19dcea000/0x0/0x1bfc00000, data 0x38658b4/0x3a84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.068428993s of 12.367558479s, submitted: 35
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481763328 unmapped: 67092480 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d89e000/0x0/0x1bfc00000, data 0x3cb18b4/0x3ed0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477626368 unmapped: 71229440 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 71221248 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5087449 data_alloc: 234881024 data_used: 27910144
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c46f000/0x0/0x1bfc00000, data 0x3f328b4/0x4151000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092981 data_alloc: 234881024 data_used: 28061696
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479715328 unmapped: 69140480 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c47a000/0x0/0x1bfc00000, data 0x3f358b4/0x4154000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479723520 unmapped: 69132288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c47a000/0x0/0x1bfc00000, data 0x3f358b4/0x4154000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479723520 unmapped: 69132288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479723520 unmapped: 69132288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.518205643s of 12.845505714s, submitted: 98
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb293045a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb36b66000 session 0x55bb29351680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5084297 data_alloc: 234881024 data_used: 28065792
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5833/0x2ce1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2a23f4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.286748886s of 24.817827225s, submitted: 52
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb2747c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475627520 unmapped: 73228288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb292f2960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb295b45a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859977 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb36b66000 session 0x55bb29325680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb27a22b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475627520 unmapped: 73228288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475635712 unmapped: 73220096 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475635712 unmapped: 73220096 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475717632 unmapped: 73138176 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d582000/0x0/0x1bfc00000, data 0x2e31810/0x304c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475652096 unmapped: 73203712 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4860642 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478961664 unmapped: 69894144 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e68960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb282ca5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb29735680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb36b66000 session 0x55bb294df860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb293050e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1fc000/0x0/0x1bfc00000, data 0x31b7810/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4894676 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.479883194s of 12.613527298s, submitted: 30
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1fc000/0x0/0x1bfc00000, data 0x31b7810/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a098b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb293512c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1fc000/0x0/0x1bfc00000, data 0x31b7810/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475996160 unmapped: 72859648 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4899001 data_alloc: 218103808 data_used: 12861440
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475996160 unmapped: 72859648 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940761 data_alloc: 218103808 data_used: 18632704
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940761 data_alloc: 218103808 data_used: 18632704
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.086318970s of 14.066018105s, submitted: 8
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479961088 unmapped: 68894720 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481017856 unmapped: 67837952 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 67772416 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5056003 data_alloc: 234881024 data_used: 19955712
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c49e000/0x0/0x1bfc00000, data 0x3f14810/0x412f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c49e000/0x0/0x1bfc00000, data 0x3f14810/0x412f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5056179 data_alloc: 234881024 data_used: 19955712
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c480000/0x0/0x1bfc00000, data 0x3f33810/0x414e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c480000/0x0/0x1bfc00000, data 0x3f33810/0x414e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.410203934s of 13.434028625s, submitted: 118
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5056143 data_alloc: 234881024 data_used: 19963904
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2850a5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480608256 unmapped: 68247552 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 413 handle_osd_map epochs [413,414], i have 413, src has [1,414]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb27883400 session 0x55bb29537680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb2b8da000 session 0x55bb2850cd20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb2a142400 session 0x55bb296ba000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486924288 unmapped: 61931520 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb2780d400 session 0x55bb296150e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485277696 unmapped: 70844416 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 414 handle_osd_map epochs [414,415], i have 414, src has [1,415]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb27883400 session 0x55bb27a22f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 415 heartbeat osd_stat(store_statfs(0x19b45e000/0x0/0x1bfc00000, data 0x4f524db/0x5170000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486375424 unmapped: 69746688 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5223021 data_alloc: 234881024 data_used: 26726400
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 415 handle_osd_map epochs [416,416], i have 416, src has [1,416]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb29694400 session 0x55bb295bb0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 heartbeat osd_stat(store_statfs(0x19b45a000/0x0/0x1bfc00000, data 0x4f54188/0x5173000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486383616 unmapped: 69738496 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb29614b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486391808 unmapped: 69730304 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb293043c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2780d400 session 0x55bb280623c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483475456 unmapped: 72646656 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb27883400 session 0x55bb27473680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2a142400 session 0x55bb2a067860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a160f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 heartbeat osd_stat(store_statfs(0x19b453000/0x0/0x1bfc00000, data 0x4f59e5f/0x517b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb27883400 session 0x55bb27a1ad20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5284242 data_alloc: 234881024 data_used: 26730496
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 heartbeat osd_stat(store_statfs(0x19ace2000/0x0/0x1bfc00000, data 0x56c9ec1/0x58ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.696437836s of 12.454614639s, submitted: 122
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19ace2000/0x0/0x1bfc00000, data 0x56c9ec1/0x58ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483614720 unmapped: 72507392 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb2747f680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483614720 unmapped: 72507392 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b8da000 session 0x55bb295bbe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2850c3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb294f9860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acde000/0x0/0x1bfc00000, data 0x56cba00/0x58ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb27883400 session 0x55bb293252c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb29614b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483631104 unmapped: 72491008 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5287136 data_alloc: 234881024 data_used: 26738688
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b8da000 session 0x55bb295bb0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483631104 unmapped: 72491008 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a22f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdf000/0x0/0x1bfc00000, data 0x56cba00/0x58ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb27883400 session 0x55bb296ba000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483639296 unmapped: 72482816 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483762176 unmapped: 72359936 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb32c02800 session 0x55bb2850b860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2a861400 session 0x55bb2a067a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 71737344 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb27902000 session 0x55bb297354a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb28062d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 71737344 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5342782 data_alloc: 234881024 data_used: 33411072
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 71737344 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 70574080 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5406978 data_alloc: 251658240 data_used: 42504192
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.194473267s of 13.459496498s, submitted: 34
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486047744 unmapped: 70074368 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486047744 unmapped: 70074368 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489259008 unmapped: 66863104 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 66715648 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5483936 data_alloc: 251658240 data_used: 43495424
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489594880 unmapped: 66527232 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 66519040 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489611264 unmapped: 66510848 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19a47a000/0x0/0x1bfc00000, data 0x5f25a43/0x614c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19a47a000/0x0/0x1bfc00000, data 0x5f25a43/0x614c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5507051 data_alloc: 251658240 data_used: 45670400
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.791836739s of 10.737925529s, submitted: 109
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19a47a000/0x0/0x1bfc00000, data 0x5f25a43/0x614c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb293050e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb295ecf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489938944 unmapped: 66183168 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489963520 unmapped: 66158592 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb32c02800 session 0x55bb27a23a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5321874 data_alloc: 251658240 data_used: 38035456
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19b44d000/0x0/0x1bfc00000, data 0x4f5b9ae/0x517f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19b44d000/0x0/0x1bfc00000, data 0x4f5b9ae/0x517f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb2a067c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb36b66000 session 0x55bb27a22960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489988096 unmapped: 66134016 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19b44d000/0x0/0x1bfc00000, data 0x4f5b9ae/0x517f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5193660 data_alloc: 234881024 data_used: 33263616
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19c02a000/0x0/0x1bfc00000, data 0x43809ae/0x45a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5193660 data_alloc: 234881024 data_used: 33263616
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19c02a000/0x0/0x1bfc00000, data 0x43809ae/0x45a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19c02a000/0x0/0x1bfc00000, data 0x43809ae/0x45a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5194940 data_alloc: 234881024 data_used: 33316864
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 67K writes, 269K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.04 MB/s#012Cumulative WAL: 67K writes, 24K syncs, 2.73 writes per sync, written: 0.26 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6455 writes, 25K keys, 6455 commit groups, 1.0 writes per commit group, ingest: 27.90 MB, 0.05 MB/s#012Interval WAL: 6456 writes, 2525 syncs, 2.56 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb295b4f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.139095306s of 20.152223587s, submitted: 103
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 417 handle_osd_map epochs [418,418], i have 418, src has [1,418]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb282761e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb2a143000 session 0x55bb2850dc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb32c02800 session 0x55bb274723c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490045440 unmapped: 66076672 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb2780d400 session 0x55bb294de960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497688576 unmapped: 71041024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 419 ms_handle_reset con 0x55bb29694400 session 0x55bb29e68f00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497737728 unmapped: 70991872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497819648 unmapped: 70909952 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5467556 data_alloc: 251658240 data_used: 38674432
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 420 heartbeat osd_stat(store_statfs(0x19a05e000/0x0/0x1bfc00000, data 0x63492b4/0x656f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497827840 unmapped: 70901760 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494247936 unmapped: 74481664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb27a221e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494247936 unmapped: 74481664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb27883400 session 0x55bb27a141e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb2a861400 session 0x55bb296165a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494272512 unmapped: 74457088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494272512 unmapped: 74457088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5236952 data_alloc: 251658240 data_used: 38674432
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 421 heartbeat osd_stat(store_statfs(0x19c01f000/0x0/0x1bfc00000, data 0x4387be2/0x45af000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494272512 unmapped: 74457088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.853560448s of 10.686148643s, submitted: 111
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 495321088 unmapped: 73408512 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 495321088 unmapped: 73408512 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494297088 unmapped: 74432512 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb282ca780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb32c02800 session 0x55bb2a066b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb36b66000 session 0x55bb294daf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2780d400 session 0x55bb296bbc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb29694400 session 0x55bb2a067a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2a861400 session 0x55bb294db4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482951168 unmapped: 85778432 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5103466 data_alloc: 234881024 data_used: 21983232
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb29e5b000 session 0x55bb294f94a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb36b67000 session 0x55bb295ec960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482951168 unmapped: 85778432 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 heartbeat osd_stat(store_statfs(0x19c53c000/0x0/0x1bfc00000, data 0x3e6a316/0x4091000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2780d400 session 0x55bb295ba3c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5007400 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 424 heartbeat osd_stat(store_statfs(0x19cde3000/0x0/0x1bfc00000, data 0x35c0e71/0x37e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.565454483s of 10.000356674s, submitted: 123
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 97075200 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 97075200 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb294f9a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 97075200 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471490560 unmapped: 97239040 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5013770 data_alloc: 218103808 data_used: 13463552
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cde1000/0x0/0x1bfc00000, data 0x35c29b0/0x37ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5089290 data_alloc: 234881024 data_used: 24158208
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cde1000/0x0/0x1bfc00000, data 0x35c29b0/0x37ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5089290 data_alloc: 234881024 data_used: 24158208
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.364485741s of 13.372555733s, submitted: 14
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473571328 unmapped: 95158272 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c49c000/0x0/0x1bfc00000, data 0x3af29b0/0x3d1c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474390528 unmapped: 94339072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c40e000/0x0/0x1bfc00000, data 0x3b789b0/0x3da2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5152986 data_alloc: 234881024 data_used: 25112576
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c40e000/0x0/0x1bfc00000, data 0x3b789b0/0x3da2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5145886 data_alloc: 234881024 data_used: 25112576
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c3fb000/0x0/0x1bfc00000, data 0x3b999b0/0x3dc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5146206 data_alloc: 234881024 data_used: 25120768
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.126967430s of 15.535551071s, submitted: 75
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2850c780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb2a23fe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2ccf1000 session 0x55bb297343c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a067c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1af00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb27473c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb295eda40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29576000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e69860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdc1000/0x0/0x1bfc00000, data 0x41d29c0/0x43fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5201202 data_alloc: 234881024 data_used: 25120768
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbe000/0x0/0x1bfc00000, data 0x41d59c0/0x4400000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2751b680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb29536780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5202004 data_alloc: 234881024 data_used: 25120768
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29cfc000 session 0x55bb294dc1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477601792 unmapped: 91127808 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5241699 data_alloc: 234881024 data_used: 29421568
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.085439682s of 17.923780441s, submitted: 24
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5242859 data_alloc: 234881024 data_used: 29425664
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdb7000/0x0/0x1bfc00000, data 0x41db9e3/0x4407000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479543296 unmapped: 89186304 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480960512 unmapped: 87769088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5289261 data_alloc: 234881024 data_used: 29421568
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19b77c000/0x0/0x1bfc00000, data 0x48169e3/0x4a42000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482779136 unmapped: 85950464 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482893824 unmapped: 85835776 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482820096 unmapped: 85909504 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 5.414937973s of 10.002117157s, submitted: 100
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5304011 data_alloc: 234881024 data_used: 29417472
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19b5be000/0x0/0x1bfc00000, data 0x49cc9e3/0x4bf8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19b5c3000/0x0/0x1bfc00000, data 0x49cf9e3/0x4bfb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a17860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb292743c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5144857 data_alloc: 234881024 data_used: 22913024
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c3e2000/0x0/0x1bfc00000, data 0x3bb19d3/0x3ddc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb27a230e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 6.416720390s of 11.380039215s, submitted: 49
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5144084 data_alloc: 234881024 data_used: 22908928
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29274b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2a861400 session 0x55bb294df2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483155968 unmapped: 85573632 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb294dc5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483205120 unmapped: 85524480 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,2])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483262464 unmapped: 85467136 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483262464 unmapped: 85467136 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934149 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483311616 unmapped: 85417984 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483311616 unmapped: 85417984 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483328000 unmapped: 85401600 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483328000 unmapped: 85401600 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483328000 unmapped: 85401600 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483377152 unmapped: 85352448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483377152 unmapped: 85352448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb27a23c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb296ba5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb282770e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb29274b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 105.291740417s of 109.832557678s, submitted: 273
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483426304 unmapped: 85303296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb282ca000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb2a161860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb294f92c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb296ae960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2d87d000 session 0x55bb282cb860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5024620 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483926016 unmapped: 84803584 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483926016 unmapped: 84803584 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483934208 unmapped: 84795392 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483999744 unmapped: 84729856 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5024620 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a1601e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb2968fa40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2968fe00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2a23ef00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484368384 unmapped: 84361216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484212736 unmapped: 84516864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104070 data_alloc: 234881024 data_used: 23773184
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104070 data_alloc: 234881024 data_used: 23773184
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.882793427s of 20.041955948s, submitted: 38
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486875136 unmapped: 81854464 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5205218 data_alloc: 234881024 data_used: 25612288
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491757568 unmapped: 76972032 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bfa5000/0x0/0x1bfc00000, data 0x3fe8a12/0x4213000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5202962 data_alloc: 234881024 data_used: 25657344
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bfa5000/0x0/0x1bfc00000, data 0x3fe8a12/0x4213000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5201846 data_alloc: 234881024 data_used: 25669632
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bfa8000/0x0/0x1bfc00000, data 0x3feba12/0x4216000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491552768 unmapped: 77176832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.885244370s of 13.422520638s, submitted: 137
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb3dc9cc00 session 0x55bb27a16d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32709000 session 0x55bb2850c000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491552768 unmapped: 77176832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb295763c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 41.659423828s of 42.354778290s, submitted: 29
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb295baf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4947574 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4b9000/0x0/0x1bfc00000, data 0x2adaa12/0x2d05000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2751b4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb280630e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a23e5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb280621e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb295ed4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32709000 session 0x55bb294df2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb298fcc00 session 0x55bb294dc5a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb294df860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1be00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5010242 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486481920 unmapped: 82247680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486490112 unmapped: 82239488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5010242 data_alloc: 218103808 data_used: 12922880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486506496 unmapped: 82223104 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486506496 unmapped: 82223104 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29617c20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486506496 unmapped: 82223104 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486301696 unmapped: 82427904 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5050882 data_alloc: 218103808 data_used: 18731008
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5050882 data_alloc: 218103808 data_used: 18731008
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.477214813s of 24.551872253s, submitted: 19
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486932480 unmapped: 81797120 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5122440 data_alloc: 218103808 data_used: 18755584
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490283008 unmapped: 78446592 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490610688 unmapped: 78118912 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c38c000/0x0/0x1bfc00000, data 0x3c01a12/0x3e2c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490610688 unmapped: 78118912 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2dd000/0x0/0x1bfc00000, data 0x3caea12/0x3ed9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5151356 data_alloc: 234881024 data_used: 19578880
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2dd000/0x0/0x1bfc00000, data 0x3caea12/0x3ed9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5147488 data_alloc: 234881024 data_used: 19587072
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2c1000/0x0/0x1bfc00000, data 0x3cd2a12/0x3efd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.734907150s of 13.229278564s, submitted: 108
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490708992 unmapped: 78020608 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29cfc400 session 0x55bb2a066960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490708992 unmapped: 78020608 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2aa000/0x0/0x1bfc00000, data 0x3ce8a12/0x3f13000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5154359 data_alloc: 234881024 data_used: 19595264
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490717184 unmapped: 78012416 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 425 handle_osd_map epochs [426,426], i have 426, src has [1,426]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 426 ms_handle_reset con 0x55bb30ef5800 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490594304 unmapped: 78135296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490594304 unmapped: 78135296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 426 ms_handle_reset con 0x55bb2780d400 session 0x55bb293241e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490594304 unmapped: 78135296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 426 heartbeat osd_stat(store_statfs(0x19c2a6000/0x0/0x1bfc00000, data 0x3cea6cd/0x3f17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb29694400 session 0x55bb2a161a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb29e5b000 session 0x55bb27a23a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb29cfc400 session 0x55bb2747fc20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 509476864 unmapped: 63234048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb2978a400 session 0x55bb27a22960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5438017 data_alloc: 234881024 data_used: 31862784
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498819072 unmapped: 73891840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 427 heartbeat osd_stat(store_statfs(0x19a8c1000/0x0/0x1bfc00000, data 0x5895388/0x58fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 428 ms_handle_reset con 0x55bb2780d400 session 0x55bb294de1e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498843648 unmapped: 73867264 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29694400 session 0x55bb2850a780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498860032 unmapped: 73850880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 heartbeat osd_stat(store_statfs(0x19a8b9000/0x0/0x1bfc00000, data 0x56d1caa/0x5903000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.446578026s of 10.657642365s, submitted: 129
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29cfc400 session 0x55bb27472000
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29734b40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb3dc9e000 session 0x55bb293054a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 493346816 unmapped: 79364096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29694400 session 0x55bb2850ad20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb2780d400 session 0x55bb29305680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 493346816 unmapped: 79364096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5424483 data_alloc: 234881024 data_used: 31879168
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 493346816 unmapped: 79364096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494403584 unmapped: 78307328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494403584 unmapped: 78307328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494403584 unmapped: 78307328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb29617a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2751ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb32709400 session 0x55bb29537680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb29617860
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb2a0670e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1ab40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb292f21e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb27873800 session 0x55bb27a1be00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb294df2c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5428305 data_alloc: 234881024 data_used: 31887360
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494419968 unmapped: 78290944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494419968 unmapped: 78290944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5428305 data_alloc: 234881024 data_used: 31887360
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29694400 session 0x55bb2751b4a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb295baf00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb295763c0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.790133476s of 13.918015480s, submitted: 28
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5a400 session 0x55bb2a23ef00
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5501254 data_alloc: 251658240 data_used: 42024960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494452736 unmapped: 78258176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d387e/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496705536 unmapped: 76005376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5534806 data_alloc: 251658240 data_used: 46772224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b4000/0x0/0x1bfc00000, data 0x56d387e/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5535606 data_alloc: 251658240 data_used: 46792704
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.097463608s of 13.780971527s, submitted: 6
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498991104 unmapped: 73719808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499466240 unmapped: 73244672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499466240 unmapped: 73244672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499466240 unmapped: 73244672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5590520 data_alloc: 251658240 data_used: 51654656
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499474432 unmapped: 73236480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499474432 unmapped: 73236480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5591224 data_alloc: 251658240 data_used: 51654656
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.348139763s of 11.720171928s, submitted: 7
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5590872 data_alloc: 251658240 data_used: 51654656
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504668160 unmapped: 68042752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504676352 unmapped: 68034560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5614032 data_alloc: 251658240 data_used: 53387264
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5614032 data_alloc: 251658240 data_used: 53387264
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.533994675s of 15.573607445s, submitted: 7
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5638706 data_alloc: 251658240 data_used: 54968320
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 68247552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb293050e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 68247552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a20a000/0x0/0x1bfc00000, data 0x5d7da7e/0x5fb4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb2968fa40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29694400 session 0x55bb29304780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5595832 data_alloc: 251658240 data_used: 54308864
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29e694a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a6b3000/0x0/0x1bfc00000, data 0x58d4a5b/0x5b0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a6b3000/0x0/0x1bfc00000, data 0x58d4a5b/0x5b0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5595136 data_alloc: 251658240 data_used: 54304768
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.704790115s of 13.284405708s, submitted: 34
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb32709800 session 0x55bb29577a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a6b4000/0x0/0x1bfc00000, data 0x58d4a5b/0x5b0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e68d20
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 handle_osd_map epochs [430,431], i have 430, src has [1,431]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 430 handle_osd_map epochs [431,431], i have 431, src has [1,431]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5583026 data_alloc: 251658240 data_used: 54300672
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 431 heartbeat osd_stat(store_statfs(0x19a8b4000/0x0/0x1bfc00000, data 0x56d5496/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 431 heartbeat osd_stat(store_statfs(0x19a8b4000/0x0/0x1bfc00000, data 0x56d5496/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5581986 data_alloc: 251658240 data_used: 54300672
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 4.344113350s of 10.144760132s, submitted: 94
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 431 ms_handle_reset con 0x55bb29694400 session 0x55bb280625a0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 431 ms_handle_reset con 0x55bb29cfc400 session 0x55bb294dc960
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 431 heartbeat osd_stat(store_statfs(0x19c294000/0x0/0x1bfc00000, data 0x3cf6434/0x3f29000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 431 handle_osd_map epochs [432,432], i have 432, src has [1,432]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 432 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2a23e780
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5244005 data_alloc: 234881024 data_used: 31895552
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 433 heartbeat osd_stat(store_statfs(0x19c28e000/0x0/0x1bfc00000, data 0x3cf9bf6/0x3f2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 433 ms_handle_reset con 0x55bb32709000 session 0x55bb2751b0e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 433 ms_handle_reset con 0x55bb298fc000 session 0x55bb274730e0
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 433 heartbeat osd_stat(store_statfs(0x19c28e000/0x0/0x1bfc00000, data 0x3cf9bf6/0x3f2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 433 ms_handle_reset con 0x55bb2780d400 session 0x55bb295ed680
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5012605 data_alloc: 218103808 data_used: 12972032
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 433 handle_osd_map epochs [433,434], i have 433, src has [1,434]
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.993594170s of 11.322154045s, submitted: 67
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 ms_handle_reset con 0x55bb29694400 session 0x55bb29577a40
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea735/0x2d20000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491233280 unmapped: 81477632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491233280 unmapped: 81477632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491241472 unmapped: 81469440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491241472 unmapped: 81469440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491249664 unmapped: 81461248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491249664 unmapped: 81461248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491249664 unmapped: 81461248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491298816 unmapped: 81412096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491298816 unmapped: 81412096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491307008 unmapped: 81403904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491307008 unmapped: 81403904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491307008 unmapped: 81403904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491315200 unmapped: 81395712 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491315200 unmapped: 81395712 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491315200 unmapped: 81395712 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491331584 unmapped: 81379328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'config diff' '{prefix=config diff}'
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'config show' '{prefix=config show}'
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'counter dump' '{prefix=counter dump}'
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'counter schema' '{prefix=counter schema}'
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490782720 unmapped: 81928192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490848256 unmapped: 81862656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:20:22 np0005603622 ceph-osd[79107]: do_command 'log dump' '{prefix=log dump}'
Jan 31 04:20:22 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:20:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:20:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1525721526' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:20:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:20:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1533309545' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:20:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 31 04:20:23 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1953960211' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 31 04:20:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:24.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:20:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4097509831' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:20:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:24.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:24 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 31 04:20:24 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/334653811' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 31 04:20:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/220825936' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 31 04:20:26 np0005603622 nova_compute[221301]: 2026-01-31 09:20:26.042 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:26 np0005603622 podman[313604]: 2026-01-31 09:20:26.136231111 +0000 UTC m=+0.054561374 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 31 04:20:26 np0005603622 podman[313603]: 2026-01-31 09:20:26.181599412 +0000 UTC m=+0.099794061 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 31 04:20:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:26.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1751601628' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 31 04:20:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:26.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/919153167' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3959832848' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:27 np0005603622 nova_compute[221301]: 2026-01-31 09:20:27.061 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:27 np0005603622 systemd[1]: Starting Hostname Service...
Jan 31 04:20:27 np0005603622 systemd[1]: Started Hostname Service.
Jan 31 04:20:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 31 04:20:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1841236354' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 31 04:20:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 31 04:20:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/755621859' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 31 04:20:27 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 31 04:20:27 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2068886668' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1855803881' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4225136205' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 31 04:20:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:28.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1817712555' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2072882409' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 31 04:20:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:28.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 31 04:20:28 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4087787616' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/735463293' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/49463125' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 31 04:20:29 np0005603622 nova_compute[221301]: 2026-01-31 09:20:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:29 np0005603622 nova_compute[221301]: 2026-01-31 09:20:29.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:29 np0005603622 nova_compute[221301]: 2026-01-31 09:20:29.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2373330130' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:20:29 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:20:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:30.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:30.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 31 04:20:30 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/227318566' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 31 04:20:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:20:30.938 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:20:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:20:30.939 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:20:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:20:30.939 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:20:31 np0005603622 nova_compute[221301]: 2026-01-31 09:20:31.046 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:31 np0005603622 nova_compute[221301]: 2026-01-31 09:20:31.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 31 04:20:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1263227450' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 31 04:20:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:20:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3665823025' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:20:32 np0005603622 nova_compute[221301]: 2026-01-31 09:20:32.064 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:32.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 31 04:20:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2757924827' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 31 04:20:32 np0005603622 nova_compute[221301]: 2026-01-31 09:20:32.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:32 np0005603622 nova_compute[221301]: 2026-01-31 09:20:32.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:20:32 np0005603622 nova_compute[221301]: 2026-01-31 09:20:32.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:20:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:32.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:32 np0005603622 nova_compute[221301]: 2026-01-31 09:20:32.836 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:20:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:20:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 31 04:20:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4085721648' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 31 04:20:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:34.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Jan 31 04:20:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1115122449' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 31 04:20:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:34.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0) v1
Jan 31 04:20:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1378728980' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 31 04:20:35 np0005603622 nova_compute[221301]: 2026-01-31 09:20:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:35 np0005603622 nova_compute[221301]: 2026-01-31 09:20:35.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0) v1
Jan 31 04:20:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/403622529' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 31 04:20:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:36 np0005603622 nova_compute[221301]: 2026-01-31 09:20:36.047 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Jan 31 04:20:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4002523848' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 31 04:20:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:36.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:36.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0) v1
Jan 31 04:20:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1869538934' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 31 04:20:37 np0005603622 nova_compute[221301]: 2026-01-31 09:20:37.066 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:37 np0005603622 nova_compute[221301]: 2026-01-31 09:20:37.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:37 np0005603622 nova_compute[221301]: 2026-01-31 09:20:37.573 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:20:37 np0005603622 nova_compute[221301]: 2026-01-31 09:20:37.573 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:20:37 np0005603622 nova_compute[221301]: 2026-01-31 09:20:37.574 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:20:37 np0005603622 nova_compute[221301]: 2026-01-31 09:20:37.574 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:20:37 np0005603622 nova_compute[221301]: 2026-01-31 09:20:37.574 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:20:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:20:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2454760043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:20:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0) v1
Jan 31 04:20:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2331179702' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 31 04:20:38 np0005603622 nova_compute[221301]: 2026-01-31 09:20:38.083 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:20:38 np0005603622 nova_compute[221301]: 2026-01-31 09:20:38.246 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:20:38 np0005603622 nova_compute[221301]: 2026-01-31 09:20:38.248 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3954MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:20:38 np0005603622 nova_compute[221301]: 2026-01-31 09:20:38.248 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:20:38 np0005603622 nova_compute[221301]: 2026-01-31 09:20:38.249 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:20:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:38.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:38.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:38.997 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:38.997 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:39.094 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:20:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0) v1
Jan 31 04:20:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/282863672' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 31 04:20:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:20:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3833836867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:39.551 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:39.560 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:20:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0) v1
Jan 31 04:20:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1652385625' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:39.593 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:39.596 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:20:39 np0005603622 nova_compute[221301]: 2026-01-31 09:20:39.596 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:20:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:40.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:40 np0005603622 ovs-appctl[316244]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 04:20:40 np0005603622 ovs-appctl[316267]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 04:20:40 np0005603622 ovs-appctl[316271]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 31 04:20:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0) v1
Jan 31 04:20:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4121370474' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 31 04:20:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:40.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0) v1
Jan 31 04:20:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2839349942' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 31 04:20:41 np0005603622 nova_compute[221301]: 2026-01-31 09:20:41.049 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:42 np0005603622 nova_compute[221301]: 2026-01-31 09:20:42.068 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 31 04:20:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/925282534' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 31 04:20:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:42.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0) v1
Jan 31 04:20:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/711731617' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 31 04:20:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:42.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/431835934' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #199. Immutable memtables: 0.
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.338145) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 127] Flushing memtable with next log file: 199
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243338191, "job": 127, "event": "flush_started", "num_memtables": 1, "num_entries": 1066, "num_deletes": 255, "total_data_size": 1617282, "memory_usage": 1644560, "flush_reason": "Manual Compaction"}
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 127] Level-0 flush table #200: started
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243392765, "cf_name": "default", "job": 127, "event": "table_file_creation", "file_number": 200, "file_size": 1065671, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95689, "largest_seqno": 96750, "table_properties": {"data_size": 1060077, "index_size": 2604, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 16513, "raw_average_key_size": 22, "raw_value_size": 1047493, "raw_average_value_size": 1409, "num_data_blocks": 113, "num_entries": 743, "num_filter_entries": 743, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851198, "oldest_key_time": 1769851198, "file_creation_time": 1769851243, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 200, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 127] Flush lasted 54662 microseconds, and 3584 cpu microseconds.
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.392806) [db/flush_job.cc:967] [default] [JOB 127] Level-0 flush table #200: 1065671 bytes OK
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.392823) [db/memtable_list.cc:519] [default] Level-0 commit table #200 started
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.471496) [db/memtable_list.cc:722] [default] Level-0 commit table #200: memtable #1 done
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.471545) EVENT_LOG_v1 {"time_micros": 1769851243471534, "job": 127, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.471573) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 127] Try to delete WAL files size 1611113, prev total WAL file size 1611113, number of live WAL files 2.
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000196.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.472212) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373636' seq:72057594037927935, type:22 .. '6C6F676D0034303137' seq:0, type:0; will stop at (end)
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 128] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 127 Base level 0, inputs: [200(1040KB)], [198(10MB)]
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243472313, "job": 128, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [200], "files_L6": [198], "score": -1, "input_data_size": 12551832, "oldest_snapshot_seqno": -1}
Jan 31 04:20:43 np0005603622 nova_compute[221301]: 2026-01-31 09:20:43.597 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:43 np0005603622 nova_compute[221301]: 2026-01-31 09:20:43.598 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 128] Generated table #201: 11529 keys, 12427200 bytes, temperature: kUnknown
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243701416, "cf_name": "default", "job": 128, "event": "table_file_creation", "file_number": 201, "file_size": 12427200, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12355928, "index_size": 41335, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 306579, "raw_average_key_size": 26, "raw_value_size": 12157968, "raw_average_value_size": 1054, "num_data_blocks": 1557, "num_entries": 11529, "num_filter_entries": 11529, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851243, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 201, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.701993) [db/compaction/compaction_job.cc:1663] [default] [JOB 128] Compacted 1@0 + 1@6 files to L6 => 12427200 bytes
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.731954) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 54.7 rd, 54.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 11.0 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(23.4) write-amplify(11.7) OK, records in: 12048, records dropped: 519 output_compression: NoCompression
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.732032) EVENT_LOG_v1 {"time_micros": 1769851243732004, "job": 128, "event": "compaction_finished", "compaction_time_micros": 229407, "compaction_time_cpu_micros": 48728, "output_level": 6, "num_output_files": 1, "total_output_size": 12427200, "num_input_records": 12048, "num_output_records": 11529, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000200.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243732483, "job": 128, "event": "table_file_deletion", "file_number": 200}
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000198.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851243734009, "job": 128, "event": "table_file_deletion", "file_number": 198}
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.472107) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.734133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.734145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.734149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.734153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:20:43.734156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:20:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1434397084' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:20:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:44.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:44.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0) v1
Jan 31 04:20:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2889119995' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 31 04:20:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3129125868' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:46 np0005603622 nova_compute[221301]: 2026-01-31 09:20:46.052 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:46.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0) v1
Jan 31 04:20:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/336307917' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 31 04:20:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:46.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/18908044' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:47 np0005603622 nova_compute[221301]: 2026-01-31 09:20:47.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0) v1
Jan 31 04:20:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3980837095' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 31 04:20:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2596859608' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:48.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0) v1
Jan 31 04:20:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/322969753' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 31 04:20:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:48.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0) v1
Jan 31 04:20:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/381343092' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 31 04:20:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1441126239' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:20:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:50.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:20:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:51 np0005603622 nova_compute[221301]: 2026-01-31 09:20:51.054 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:20:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1429005822' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:20:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0) v1
Jan 31 04:20:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1678619059' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 31 04:20:52 np0005603622 nova_compute[221301]: 2026-01-31 09:20:52.073 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:52.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:52 np0005603622 systemd[1]: Starting Time & Date Service...
Jan 31 04:20:52 np0005603622 systemd[1]: Started Time & Date Service.
Jan 31 04:20:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:52 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:52 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1110611078' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:52 np0005603622 virtqemud[220925]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 04:20:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0) v1
Jan 31 04:20:53 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1364734139' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 31 04:20:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:54.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:20:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:20:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:20:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:20:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:20:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:20:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:20:56 np0005603622 nova_compute[221301]: 2026-01-31 09:20:56.058 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:56.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:56.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:57 np0005603622 nova_compute[221301]: 2026-01-31 09:20:57.075 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:20:57 np0005603622 podman[318409]: 2026-01-31 09:20:57.129947216 +0000 UTC m=+0.051072692 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:20:57 np0005603622 podman[318408]: 2026-01-31 09:20:57.162013174 +0000 UTC m=+0.083364456 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 31 04:20:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:20:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:20:58.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:20:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:20:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:20:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:20:58.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:00.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:00.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:01 np0005603622 nova_compute[221301]: 2026-01-31 09:21:01.060 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:01 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:21:01 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:21:02 np0005603622 nova_compute[221301]: 2026-01-31 09:21:02.078 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:02.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:02.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:04.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:04.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:06 np0005603622 nova_compute[221301]: 2026-01-31 09:21:06.066 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:06.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:07 np0005603622 nova_compute[221301]: 2026-01-31 09:21:07.082 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:08.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:08.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:10.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:10.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:11 np0005603622 nova_compute[221301]: 2026-01-31 09:21:11.069 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:12 np0005603622 nova_compute[221301]: 2026-01-31 09:21:12.085 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:12.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:12.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:14.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:14.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:16 np0005603622 nova_compute[221301]: 2026-01-31 09:21:16.071 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:16.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:16.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:17 np0005603622 nova_compute[221301]: 2026-01-31 09:21:17.087 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:18.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:18.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:20.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:20.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:21 np0005603622 nova_compute[221301]: 2026-01-31 09:21:21.074 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:22 np0005603622 nova_compute[221301]: 2026-01-31 09:21:22.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:22.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:22 np0005603622 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 31 04:21:22 np0005603622 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 31 04:21:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:22.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:24.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:21:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:24.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:21:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:26 np0005603622 nova_compute[221301]: 2026-01-31 09:21:26.077 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:26.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:26.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:27 np0005603622 nova_compute[221301]: 2026-01-31 09:21:27.093 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:27 np0005603622 podman[318507]: 2026-01-31 09:21:27.248130531 +0000 UTC m=+0.080008458 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Jan 31 04:21:27 np0005603622 podman[318513]: 2026-01-31 09:21:27.311413335 +0000 UTC m=+0.104538327 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:21:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:28.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:28.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:21:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:30.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:21:30 np0005603622 nova_compute[221301]: 2026-01-31 09:21:30.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:30 np0005603622 nova_compute[221301]: 2026-01-31 09:21:30.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:30 np0005603622 nova_compute[221301]: 2026-01-31 09:21:30.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:21:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:21:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:30.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:21:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:21:30.940 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:21:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:21:30.944 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:21:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:21:30.945 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:21:31 np0005603622 nova_compute[221301]: 2026-01-31 09:21:31.081 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:31 np0005603622 nova_compute[221301]: 2026-01-31 09:21:31.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:32 np0005603622 nova_compute[221301]: 2026-01-31 09:21:32.098 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:32.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:32.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:34.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:34 np0005603622 nova_compute[221301]: 2026-01-31 09:21:34.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:34 np0005603622 nova_compute[221301]: 2026-01-31 09:21:34.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:21:34 np0005603622 nova_compute[221301]: 2026-01-31 09:21:34.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:21:34 np0005603622 nova_compute[221301]: 2026-01-31 09:21:34.427 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:21:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:34.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:34 np0005603622 systemd[1]: session-60.scope: Deactivated successfully.
Jan 31 04:21:34 np0005603622 systemd[1]: session-60.scope: Consumed 2min 39.967s CPU time, 1.0G memory peak, read 470.6M from disk, written 344.8M to disk.
Jan 31 04:21:34 np0005603622 systemd-logind[817]: Session 60 logged out. Waiting for processes to exit.
Jan 31 04:21:34 np0005603622 systemd-logind[817]: Removed session 60.
Jan 31 04:21:35 np0005603622 systemd-logind[817]: New session 61 of user zuul.
Jan 31 04:21:35 np0005603622 systemd[1]: Started Session 61 of User zuul.
Jan 31 04:21:35 np0005603622 systemd[1]: session-61.scope: Deactivated successfully.
Jan 31 04:21:35 np0005603622 systemd-logind[817]: Session 61 logged out. Waiting for processes to exit.
Jan 31 04:21:35 np0005603622 systemd-logind[817]: Removed session 61.
Jan 31 04:21:35 np0005603622 systemd-logind[817]: New session 62 of user zuul.
Jan 31 04:21:35 np0005603622 systemd[1]: Started Session 62 of User zuul.
Jan 31 04:21:35 np0005603622 systemd[1]: session-62.scope: Deactivated successfully.
Jan 31 04:21:35 np0005603622 systemd-logind[817]: Session 62 logged out. Waiting for processes to exit.
Jan 31 04:21:35 np0005603622 systemd-logind[817]: Removed session 62.
Jan 31 04:21:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:36 np0005603622 nova_compute[221301]: 2026-01-31 09:21:36.082 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:36.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:36 np0005603622 nova_compute[221301]: 2026-01-31 09:21:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:36.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:37 np0005603622 nova_compute[221301]: 2026-01-31 09:21:37.101 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:37 np0005603622 nova_compute[221301]: 2026-01-31 09:21:37.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:21:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:38.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:21:38 np0005603622 nova_compute[221301]: 2026-01-31 09:21:38.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:38 np0005603622 nova_compute[221301]: 2026-01-31 09:21:38.442 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:21:38 np0005603622 nova_compute[221301]: 2026-01-31 09:21:38.442 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:21:38 np0005603622 nova_compute[221301]: 2026-01-31 09:21:38.443 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:21:38 np0005603622 nova_compute[221301]: 2026-01-31 09:21:38.443 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:21:38 np0005603622 nova_compute[221301]: 2026-01-31 09:21:38.444 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:21:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:21:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:38.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:21:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:21:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1254568844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:21:38 np0005603622 nova_compute[221301]: 2026-01-31 09:21:38.973 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:21:39 np0005603622 nova_compute[221301]: 2026-01-31 09:21:39.153 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:21:39 np0005603622 nova_compute[221301]: 2026-01-31 09:21:39.154 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4152MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:21:39 np0005603622 nova_compute[221301]: 2026-01-31 09:21:39.154 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:21:39 np0005603622 nova_compute[221301]: 2026-01-31 09:21:39.155 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:21:39 np0005603622 nova_compute[221301]: 2026-01-31 09:21:39.779 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:21:39 np0005603622 nova_compute[221301]: 2026-01-31 09:21:39.779 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:21:39 np0005603622 nova_compute[221301]: 2026-01-31 09:21:39.864 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:21:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:21:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1396227476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:21:40 np0005603622 nova_compute[221301]: 2026-01-31 09:21:40.293 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:21:40 np0005603622 nova_compute[221301]: 2026-01-31 09:21:40.299 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:21:40 np0005603622 nova_compute[221301]: 2026-01-31 09:21:40.323 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:21:40 np0005603622 nova_compute[221301]: 2026-01-31 09:21:40.325 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:21:40 np0005603622 nova_compute[221301]: 2026-01-31 09:21:40.326 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:21:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:40.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:40.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:41 np0005603622 nova_compute[221301]: 2026-01-31 09:21:41.082 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:42 np0005603622 nova_compute[221301]: 2026-01-31 09:21:42.104 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:42.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:42.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:44 np0005603622 nova_compute[221301]: 2026-01-31 09:21:44.327 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:44.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:44 np0005603622 nova_compute[221301]: 2026-01-31 09:21:44.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:44.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:45 np0005603622 nova_compute[221301]: 2026-01-31 09:21:45.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:21:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:46 np0005603622 nova_compute[221301]: 2026-01-31 09:21:46.085 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:46.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:46.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:47 np0005603622 nova_compute[221301]: 2026-01-31 09:21:47.107 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:48.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:48.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:50.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:50.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:51 np0005603622 nova_compute[221301]: 2026-01-31 09:21:51.088 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:52 np0005603622 nova_compute[221301]: 2026-01-31 09:21:52.111 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:52.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:52.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:21:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:21:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:54.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:21:56 np0005603622 nova_compute[221301]: 2026-01-31 09:21:56.091 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:56.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:56.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:57 np0005603622 nova_compute[221301]: 2026-01-31 09:21:57.114 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:21:58 np0005603622 podman[318657]: 2026-01-31 09:21:58.179720261 +0000 UTC m=+0.080487850 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 04:21:58 np0005603622 podman[318656]: 2026-01-31 09:21:58.251323445 +0000 UTC m=+0.153445710 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:21:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:21:58.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:21:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:21:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:21:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:21:58.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:00.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:00.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:01 np0005603622 nova_compute[221301]: 2026-01-31 09:22:01.094 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:02 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:22:02 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:22:02 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:22:02 np0005603622 nova_compute[221301]: 2026-01-31 09:22:02.118 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:22:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 70K writes, 278K keys, 70K commit groups, 1.0 writes per commit group, ingest: 0.27 GB, 0.04 MB/s#012Cumulative WAL: 70K writes, 25K syncs, 2.72 writes per sync, written: 0.27 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2445 writes, 9480 keys, 2445 commit groups, 1.0 writes per commit group, ingest: 9.67 MB, 0.02 MB/s#012Interval WAL: 2445 writes, 959 syncs, 2.55 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:22:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:02.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:02.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:04.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:04.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:05 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:06 np0005603622 nova_compute[221301]: 2026-01-31 09:22:06.098 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:06.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:06.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:07 np0005603622 nova_compute[221301]: 2026-01-31 09:22:07.122 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:08.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:08.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:10.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:10.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:22:10 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:22:10 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:11 np0005603622 nova_compute[221301]: 2026-01-31 09:22:11.102 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:12 np0005603622 nova_compute[221301]: 2026-01-31 09:22:12.126 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:12.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:12.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:14 np0005603622 nova_compute[221301]: 2026-01-31 09:22:14.409 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:14.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:22:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1895092335' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:22:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:22:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1895092335' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:22:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:14.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:15 np0005603622 nova_compute[221301]: 2026-01-31 09:22:15.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:15 np0005603622 nova_compute[221301]: 2026-01-31 09:22:15.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:22:15 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:16 np0005603622 nova_compute[221301]: 2026-01-31 09:22:16.104 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:16.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:16.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:17 np0005603622 nova_compute[221301]: 2026-01-31 09:22:17.129 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:18.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:18.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:20.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:20.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:20 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:21 np0005603622 nova_compute[221301]: 2026-01-31 09:22:21.107 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:22 np0005603622 nova_compute[221301]: 2026-01-31 09:22:22.132 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:22.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:22.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:23 np0005603622 nova_compute[221301]: 2026-01-31 09:22:23.412 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:23 np0005603622 nova_compute[221301]: 2026-01-31 09:22:23.413 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:22:23 np0005603622 nova_compute[221301]: 2026-01-31 09:22:23.442 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:22:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:24.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:24.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:25 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:25 np0005603622 nova_compute[221301]: 2026-01-31 09:22:25.990 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:26 np0005603622 nova_compute[221301]: 2026-01-31 09:22:26.109 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:26.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:26.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:27 np0005603622 nova_compute[221301]: 2026-01-31 09:22:27.135 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:22:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.0 total, 600.0 interval#012Cumulative writes: 19K writes, 97K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.03 MB/s#012Cumulative WAL: 19K writes, 19K syncs, 1.00 writes per sync, written: 0.19 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1537 writes, 7845 keys, 1537 commit groups, 1.0 writes per commit group, ingest: 15.97 MB, 0.03 MB/s#012Interval WAL: 1537 writes, 1537 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     35.2      3.45              0.39        64    0.054       0      0       0.0       0.0#012  L6      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.7      0.0       0.0   5.5     86.2     74.0      9.00              2.12        63    0.143    515K    33K       0.0       0.0#012 Sum      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.8      0.1       0.0   6.5     62.3     63.3     12.45              2.51       127    0.098    515K    33K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.6     63.9     64.0      1.29              0.28        12    0.107     70K   3129       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.6       0.7      0.0       0.0   0.0     86.2     74.0      9.00              2.12        63    0.143    515K    33K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     35.2      3.45              0.39        63    0.055       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7200.0 total, 600.0 interval#012Flush(GB): cumulative 0.119, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.77 GB write, 0.11 MB/s write, 0.76 GB read, 0.11 MB/s read, 12.5 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 1.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 85.25 MB table_size: 0 occupancy: 18446744073709551615 collections: 13 last_copies: 0 last_secs: 0.001004 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(5253,81.59 MB,26.8404%) FilterBlock(127,1.40 MB,0.459786%) IndexBlock(127,2.26 MB,0.742325%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:22:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:28.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:28.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:29 np0005603622 podman[318885]: 2026-01-31 09:22:29.175028609 +0000 UTC m=+0.080399407 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 31 04:22:29 np0005603622 podman[318884]: 2026-01-31 09:22:29.20644753 +0000 UTC m=+0.118027413 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 31 04:22:30 np0005603622 nova_compute[221301]: 2026-01-31 09:22:30.418 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:30.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:30.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:30 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:22:30.941 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:22:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:22:30.943 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:22:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:22:30.943 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:22:31 np0005603622 nova_compute[221301]: 2026-01-31 09:22:31.112 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:31 np0005603622 nova_compute[221301]: 2026-01-31 09:22:31.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:31 np0005603622 nova_compute[221301]: 2026-01-31 09:22:31.394 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:31 np0005603622 nova_compute[221301]: 2026-01-31 09:22:31.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:22:32 np0005603622 nova_compute[221301]: 2026-01-31 09:22:32.139 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:32.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:32.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:34.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:34.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:35 np0005603622 nova_compute[221301]: 2026-01-31 09:22:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:35 np0005603622 nova_compute[221301]: 2026-01-31 09:22:35.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:22:35 np0005603622 nova_compute[221301]: 2026-01-31 09:22:35.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:22:35 np0005603622 nova_compute[221301]: 2026-01-31 09:22:35.419 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:22:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:36 np0005603622 nova_compute[221301]: 2026-01-31 09:22:36.114 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:36.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:36.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:37 np0005603622 nova_compute[221301]: 2026-01-31 09:22:37.142 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.444 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.444 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.445 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.445 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:22:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:38.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:22:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/995782552' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:22:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:38.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:38 np0005603622 nova_compute[221301]: 2026-01-31 09:22:38.892 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.061 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.063 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4171MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.063 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.064 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.366 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.367 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.390 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.420 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.422 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.462 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.511 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:22:39 np0005603622 nova_compute[221301]: 2026-01-31 09:22:39.544 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:22:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:22:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/865005703' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:22:40 np0005603622 nova_compute[221301]: 2026-01-31 09:22:40.013 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:22:40 np0005603622 nova_compute[221301]: 2026-01-31 09:22:40.021 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:22:40 np0005603622 nova_compute[221301]: 2026-01-31 09:22:40.049 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:22:40 np0005603622 nova_compute[221301]: 2026-01-31 09:22:40.053 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:22:40 np0005603622 nova_compute[221301]: 2026-01-31 09:22:40.054 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:22:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:40.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:40.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:41 np0005603622 nova_compute[221301]: 2026-01-31 09:22:41.117 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:42 np0005603622 nova_compute[221301]: 2026-01-31 09:22:42.144 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:42.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:42.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:22:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:44.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:22:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:46 np0005603622 nova_compute[221301]: 2026-01-31 09:22:46.054 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:46 np0005603622 nova_compute[221301]: 2026-01-31 09:22:46.056 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:22:46 np0005603622 nova_compute[221301]: 2026-01-31 09:22:46.120 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:46.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:46.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:47 np0005603622 nova_compute[221301]: 2026-01-31 09:22:47.147 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:48.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:48.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:50.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:51 np0005603622 nova_compute[221301]: 2026-01-31 09:22:51.123 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:52 np0005603622 nova_compute[221301]: 2026-01-31 09:22:52.150 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:52.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:52.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:54.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:54.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:55 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:22:56 np0005603622 nova_compute[221301]: 2026-01-31 09:22:56.127 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:22:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:56.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:22:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:56.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:57 np0005603622 nova_compute[221301]: 2026-01-31 09:22:57.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:22:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:22:58.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:22:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:22:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:22:58.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:22:59 np0005603622 ceph-mgr[81925]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3835187053
Jan 31 04:23:00 np0005603622 podman[318975]: 2026-01-31 09:23:00.162402246 +0000 UTC m=+0.072772276 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 31 04:23:00 np0005603622 podman[318974]: 2026-01-31 09:23:00.19278848 +0000 UTC m=+0.106947770 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 31 04:23:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:00.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:00 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:00.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:01 np0005603622 nova_compute[221301]: 2026-01-31 09:23:01.129 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:02 np0005603622 nova_compute[221301]: 2026-01-31 09:23:02.156 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:02.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:02.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:04.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:04.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:06.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:06 np0005603622 nova_compute[221301]: 2026-01-31 09:23:06.537 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:06.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:07 np0005603622 nova_compute[221301]: 2026-01-31 09:23:07.158 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:08.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:08.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:10.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:10.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:11 np0005603622 nova_compute[221301]: 2026-01-31 09:23:11.135 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:12 np0005603622 nova_compute[221301]: 2026-01-31 09:23:12.161 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:12.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:12.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:13 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:23:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:23:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:14.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:14.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:16 np0005603622 nova_compute[221301]: 2026-01-31 09:23:16.137 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:16.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:16.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #202. Immutable memtables: 0.
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.147629) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 129] Flushing memtable with next log file: 202
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397147672, "job": 129, "event": "flush_started", "num_memtables": 1, "num_entries": 1827, "num_deletes": 251, "total_data_size": 4313386, "memory_usage": 4367744, "flush_reason": "Manual Compaction"}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 129] Level-0 flush table #203: started
Jan 31 04:23:17 np0005603622 nova_compute[221301]: 2026-01-31 09:23:17.164 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397172500, "cf_name": "default", "job": 129, "event": "table_file_creation", "file_number": 203, "file_size": 2812756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 96755, "largest_seqno": 98577, "table_properties": {"data_size": 2804975, "index_size": 4658, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16803, "raw_average_key_size": 20, "raw_value_size": 2789238, "raw_average_value_size": 3426, "num_data_blocks": 204, "num_entries": 814, "num_filter_entries": 814, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851244, "oldest_key_time": 1769851244, "file_creation_time": 1769851397, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 203, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 129] Flush lasted 24931 microseconds, and 7956 cpu microseconds.
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.172556) [db/flush_job.cc:967] [default] [JOB 129] Level-0 flush table #203: 2812756 bytes OK
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.172584) [db/memtable_list.cc:519] [default] Level-0 commit table #203 started
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.175666) [db/memtable_list.cc:722] [default] Level-0 commit table #203: memtable #1 done
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.175692) EVENT_LOG_v1 {"time_micros": 1769851397175683, "job": 129, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.175718) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 129] Try to delete WAL files size 4305062, prev total WAL file size 4305062, number of live WAL files 2.
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000199.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.176830) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038353334' seq:72057594037927935, type:22 .. '7061786F730038373836' seq:0, type:0; will stop at (end)
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 130] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 129 Base level 0, inputs: [203(2746KB)], [201(11MB)]
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397177044, "job": 130, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [203], "files_L6": [201], "score": -1, "input_data_size": 15239956, "oldest_snapshot_seqno": -1}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 130] Generated table #204: 11824 keys, 13220022 bytes, temperature: kUnknown
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397277545, "cf_name": "default", "job": 130, "event": "table_file_creation", "file_number": 204, "file_size": 13220022, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13146146, "index_size": 43235, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29573, "raw_key_size": 313704, "raw_average_key_size": 26, "raw_value_size": 12942431, "raw_average_value_size": 1094, "num_data_blocks": 1633, "num_entries": 11824, "num_filter_entries": 11824, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851397, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 204, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.277997) [db/compaction/compaction_job.cc:1663] [default] [JOB 130] Compacted 1@0 + 1@6 files to L6 => 13220022 bytes
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.280715) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.4 rd, 131.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 11.9 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(10.1) write-amplify(4.7) OK, records in: 12343, records dropped: 519 output_compression: NoCompression
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.280737) EVENT_LOG_v1 {"time_micros": 1769851397280726, "job": 130, "event": "compaction_finished", "compaction_time_micros": 100677, "compaction_time_cpu_micros": 38997, "output_level": 6, "num_output_files": 1, "total_output_size": 13220022, "num_input_records": 12343, "num_output_records": 11824, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000203.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397281464, "job": 130, "event": "table_file_deletion", "file_number": 203}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000201.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851397283724, "job": 130, "event": "table_file_deletion", "file_number": 201}
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.176659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.283814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.283819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.283821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.283822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:17 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:23:17.283824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:23:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:18.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:18 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:23:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:18.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:20.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:20.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:21 np0005603622 nova_compute[221301]: 2026-01-31 09:23:21.139 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:22 np0005603622 nova_compute[221301]: 2026-01-31 09:23:22.167 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:22.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:22.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:24.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:24.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:26 np0005603622 nova_compute[221301]: 2026-01-31 09:23:26.141 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:26.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:26.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:27 np0005603622 nova_compute[221301]: 2026-01-31 09:23:27.169 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:28.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:28.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:30 np0005603622 nova_compute[221301]: 2026-01-31 09:23:30.399 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:30.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:23:30.943 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:23:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:23:30.944 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:23:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:23:30.944 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:23:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:30.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:31 np0005603622 nova_compute[221301]: 2026-01-31 09:23:31.143 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:31 np0005603622 podman[319316]: 2026-01-31 09:23:31.149625997 +0000 UTC m=+0.066794778 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:23:31 np0005603622 podman[319315]: 2026-01-31 09:23:31.192796048 +0000 UTC m=+0.111470059 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:23:31 np0005603622 nova_compute[221301]: 2026-01-31 09:23:31.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:32 np0005603622 nova_compute[221301]: 2026-01-31 09:23:32.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:32 np0005603622 nova_compute[221301]: 2026-01-31 09:23:32.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:32 np0005603622 nova_compute[221301]: 2026-01-31 09:23:32.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:23:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:32.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:32.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:34.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:34.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:36 np0005603622 nova_compute[221301]: 2026-01-31 09:23:36.144 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:36 np0005603622 nova_compute[221301]: 2026-01-31 09:23:36.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:36 np0005603622 nova_compute[221301]: 2026-01-31 09:23:36.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:23:36 np0005603622 nova_compute[221301]: 2026-01-31 09:23:36.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:23:36 np0005603622 nova_compute[221301]: 2026-01-31 09:23:36.413 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:23:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:36.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:36.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:37 np0005603622 nova_compute[221301]: 2026-01-31 09:23:37.176 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:38 np0005603622 nova_compute[221301]: 2026-01-31 09:23:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:38 np0005603622 nova_compute[221301]: 2026-01-31 09:23:38.531 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:23:38 np0005603622 nova_compute[221301]: 2026-01-31 09:23:38.531 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:23:38 np0005603622 nova_compute[221301]: 2026-01-31 09:23:38.532 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:23:38 np0005603622 nova_compute[221301]: 2026-01-31 09:23:38.532 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:23:38 np0005603622 nova_compute[221301]: 2026-01-31 09:23:38.532 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:23:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.003000080s ======
Jan 31 04:23:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:38.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 31 04:23:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:23:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/799921063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:23:38 np0005603622 nova_compute[221301]: 2026-01-31 09:23:38.952 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:23:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:38.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.120 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.121 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4173MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.122 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.122 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.193 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.194 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.222 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:23:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:23:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1150746289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.680 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.688 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.712 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.713 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:23:39 np0005603622 nova_compute[221301]: 2026-01-31 09:23:39.714 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:23:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:40.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:40.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:41 np0005603622 nova_compute[221301]: 2026-01-31 09:23:41.146 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:41 np0005603622 nova_compute[221301]: 2026-01-31 09:23:41.714 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:41 np0005603622 nova_compute[221301]: 2026-01-31 09:23:41.715 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:42 np0005603622 nova_compute[221301]: 2026-01-31 09:23:42.178 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:42.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:42.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:44.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:44.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:45 np0005603622 nova_compute[221301]: 2026-01-31 09:23:45.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:45 np0005603622 nova_compute[221301]: 2026-01-31 09:23:45.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:23:46 np0005603622 nova_compute[221301]: 2026-01-31 09:23:46.148 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:46.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:46.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:47 np0005603622 nova_compute[221301]: 2026-01-31 09:23:47.181 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:48.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:48.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:50.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:23:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:51.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:23:51 np0005603622 nova_compute[221301]: 2026-01-31 09:23:51.150 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:52 np0005603622 nova_compute[221301]: 2026-01-31 09:23:52.184 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:52.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:23:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:53.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:54.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:55.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:56 np0005603622 nova_compute[221301]: 2026-01-31 09:23:56.153 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 04:23:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:56.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 04:23:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:23:56 np0005603622 nova_compute[221301]: 2026-01-31 09:23:56.699 221324 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 1.89 sec#033[00m
Jan 31 04:23:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 04:23:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:57.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 04:23:57 np0005603622 nova_compute[221301]: 2026-01-31 09:23:57.188 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:23:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:23:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:23:58.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:23:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:23:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:23:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:23:59.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:00.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:01.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:01 np0005603622 nova_compute[221301]: 2026-01-31 09:24:01.154 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:02 np0005603622 podman[319407]: 2026-01-31 09:24:02.189390013 +0000 UTC m=+0.099633591 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 04:24:02 np0005603622 nova_compute[221301]: 2026-01-31 09:24:02.190 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:02 np0005603622 podman[319406]: 2026-01-31 09:24:02.210238193 +0000 UTC m=+0.120281846 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:24:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:02.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:03.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:04.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:05.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:06 np0005603622 nova_compute[221301]: 2026-01-31 09:24:06.159 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:06.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:07.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:07 np0005603622 nova_compute[221301]: 2026-01-31 09:24:07.193 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:08.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:09.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:10.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:11.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:11 np0005603622 nova_compute[221301]: 2026-01-31 09:24:11.163 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:12 np0005603622 nova_compute[221301]: 2026-01-31 09:24:12.196 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:12.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:13.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:14.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:15.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:16 np0005603622 nova_compute[221301]: 2026-01-31 09:24:16.164 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:16 np0005603622 nova_compute[221301]: 2026-01-31 09:24:16.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:16.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:17.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:17 np0005603622 nova_compute[221301]: 2026-01-31 09:24:17.198 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 04:24:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:18.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 04:24:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:19.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:19 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:19 np0005603622 podman[319722]: 2026-01-31 09:24:19.448418615 +0000 UTC m=+0.055828703 container create 67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_kare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 31 04:24:19 np0005603622 systemd[1]: Started libpod-conmon-67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec.scope.
Jan 31 04:24:19 np0005603622 podman[319722]: 2026-01-31 09:24:19.426995869 +0000 UTC m=+0.034405977 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 04:24:19 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:24:19 np0005603622 podman[319722]: 2026-01-31 09:24:19.552045502 +0000 UTC m=+0.159455650 container init 67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_kare, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:24:19 np0005603622 podman[319722]: 2026-01-31 09:24:19.558764463 +0000 UTC m=+0.166174561 container start 67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_kare, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Jan 31 04:24:19 np0005603622 podman[319722]: 2026-01-31 09:24:19.563455329 +0000 UTC m=+0.170865527 container attach 67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_kare, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 31 04:24:19 np0005603622 peaceful_kare[319739]: 167 167
Jan 31 04:24:19 np0005603622 systemd[1]: libpod-67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec.scope: Deactivated successfully.
Jan 31 04:24:19 np0005603622 podman[319722]: 2026-01-31 09:24:19.568933906 +0000 UTC m=+0.176344004 container died 67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_kare, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 04:24:19 np0005603622 systemd[1]: var-lib-containers-storage-overlay-1aadde4175fdf19deaf37ad3dd29f27b9be6f2ce556e84fa59b5024ee4a5314d-merged.mount: Deactivated successfully.
Jan 31 04:24:19 np0005603622 podman[319722]: 2026-01-31 09:24:19.623961676 +0000 UTC m=+0.231371804 container remove 67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_kare, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 04:24:19 np0005603622 systemd[1]: libpod-conmon-67f05e99b40c5d074708fb92c221a80cdb031607c99f6652e7b8aa83b78c16ec.scope: Deactivated successfully.
Jan 31 04:24:19 np0005603622 podman[319764]: 2026-01-31 09:24:19.805665843 +0000 UTC m=+0.062335528 container create e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_gates, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True)
Jan 31 04:24:19 np0005603622 systemd[1]: Started libpod-conmon-e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37.scope.
Jan 31 04:24:19 np0005603622 podman[319764]: 2026-01-31 09:24:19.778683877 +0000 UTC m=+0.035353572 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 31 04:24:19 np0005603622 systemd[1]: Started libcrun container.
Jan 31 04:24:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af042c5e388d31d4e5b0be49545162b058118e668a574967ee6d28aee16d3a1f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af042c5e388d31d4e5b0be49545162b058118e668a574967ee6d28aee16d3a1f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af042c5e388d31d4e5b0be49545162b058118e668a574967ee6d28aee16d3a1f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603622 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af042c5e388d31d4e5b0be49545162b058118e668a574967ee6d28aee16d3a1f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 31 04:24:19 np0005603622 podman[319764]: 2026-01-31 09:24:19.917635464 +0000 UTC m=+0.174305119 container init e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_gates, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Jan 31 04:24:19 np0005603622 podman[319764]: 2026-01-31 09:24:19.925422054 +0000 UTC m=+0.182091699 container start e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_gates, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Jan 31 04:24:19 np0005603622 podman[319764]: 2026-01-31 09:24:19.930323566 +0000 UTC m=+0.186993231 container attach e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_gates, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 31 04:24:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:20.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:21.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]: [
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:    {
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "available": false,
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "ceph_device": false,
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "lsm_data": {},
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "lvs": [],
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "path": "/dev/sr0",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "rejected_reasons": [
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "Insufficient space (<5GB)",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "Has a FileSystem"
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        ],
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        "sys_api": {
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "actuators": null,
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "device_nodes": "sr0",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "devname": "sr0",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "human_readable_size": "482.00 KB",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "id_bus": "ata",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "model": "QEMU DVD-ROM",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "nr_requests": "2",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "parent": "/dev/sr0",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "partitions": {},
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "path": "/dev/sr0",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "removable": "1",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "rev": "2.5+",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "ro": "0",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "rotational": "1",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "sas_address": "",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "sas_device_handle": "",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "scheduler_mode": "mq-deadline",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "sectors": 0,
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "sectorsize": "2048",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "size": 493568.0,
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "support_discard": "2048",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "type": "disk",
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:            "vendor": "QEMU"
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:        }
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]:    }
Jan 31 04:24:21 np0005603622 relaxed_gates[319780]: ]
Jan 31 04:24:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:21 np0005603622 nova_compute[221301]: 2026-01-31 09:24:21.167 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:21 np0005603622 systemd[1]: libpod-e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37.scope: Deactivated successfully.
Jan 31 04:24:21 np0005603622 systemd[1]: libpod-e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37.scope: Consumed 1.266s CPU time.
Jan 31 04:24:21 np0005603622 podman[319764]: 2026-01-31 09:24:21.187096465 +0000 UTC m=+1.443766110 container died e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_gates, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:24:21 np0005603622 systemd[1]: var-lib-containers-storage-overlay-af042c5e388d31d4e5b0be49545162b058118e668a574967ee6d28aee16d3a1f-merged.mount: Deactivated successfully.
Jan 31 04:24:21 np0005603622 podman[319764]: 2026-01-31 09:24:21.258810654 +0000 UTC m=+1.515480309 container remove e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_gates, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 31 04:24:21 np0005603622 systemd[1]: libpod-conmon-e5fdf24e4a0de3fabefd62f11fdc877551ba0ac251337e553d218485a7be2a37.scope: Deactivated successfully.
Jan 31 04:24:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:22 np0005603622 nova_compute[221301]: 2026-01-31 09:24:22.201 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:24:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:22.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:23.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:24.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:25.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:26 np0005603622 nova_compute[221301]: 2026-01-31 09:24:26.169 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:26.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:26 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:24:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:27.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:27 np0005603622 nova_compute[221301]: 2026-01-31 09:24:27.205 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:28.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:29.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:30.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:24:30.945 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:24:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:24:30.946 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:24:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:24:30.946 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:24:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:31.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:31 np0005603622 nova_compute[221301]: 2026-01-31 09:24:31.173 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:31 np0005603622 nova_compute[221301]: 2026-01-31 09:24:31.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:31 np0005603622 nova_compute[221301]: 2026-01-31 09:24:31.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:32 np0005603622 nova_compute[221301]: 2026-01-31 09:24:32.208 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:32.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:33.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:33 np0005603622 podman[321138]: 2026-01-31 09:24:33.17970547 +0000 UTC m=+0.091166983 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent)
Jan 31 04:24:33 np0005603622 podman[321137]: 2026-01-31 09:24:33.215974075 +0000 UTC m=+0.126912634 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:24:34 np0005603622 nova_compute[221301]: 2026-01-31 09:24:34.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:34 np0005603622 nova_compute[221301]: 2026-01-31 09:24:34.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:24:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:34.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:35.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:36 np0005603622 nova_compute[221301]: 2026-01-31 09:24:36.176 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:36.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:37.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:37 np0005603622 nova_compute[221301]: 2026-01-31 09:24:37.211 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:37 np0005603622 nova_compute[221301]: 2026-01-31 09:24:37.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:37 np0005603622 nova_compute[221301]: 2026-01-31 09:24:37.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:24:37 np0005603622 nova_compute[221301]: 2026-01-31 09:24:37.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:24:37 np0005603622 nova_compute[221301]: 2026-01-31 09:24:37.414 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:24:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:38.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:39.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:39 np0005603622 nova_compute[221301]: 2026-01-31 09:24:39.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:39 np0005603622 nova_compute[221301]: 2026-01-31 09:24:39.430 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:24:39 np0005603622 nova_compute[221301]: 2026-01-31 09:24:39.431 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:24:39 np0005603622 nova_compute[221301]: 2026-01-31 09:24:39.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:24:39 np0005603622 nova_compute[221301]: 2026-01-31 09:24:39.432 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:24:39 np0005603622 nova_compute[221301]: 2026-01-31 09:24:39.433 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:24:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:24:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2743024288' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:24:39 np0005603622 nova_compute[221301]: 2026-01-31 09:24:39.941 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.126 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.128 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4184MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.128 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.128 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.211 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.211 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.236 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:24:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:40.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:24:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2394240344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.699 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.706 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.752 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.754 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:24:40 np0005603622 nova_compute[221301]: 2026-01-31 09:24:40.755 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:24:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:41.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:41 np0005603622 nova_compute[221301]: 2026-01-31 09:24:41.181 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:42 np0005603622 nova_compute[221301]: 2026-01-31 09:24:42.214 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:42.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:42 np0005603622 nova_compute[221301]: 2026-01-31 09:24:42.756 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:42 np0005603622 nova_compute[221301]: 2026-01-31 09:24:42.757 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:43.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:44.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:45.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:46 np0005603622 nova_compute[221301]: 2026-01-31 09:24:46.183 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:46 np0005603622 nova_compute[221301]: 2026-01-31 09:24:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:46.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:47.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:47 np0005603622 nova_compute[221301]: 2026-01-31 09:24:47.217 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:47 np0005603622 nova_compute[221301]: 2026-01-31 09:24:47.402 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:24:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:48.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:49.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:50.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:51.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:51 np0005603622 nova_compute[221301]: 2026-01-31 09:24:51.185 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:52 np0005603622 nova_compute[221301]: 2026-01-31 09:24:52.220 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:52.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:53.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:54.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:55.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:56 np0005603622 nova_compute[221301]: 2026-01-31 09:24:56.187 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:24:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:56.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:57.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:24:57 np0005603622 nova_compute[221301]: 2026-01-31 09:24:57.226 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:24:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:24:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:24:58.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:24:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:24:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:24:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:24:59.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:00.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:01.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:01 np0005603622 nova_compute[221301]: 2026-01-31 09:25:01.190 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:02 np0005603622 nova_compute[221301]: 2026-01-31 09:25:02.230 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 04:25:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:02.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 04:25:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:03.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:04 np0005603622 podman[321226]: 2026-01-31 09:25:04.154830495 +0000 UTC m=+0.068660287 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 31 04:25:04 np0005603622 podman[321225]: 2026-01-31 09:25:04.172775988 +0000 UTC m=+0.092547530 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:25:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:04.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:05.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:06 np0005603622 nova_compute[221301]: 2026-01-31 09:25:06.194 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:06.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:07.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:07 np0005603622 nova_compute[221301]: 2026-01-31 09:25:07.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:08.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:09.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:10.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:11.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:11 np0005603622 nova_compute[221301]: 2026-01-31 09:25:11.194 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:12 np0005603622 nova_compute[221301]: 2026-01-31 09:25:12.236 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:12.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:13.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:14.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:15.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:16 np0005603622 nova_compute[221301]: 2026-01-31 09:25:16.197 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:25:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:16.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:25:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:17.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:17 np0005603622 nova_compute[221301]: 2026-01-31 09:25:17.239 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:18.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:19.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:20.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:21.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:21 np0005603622 nova_compute[221301]: 2026-01-31 09:25:21.199 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:22 np0005603622 nova_compute[221301]: 2026-01-31 09:25:22.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:25:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:22.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:25:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:23.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:24.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:25.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:26 np0005603622 nova_compute[221301]: 2026-01-31 09:25:26.201 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:26.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:27.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:27 np0005603622 nova_compute[221301]: 2026-01-31 09:25:27.246 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:25:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:25:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:25:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:28.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:29.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:30.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:25:30.947 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:25:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:25:30.948 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:25:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:25:30.948 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:25:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:31.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:31 np0005603622 nova_compute[221301]: 2026-01-31 09:25:31.203 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:31 np0005603622 nova_compute[221301]: 2026-01-31 09:25:31.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:31 np0005603622 nova_compute[221301]: 2026-01-31 09:25:31.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:32 np0005603622 nova_compute[221301]: 2026-01-31 09:25:32.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:32.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:33.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:34.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:34 np0005603622 podman[321427]: 2026-01-31 09:25:34.951158182 +0000 UTC m=+0.057419225 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 31 04:25:35 np0005603622 podman[321426]: 2026-01-31 09:25:35.008491804 +0000 UTC m=+0.116137304 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 04:25:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:25:35 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:25:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:35.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:35 np0005603622 nova_compute[221301]: 2026-01-31 09:25:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:35 np0005603622 nova_compute[221301]: 2026-01-31 09:25:35.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:25:36 np0005603622 nova_compute[221301]: 2026-01-31 09:25:36.206 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:36.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:37.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:37 np0005603622 nova_compute[221301]: 2026-01-31 09:25:37.250 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:38 np0005603622 nova_compute[221301]: 2026-01-31 09:25:38.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:38 np0005603622 nova_compute[221301]: 2026-01-31 09:25:38.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:25:38 np0005603622 nova_compute[221301]: 2026-01-31 09:25:38.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:25:38 np0005603622 nova_compute[221301]: 2026-01-31 09:25:38.531 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:25:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:38.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:39.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:40 np0005603622 nova_compute[221301]: 2026-01-31 09:25:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:40 np0005603622 nova_compute[221301]: 2026-01-31 09:25:40.451 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:25:40 np0005603622 nova_compute[221301]: 2026-01-31 09:25:40.451 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:25:40 np0005603622 nova_compute[221301]: 2026-01-31 09:25:40.451 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:25:40 np0005603622 nova_compute[221301]: 2026-01-31 09:25:40.452 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:25:40 np0005603622 nova_compute[221301]: 2026-01-31 09:25:40.452 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:25:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:40.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:25:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/340093540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:25:40 np0005603622 nova_compute[221301]: 2026-01-31 09:25:40.898 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.116 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.117 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4183MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.117 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.118 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:25:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:41.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.208 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.379 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.380 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.402 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:25:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:25:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1961534203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.865 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:25:41 np0005603622 nova_compute[221301]: 2026-01-31 09:25:41.872 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:25:42 np0005603622 nova_compute[221301]: 2026-01-31 09:25:42.009 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:25:42 np0005603622 nova_compute[221301]: 2026-01-31 09:25:42.012 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:25:42 np0005603622 nova_compute[221301]: 2026-01-31 09:25:42.012 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:25:42 np0005603622 nova_compute[221301]: 2026-01-31 09:25:42.254 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:42.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:43.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:44 np0005603622 nova_compute[221301]: 2026-01-31 09:25:44.013 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:44 np0005603622 nova_compute[221301]: 2026-01-31 09:25:44.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:44.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:45.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:46 np0005603622 nova_compute[221301]: 2026-01-31 09:25:46.210 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:46 np0005603622 nova_compute[221301]: 2026-01-31 09:25:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:25:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:46.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:47.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:47 np0005603622 nova_compute[221301]: 2026-01-31 09:25:47.257 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:25:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:48.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:49.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:49 np0005603622 nova_compute[221301]: 2026-01-31 09:25:49.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:25:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:50.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:25:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:51.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:25:51 np0005603622 nova_compute[221301]: 2026-01-31 09:25:51.213 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:52 np0005603622 nova_compute[221301]: 2026-01-31 09:25:52.259 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:52.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:52 np0005603622 nova_compute[221301]: 2026-01-31 09:25:52.950 221324 DEBUG oslo_concurrency.processutils [None req-3fbad158-704d-4546-a900-43e27b0e5be8 94836483675641d9846c5768c3b91eed 89e274acfc5c4097be7194f5ef1fabd3 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:25:52 np0005603622 nova_compute[221301]: 2026-01-31 09:25:52.982 221324 DEBUG oslo_concurrency.processutils [None req-3fbad158-704d-4546-a900-43e27b0e5be8 94836483675641d9846c5768c3b91eed 89e274acfc5c4097be7194f5ef1fabd3 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:25:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:53.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:54.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:55.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:56 np0005603622 nova_compute[221301]: 2026-01-31 09:25:56.216 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:25:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:56.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:25:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:57.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:57 np0005603622 nova_compute[221301]: 2026-01-31 09:25:57.262 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:25:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:25:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:25:58.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:25:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:25:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:25:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:25:59.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:26:00.512 139793 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=103, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:be:2a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:75:fe:8f:a4:91'}, ipsec=False) old=SB_Global(nb_cfg=102) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 31 04:26:00 np0005603622 nova_compute[221301]: 2026-01-31 09:26:00.513 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:00 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:26:00.514 139793 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 31 04:26:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:00.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:01.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:01 np0005603622 nova_compute[221301]: 2026-01-31 09:26:01.217 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:02 np0005603622 nova_compute[221301]: 2026-01-31 09:26:02.265 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:02.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:03.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:04 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:26:04.516 139793 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bd097fed-e54b-4ed7-90f0-078b39b8b13a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '103'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 31 04:26:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:04.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:05 np0005603622 podman[321542]: 2026-01-31 09:26:05.138710467 +0000 UTC m=+0.052720418 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 31 04:26:05 np0005603622 podman[321541]: 2026-01-31 09:26:05.171411127 +0000 UTC m=+0.091082441 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:26:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:05.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:06 np0005603622 nova_compute[221301]: 2026-01-31 09:26:06.218 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:06.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:07.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:07 np0005603622 nova_compute[221301]: 2026-01-31 09:26:07.268 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:08.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:09.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:10.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:11 np0005603622 nova_compute[221301]: 2026-01-31 09:26:11.221 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:11.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:12 np0005603622 nova_compute[221301]: 2026-01-31 09:26:12.272 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:12.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:13.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:14.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:15.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:16 np0005603622 nova_compute[221301]: 2026-01-31 09:26:16.222 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:16.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:17.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:17 np0005603622 nova_compute[221301]: 2026-01-31 09:26:17.275 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:18.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:19.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:20 np0005603622 nova_compute[221301]: 2026-01-31 09:26:20.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:26:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:26:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:20.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:26:21 np0005603622 nova_compute[221301]: 2026-01-31 09:26:21.225 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:26:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:21.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:26:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:22 np0005603622 nova_compute[221301]: 2026-01-31 09:26:22.278 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:22.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:23.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:24.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:25.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:26 np0005603622 nova_compute[221301]: 2026-01-31 09:26:26.227 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:26.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:27.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:27 np0005603622 nova_compute[221301]: 2026-01-31 09:26:27.282 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:28.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:29.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:30.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:26:30.948 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:26:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:26:30.949 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:26:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:26:30.949 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:26:31 np0005603622 nova_compute[221301]: 2026-01-31 09:26:31.229 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:26:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:31.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:32 np0005603622 nova_compute[221301]: 2026-01-31 09:26:32.284 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:32.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:33.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:33 np0005603622 nova_compute[221301]: 2026-01-31 09:26:33.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:33 np0005603622 nova_compute[221301]: 2026-01-31 09:26:33.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:34.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:35.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:35 np0005603622 podman[321637]: 2026-01-31 09:26:35.297209964 +0000 UTC m=+0.084965206 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:26:35 np0005603622 podman[321636]: 2026-01-31 09:26:35.309780842 +0000 UTC m=+0.095837949 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:26:36 np0005603622 nova_compute[221301]: 2026-01-31 09:26:36.231 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:36 np0005603622 nova_compute[221301]: 2026-01-31 09:26:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:36 np0005603622 nova_compute[221301]: 2026-01-31 09:26:36.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 31 04:26:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 31 04:26:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:36.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:26:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:37.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:26:37 np0005603622 nova_compute[221301]: 2026-01-31 09:26:37.286 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 31 04:26:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:26:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:37 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:26:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:38.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:39.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:39 np0005603622 nova_compute[221301]: 2026-01-31 09:26:39.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:39 np0005603622 nova_compute[221301]: 2026-01-31 09:26:39.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:26:39 np0005603622 nova_compute[221301]: 2026-01-31 09:26:39.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:26:39 np0005603622 nova_compute[221301]: 2026-01-31 09:26:39.550 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:26:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:40.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:41 np0005603622 nova_compute[221301]: 2026-01-31 09:26:41.233 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:41.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:41 np0005603622 nova_compute[221301]: 2026-01-31 09:26:41.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:42 np0005603622 nova_compute[221301]: 2026-01-31 09:26:42.291 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:42.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:43.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:43 np0005603622 nova_compute[221301]: 2026-01-31 09:26:43.750 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:26:43 np0005603622 nova_compute[221301]: 2026-01-31 09:26:43.751 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:26:43 np0005603622 nova_compute[221301]: 2026-01-31 09:26:43.751 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:26:43 np0005603622 nova_compute[221301]: 2026-01-31 09:26:43.752 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:26:43 np0005603622 nova_compute[221301]: 2026-01-31 09:26:43.753 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:26:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:26:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/929345425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.175 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.336 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.337 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4172MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.337 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.338 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.453 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.454 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.521 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:26:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:44 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:26:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:44.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:26:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2302001142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.953 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.960 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.987 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.989 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:26:44 np0005603622 nova_compute[221301]: 2026-01-31 09:26:44.989 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:26:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:45.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:46 np0005603622 nova_compute[221301]: 2026-01-31 09:26:46.235 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:46.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:46 np0005603622 nova_compute[221301]: 2026-01-31 09:26:46.990 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:46 np0005603622 nova_compute[221301]: 2026-01-31 09:26:46.991 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:47.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:47 np0005603622 nova_compute[221301]: 2026-01-31 09:26:47.294 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:47 np0005603622 nova_compute[221301]: 2026-01-31 09:26:47.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:48.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:49.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:49 np0005603622 nova_compute[221301]: 2026-01-31 09:26:49.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:50.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:51 np0005603622 nova_compute[221301]: 2026-01-31 09:26:51.237 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:51.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:52 np0005603622 nova_compute[221301]: 2026-01-31 09:26:52.297 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #205. Immutable memtables: 0.
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.308798) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 131] Flushing memtable with next log file: 205
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612308840, "job": 131, "event": "flush_started", "num_memtables": 1, "num_entries": 2368, "num_deletes": 251, "total_data_size": 6099364, "memory_usage": 6182360, "flush_reason": "Manual Compaction"}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 131] Level-0 flush table #206: started
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612331418, "cf_name": "default", "job": 131, "event": "table_file_creation", "file_number": 206, "file_size": 3939867, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 98582, "largest_seqno": 100945, "table_properties": {"data_size": 3930184, "index_size": 6176, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19324, "raw_average_key_size": 20, "raw_value_size": 3911098, "raw_average_value_size": 4108, "num_data_blocks": 270, "num_entries": 952, "num_filter_entries": 952, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851397, "oldest_key_time": 1769851397, "file_creation_time": 1769851612, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 206, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 131] Flush lasted 22673 microseconds, and 7772 cpu microseconds.
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.331468) [db/flush_job.cc:967] [default] [JOB 131] Level-0 flush table #206: 3939867 bytes OK
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.331487) [db/memtable_list.cc:519] [default] Level-0 commit table #206 started
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.334254) [db/memtable_list.cc:722] [default] Level-0 commit table #206: memtable #1 done
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.334319) EVENT_LOG_v1 {"time_micros": 1769851612334308, "job": 131, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.334348) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 131] Try to delete WAL files size 6089077, prev total WAL file size 6089077, number of live WAL files 2.
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000202.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.336156) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038373835' seq:72057594037927935, type:22 .. '7061786F730039303337' seq:0, type:0; will stop at (end)
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 132] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 131 Base level 0, inputs: [206(3847KB)], [204(12MB)]
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612336237, "job": 132, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [206], "files_L6": [204], "score": -1, "input_data_size": 17159889, "oldest_snapshot_seqno": -1}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 132] Generated table #207: 12257 keys, 15188590 bytes, temperature: kUnknown
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612441779, "cf_name": "default", "job": 132, "event": "table_file_creation", "file_number": 207, "file_size": 15188590, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15110225, "index_size": 46636, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 323350, "raw_average_key_size": 26, "raw_value_size": 14897289, "raw_average_value_size": 1215, "num_data_blocks": 1775, "num_entries": 12257, "num_filter_entries": 12257, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851612, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 207, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.442082) [db/compaction/compaction_job.cc:1663] [default] [JOB 132] Compacted 1@0 + 1@6 files to L6 => 15188590 bytes
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.443579) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.4 rd, 143.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 12.6 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(8.2) write-amplify(3.9) OK, records in: 12776, records dropped: 519 output_compression: NoCompression
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.443597) EVENT_LOG_v1 {"time_micros": 1769851612443588, "job": 132, "event": "compaction_finished", "compaction_time_micros": 105645, "compaction_time_cpu_micros": 44709, "output_level": 6, "num_output_files": 1, "total_output_size": 15188590, "num_input_records": 12776, "num_output_records": 12257, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000206.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612444199, "job": 132, "event": "table_file_deletion", "file_number": 206}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000204.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851612445955, "job": 132, "event": "table_file_deletion", "file_number": 204}
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.336041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.446096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.446104) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.446106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.446108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:26:52.446110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:26:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:52.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:53.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:54 np0005603622 nova_compute[221301]: 2026-01-31 09:26:54.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:26:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:54.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:55.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:26:56 np0005603622 nova_compute[221301]: 2026-01-31 09:26:56.238 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:26:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:56.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:57.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:57 np0005603622 nova_compute[221301]: 2026-01-31 09:26:57.298 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:26:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:26:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:26:58.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:26:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:26:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:26:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:26:59.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:00.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:01 np0005603622 nova_compute[221301]: 2026-01-31 09:27:01.240 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:01.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:02 np0005603622 nova_compute[221301]: 2026-01-31 09:27:02.301 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:02.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:03.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:04.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:05.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:06 np0005603622 podman[321980]: 2026-01-31 09:27:06.13995504 +0000 UTC m=+0.065219855 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:27:06 np0005603622 podman[321981]: 2026-01-31 09:27:06.1399559 +0000 UTC m=+0.056139181 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:27:06 np0005603622 nova_compute[221301]: 2026-01-31 09:27:06.242 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:06.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:07 np0005603622 nova_compute[221301]: 2026-01-31 09:27:07.304 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:07.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:08.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:09.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:10.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:11 np0005603622 nova_compute[221301]: 2026-01-31 09:27:11.244 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:11.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:12 np0005603622 nova_compute[221301]: 2026-01-31 09:27:12.307 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:27:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:12.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:27:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:13.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:27:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/206591375' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:27:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:27:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/206591375' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:27:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:14.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:15.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:16 np0005603622 nova_compute[221301]: 2026-01-31 09:27:16.247 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:16.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:17 np0005603622 nova_compute[221301]: 2026-01-31 09:27:17.310 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:17.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:18.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:19.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:20.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:21 np0005603622 nova_compute[221301]: 2026-01-31 09:27:21.249 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:21.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:22 np0005603622 nova_compute[221301]: 2026-01-31 09:27:22.312 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:22.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:23.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:24 np0005603622 nova_compute[221301]: 2026-01-31 09:27:24.489 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:24 np0005603622 nova_compute[221301]: 2026-01-31 09:27:24.490 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:27:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:25 np0005603622 systemd[1]: Starting dnf makecache...
Jan 31 04:27:25 np0005603622 dnf[322022]: Metadata cache refreshed recently.
Jan 31 04:27:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:25.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:25 np0005603622 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 31 04:27:25 np0005603622 systemd[1]: Finished dnf makecache.
Jan 31 04:27:26 np0005603622 nova_compute[221301]: 2026-01-31 09:27:26.251 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:26.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:27 np0005603622 nova_compute[221301]: 2026-01-31 09:27:27.315 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:27.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:28.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:29 np0005603622 nova_compute[221301]: 2026-01-31 09:27:29.250 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:29 np0005603622 nova_compute[221301]: 2026-01-31 09:27:29.250 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:27:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:29.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:29 np0005603622 nova_compute[221301]: 2026-01-31 09:27:29.418 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:27:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:30.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:27:30.949 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:27:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:27:30.950 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:27:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:27:30.950 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:27:31 np0005603622 nova_compute[221301]: 2026-01-31 09:27:31.254 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:31.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:32 np0005603622 nova_compute[221301]: 2026-01-31 09:27:32.319 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:32.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:33.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:33 np0005603622 nova_compute[221301]: 2026-01-31 09:27:33.560 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:33 np0005603622 nova_compute[221301]: 2026-01-31 09:27:33.560 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:34.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:35.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:36 np0005603622 nova_compute[221301]: 2026-01-31 09:27:36.254 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:36 np0005603622 nova_compute[221301]: 2026-01-31 09:27:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:36 np0005603622 nova_compute[221301]: 2026-01-31 09:27:36.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:27:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:36.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:37 np0005603622 podman[322024]: 2026-01-31 09:27:37.16303637 +0000 UTC m=+0.081071972 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Jan 31 04:27:37 np0005603622 podman[322023]: 2026-01-31 09:27:37.163242585 +0000 UTC m=+0.084206166 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:27:37 np0005603622 nova_compute[221301]: 2026-01-31 09:27:37.351 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:37.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:38.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:39.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:40.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:41 np0005603622 nova_compute[221301]: 2026-01-31 09:27:41.256 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:41.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:41 np0005603622 nova_compute[221301]: 2026-01-31 09:27:41.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:41 np0005603622 nova_compute[221301]: 2026-01-31 09:27:41.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:27:41 np0005603622 nova_compute[221301]: 2026-01-31 09:27:41.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:27:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:41 np0005603622 nova_compute[221301]: 2026-01-31 09:27:41.925 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.353 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.438 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.439 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.439 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.439 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.439 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:27:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:27:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/251320509' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:27:42 np0005603622 nova_compute[221301]: 2026-01-31 09:27:42.864 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:27:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:42.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.004 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.005 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4193MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.005 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.005 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.223 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.223 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.247 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.282 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.282 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.308 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.349 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:27:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:43.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.366 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:27:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:27:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1613039078' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.846 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.855 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.887 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.890 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:27:43 np0005603622 nova_compute[221301]: 2026-01-31 09:27:43.891 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:27:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:44.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:27:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:27:45 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:27:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:45.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:45 np0005603622 nova_compute[221301]: 2026-01-31 09:27:45.891 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:46 np0005603622 nova_compute[221301]: 2026-01-31 09:27:46.258 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:46 np0005603622 nova_compute[221301]: 2026-01-31 09:27:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:27:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:46.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:27:47 np0005603622 nova_compute[221301]: 2026-01-31 09:27:47.356 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:47.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:48 np0005603622 nova_compute[221301]: 2026-01-31 09:27:48.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:48.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:49.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:49 np0005603622 nova_compute[221301]: 2026-01-31 09:27:49.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:27:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:50.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:51 np0005603622 nova_compute[221301]: 2026-01-31 09:27:51.260 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:51.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:52 np0005603622 nova_compute[221301]: 2026-01-31 09:27:52.360 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:52.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:53.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:27:54 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:27:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:54.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:55.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:56 np0005603622 nova_compute[221301]: 2026-01-31 09:27:56.265 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:27:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:56.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:57 np0005603622 nova_compute[221301]: 2026-01-31 09:27:57.363 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:27:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:27:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:57.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:27:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:27:58.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:27:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:27:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:27:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:27:59.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:00.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:01 np0005603622 nova_compute[221301]: 2026-01-31 09:28:01.267 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:01.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:02 np0005603622 nova_compute[221301]: 2026-01-31 09:28:02.367 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:02.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:03.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:04.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:05.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:06 np0005603622 nova_compute[221301]: 2026-01-31 09:28:06.270 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:06 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:06.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:07 np0005603622 nova_compute[221301]: 2026-01-31 09:28:07.370 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:07.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:08 np0005603622 podman[322295]: 2026-01-31 09:28:08.146265463 +0000 UTC m=+0.058691309 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:28:08 np0005603622 podman[322294]: 2026-01-31 09:28:08.178758047 +0000 UTC m=+0.096153327 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 31 04:28:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:08.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:28:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:09.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:28:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:10.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:11 np0005603622 nova_compute[221301]: 2026-01-31 09:28:11.271 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:11.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:11 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:12 np0005603622 nova_compute[221301]: 2026-01-31 09:28:12.373 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:12.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:14.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:16 np0005603622 nova_compute[221301]: 2026-01-31 09:28:16.274 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:16 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:16.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:17 np0005603622 nova_compute[221301]: 2026-01-31 09:28:17.375 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:17.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #208. Immutable memtables: 0.
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.651424) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 133] Flushing memtable with next log file: 208
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851698651513, "job": 133, "event": "flush_started", "num_memtables": 1, "num_entries": 1011, "num_deletes": 250, "total_data_size": 2190930, "memory_usage": 2223968, "flush_reason": "Manual Compaction"}
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 133] Level-0 flush table #209: started
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851698704015, "cf_name": "default", "job": 133, "event": "table_file_creation", "file_number": 209, "file_size": 880301, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 100950, "largest_seqno": 101956, "table_properties": {"data_size": 876638, "index_size": 1378, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9702, "raw_average_key_size": 20, "raw_value_size": 868855, "raw_average_value_size": 1856, "num_data_blocks": 62, "num_entries": 468, "num_filter_entries": 468, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851613, "oldest_key_time": 1769851613, "file_creation_time": 1769851698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 209, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 133] Flush lasted 52676 microseconds, and 5454 cpu microseconds.
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.704094) [db/flush_job.cc:967] [default] [JOB 133] Level-0 flush table #209: 880301 bytes OK
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.704130) [db/memtable_list.cc:519] [default] Level-0 commit table #209 started
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.771879) [db/memtable_list.cc:722] [default] Level-0 commit table #209: memtable #1 done
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.771988) EVENT_LOG_v1 {"time_micros": 1769851698771939, "job": 133, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.772023) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 133] Try to delete WAL files size 2185991, prev total WAL file size 2185991, number of live WAL files 2.
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000205.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.773407) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353130' seq:72057594037927935, type:22 .. '6D6772737461740033373631' seq:0, type:0; will stop at (end)
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 134] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 133 Base level 0, inputs: [209(859KB)], [207(14MB)]
Jan 31 04:28:18 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851698773521, "job": 134, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [209], "files_L6": [207], "score": -1, "input_data_size": 16068891, "oldest_snapshot_seqno": -1}
Jan 31 04:28:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:18.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 134] Generated table #210: 12249 keys, 12786281 bytes, temperature: kUnknown
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851699153437, "cf_name": "default", "job": 134, "event": "table_file_creation", "file_number": 210, "file_size": 12786281, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12711644, "index_size": 42908, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 323364, "raw_average_key_size": 26, "raw_value_size": 12502457, "raw_average_value_size": 1020, "num_data_blocks": 1622, "num_entries": 12249, "num_filter_entries": 12249, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851698, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 210, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.154354) [db/compaction/compaction_job.cc:1663] [default] [JOB 134] Compacted 1@0 + 1@6 files to L6 => 12786281 bytes
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.212458) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 42.3 rd, 33.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 14.5 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(32.8) write-amplify(14.5) OK, records in: 12725, records dropped: 476 output_compression: NoCompression
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.212518) EVENT_LOG_v1 {"time_micros": 1769851699212496, "job": 134, "event": "compaction_finished", "compaction_time_micros": 380302, "compaction_time_cpu_micros": 58288, "output_level": 6, "num_output_files": 1, "total_output_size": 12786281, "num_input_records": 12725, "num_output_records": 12249, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000209.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851699212944, "job": 134, "event": "table_file_deletion", "file_number": 209}
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000207.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851699215800, "job": 134, "event": "table_file_deletion", "file_number": 207}
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:18.773213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.215932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.215945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.215950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.215954) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:19.215959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:19.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:21.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:21 np0005603622 nova_compute[221301]: 2026-01-31 09:28:21.277 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:21 np0005603622 nova_compute[221301]: 2026-01-31 09:28:21.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:21 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:22 np0005603622 nova_compute[221301]: 2026-01-31 09:28:22.380 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:23.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:23.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #211. Immutable memtables: 0.
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.541356) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 135] Flushing memtable with next log file: 211
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703541404, "job": 135, "event": "flush_started", "num_memtables": 1, "num_entries": 304, "num_deletes": 257, "total_data_size": 119322, "memory_usage": 125272, "flush_reason": "Manual Compaction"}
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 135] Level-0 flush table #212: started
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703552204, "cf_name": "default", "job": 135, "event": "table_file_creation", "file_number": 212, "file_size": 78285, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 101961, "largest_seqno": 102260, "table_properties": {"data_size": 76361, "index_size": 151, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4897, "raw_average_key_size": 17, "raw_value_size": 72497, "raw_average_value_size": 259, "num_data_blocks": 7, "num_entries": 279, "num_filter_entries": 279, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851699, "oldest_key_time": 1769851699, "file_creation_time": 1769851703, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 212, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 135] Flush lasted 10984 microseconds, and 1421 cpu microseconds.
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.552330) [db/flush_job.cc:967] [default] [JOB 135] Level-0 flush table #212: 78285 bytes OK
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.552368) [db/memtable_list.cc:519] [default] Level-0 commit table #212 started
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.590578) [db/memtable_list.cc:722] [default] Level-0 commit table #212: memtable #1 done
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.590618) EVENT_LOG_v1 {"time_micros": 1769851703590605, "job": 135, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.590647) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 135] Try to delete WAL files size 117093, prev total WAL file size 117093, number of live WAL files 2.
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000208.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.591395) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303136' seq:72057594037927935, type:22 .. '6C6F676D0034323639' seq:0, type:0; will stop at (end)
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 136] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 135 Base level 0, inputs: [212(76KB)], [210(12MB)]
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703591446, "job": 136, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [212], "files_L6": [210], "score": -1, "input_data_size": 12864566, "oldest_snapshot_seqno": -1}
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 136] Generated table #213: 12006 keys, 12747214 bytes, temperature: kUnknown
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851703948047, "cf_name": "default", "job": 136, "event": "table_file_creation", "file_number": 213, "file_size": 12747214, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12673652, "index_size": 42473, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30021, "raw_key_size": 319256, "raw_average_key_size": 26, "raw_value_size": 12468068, "raw_average_value_size": 1038, "num_data_blocks": 1601, "num_entries": 12006, "num_filter_entries": 12006, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851703, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 213, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:28:23 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.948500) [db/compaction/compaction_job.cc:1663] [default] [JOB 136] Compacted 1@0 + 1@6 files to L6 => 12747214 bytes
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:24.021789) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 36.1 rd, 35.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 12.2 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(327.2) write-amplify(162.8) OK, records in: 12528, records dropped: 522 output_compression: NoCompression
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:24.021857) EVENT_LOG_v1 {"time_micros": 1769851704021833, "job": 136, "event": "compaction_finished", "compaction_time_micros": 356702, "compaction_time_cpu_micros": 33468, "output_level": 6, "num_output_files": 1, "total_output_size": 12747214, "num_input_records": 12528, "num_output_records": 12006, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000212.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851704022204, "job": 136, "event": "table_file_deletion", "file_number": 212}
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000210.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851704024219, "job": 136, "event": "table_file_deletion", "file_number": 210}
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:23.591188) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:24.024354) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:24.024364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:24.024366) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:24.024369) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:24 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:28:24.024372) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:28:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:25.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:25.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:26 np0005603622 nova_compute[221301]: 2026-01-31 09:28:26.280 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:27.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:27 np0005603622 nova_compute[221301]: 2026-01-31 09:28:27.382 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:27.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:29.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:29.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:28:30.951 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:28:30.952 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:28:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:28:30.952 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:28:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:31.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:31 np0005603622 nova_compute[221301]: 2026-01-31 09:28:31.281 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:31.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:32 np0005603622 nova_compute[221301]: 2026-01-31 09:28:32.385 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:33.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:33.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:33 np0005603622 nova_compute[221301]: 2026-01-31 09:28:33.459 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:35.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:35 np0005603622 nova_compute[221301]: 2026-01-31 09:28:35.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:35.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:36 np0005603622 nova_compute[221301]: 2026-01-31 09:28:36.283 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:36 np0005603622 nova_compute[221301]: 2026-01-31 09:28:36.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:36 np0005603622 nova_compute[221301]: 2026-01-31 09:28:36.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:28:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:37.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:37 np0005603622 nova_compute[221301]: 2026-01-31 09:28:37.387 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:37.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:39.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:39 np0005603622 podman[322344]: 2026-01-31 09:28:39.174541648 +0000 UTC m=+0.090585387 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 31 04:28:39 np0005603622 podman[322343]: 2026-01-31 09:28:39.181225548 +0000 UTC m=+0.097388830 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:28:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:39.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:41.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:41 np0005603622 nova_compute[221301]: 2026-01-31 09:28:41.287 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:28:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:41.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:28:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.390 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.445 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.446 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.446 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.446 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:28:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:28:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1262418939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:28:42 np0005603622 nova_compute[221301]: 2026-01-31 09:28:42.922 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:28:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:43.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.095 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.097 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4191MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.097 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.098 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.189 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.189 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.225 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:28:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:43.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:28:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3226864818' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.640 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.648 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.681 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.684 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:28:43 np0005603622 nova_compute[221301]: 2026-01-31 09:28:43.684 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:28:44 np0005603622 nova_compute[221301]: 2026-01-31 09:28:44.686 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:44 np0005603622 nova_compute[221301]: 2026-01-31 09:28:44.686 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:28:44 np0005603622 nova_compute[221301]: 2026-01-31 09:28:44.687 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:28:44 np0005603622 nova_compute[221301]: 2026-01-31 09:28:44.703 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:28:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:45.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:45.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:46 np0005603622 nova_compute[221301]: 2026-01-31 09:28:46.288 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:46 np0005603622 nova_compute[221301]: 2026-01-31 09:28:46.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:47.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:47 np0005603622 nova_compute[221301]: 2026-01-31 09:28:47.393 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:47 np0005603622 nova_compute[221301]: 2026-01-31 09:28:47.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:47.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:49.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:49 np0005603622 nova_compute[221301]: 2026-01-31 09:28:49.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:49 np0005603622 nova_compute[221301]: 2026-01-31 09:28:49.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:28:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:49.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:51.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:51 np0005603622 nova_compute[221301]: 2026-01-31 09:28:51.290 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:51.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:52 np0005603622 nova_compute[221301]: 2026-01-31 09:28:52.396 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:53.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 04:28:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:53.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 04:28:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:55.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:55.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:28:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:28:55 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:28:56 np0005603622 nova_compute[221301]: 2026-01-31 09:28:56.292 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:56 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:28:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:57.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:57 np0005603622 nova_compute[221301]: 2026-01-31 09:28:57.398 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:28:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:57.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:28:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:28:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:28:59.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:28:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:28:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:28:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:28:59.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:01.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:01 np0005603622 nova_compute[221301]: 2026-01-31 09:29:01.296 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:01.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:01 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:02 np0005603622 nova_compute[221301]: 2026-01-31 09:29:02.402 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:03.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:03.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:05.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:05.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:06 np0005603622 nova_compute[221301]: 2026-01-31 09:29:06.297 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:07.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:07 np0005603622 nova_compute[221301]: 2026-01-31 09:29:07.407 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:07.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:29:08 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:29:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:09.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:09.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:10 np0005603622 podman[322614]: 2026-01-31 09:29:10.148336101 +0000 UTC m=+0.066453777 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 31 04:29:10 np0005603622 podman[322613]: 2026-01-31 09:29:10.171646568 +0000 UTC m=+0.095817857 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 31 04:29:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:11.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:11 np0005603622 nova_compute[221301]: 2026-01-31 09:29:11.299 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:11.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:12 np0005603622 nova_compute[221301]: 2026-01-31 09:29:12.416 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:13.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:13.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:15.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:15.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:16 np0005603622 nova_compute[221301]: 2026-01-31 09:29:16.300 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:17.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:17 np0005603622 nova_compute[221301]: 2026-01-31 09:29:17.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:17.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:19.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:19.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:21.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:21 np0005603622 nova_compute[221301]: 2026-01-31 09:29:21.303 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:21.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:22 np0005603622 nova_compute[221301]: 2026-01-31 09:29:22.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:23.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:23.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:25.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:25.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:26 np0005603622 nova_compute[221301]: 2026-01-31 09:29:26.305 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:27.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:27 np0005603622 nova_compute[221301]: 2026-01-31 09:29:27.430 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:27.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:29.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:29.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:29:30.952 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:29:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:29:30.952 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:29:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:29:30.952 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:29:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:31.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:31 np0005603622 nova_compute[221301]: 2026-01-31 09:29:31.306 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:31.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:32 np0005603622 nova_compute[221301]: 2026-01-31 09:29:32.433 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:33.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:33 np0005603622 nova_compute[221301]: 2026-01-31 09:29:33.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:29:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:33.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:29:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:35.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:35 np0005603622 nova_compute[221301]: 2026-01-31 09:29:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:35.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:36 np0005603622 nova_compute[221301]: 2026-01-31 09:29:36.308 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:37.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:37 np0005603622 nova_compute[221301]: 2026-01-31 09:29:37.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:37 np0005603622 nova_compute[221301]: 2026-01-31 09:29:37.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:29:37 np0005603622 nova_compute[221301]: 2026-01-31 09:29:37.435 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:37.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:39.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:39.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:41.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:41 np0005603622 podman[322659]: 2026-01-31 09:29:41.172761154 +0000 UTC m=+0.097362349 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Jan 31 04:29:41 np0005603622 podman[322660]: 2026-01-31 09:29:41.192854205 +0000 UTC m=+0.106712781 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:29:41 np0005603622 nova_compute[221301]: 2026-01-31 09:29:41.310 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:41.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:42 np0005603622 nova_compute[221301]: 2026-01-31 09:29:42.437 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:29:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:43.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:44 np0005603622 nova_compute[221301]: 2026-01-31 09:29:44.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:29:44 np0005603622 nova_compute[221301]: 2026-01-31 09:29:44.427 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:29:44 np0005603622 nova_compute[221301]: 2026-01-31 09:29:44.428 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:29:44 np0005603622 nova_compute[221301]: 2026-01-31 09:29:44.428 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:29:44 np0005603622 nova_compute[221301]: 2026-01-31 09:29:44.428 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:29:44 np0005603622 nova_compute[221301]: 2026-01-31 09:29:44.429 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:29:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:29:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3438894314' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:29:44 np0005603622 nova_compute[221301]: 2026-01-31 09:29:44.870 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.044 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.045 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4199MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.045 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.046 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:29:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:45.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.162 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.162 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.196 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 31 04:29:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:45.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:29:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/891272076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.649 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.654 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.685 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.688 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 31 04:29:45 np0005603622 nova_compute[221301]: 2026-01-31 09:29:45.688 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:29:46 np0005603622 nova_compute[221301]: 2026-01-31 09:29:46.311 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:29:46 np0005603622 nova_compute[221301]: 2026-01-31 09:29:46.688 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:29:46 np0005603622 nova_compute[221301]: 2026-01-31 09:29:46.689 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 04:29:46 np0005603622 nova_compute[221301]: 2026-01-31 09:29:46.689 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 04:29:46 np0005603622 nova_compute[221301]: 2026-01-31 09:29:46.734 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 31 04:29:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:47.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:47 np0005603622 nova_compute[221301]: 2026-01-31 09:29:47.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:29:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:47.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:48 np0005603622 nova_compute[221301]: 2026-01-31 09:29:48.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:29:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:49.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:49 np0005603622 nova_compute[221301]: 2026-01-31 09:29:49.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:29:49 np0005603622 nova_compute[221301]: 2026-01-31 09:29:49.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:29:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:49.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:50 np0005603622 nova_compute[221301]: 2026-01-31 09:29:50.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:29:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:51 np0005603622 nova_compute[221301]: 2026-01-31 09:29:51.314 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:29:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:51.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:52 np0005603622 nova_compute[221301]: 2026-01-31 09:29:52.443 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:29:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:53.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:53.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:55.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:56 np0005603622 nova_compute[221301]: 2026-01-31 09:29:56.317 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:29:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:57.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:29:57 np0005603622 nova_compute[221301]: 2026-01-31 09:29:57.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:29:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:29:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:57.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:29:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:29:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:29:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:29:59.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:29:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:29:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:29:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:29:59.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:00 np0005603622 ceph-mon[81571]: overall HEALTH_OK
Jan 31 04:30:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:01.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:01 np0005603622 nova_compute[221301]: 2026-01-31 09:30:01.320 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:01.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:02 np0005603622 nova_compute[221301]: 2026-01-31 09:30:02.449 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:03.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:03.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:04 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:04 np0005603622 podman[322923]: 2026-01-31 09:30:04.903351893 +0000 UTC m=+0.067561578 container exec 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 31 04:30:05 np0005603622 podman[322923]: 2026-01-31 09:30:05.002584302 +0000 UTC m=+0.166793977 container exec_died 435a482d140bc871c822475fe5add5e7ea8667a2b188d6d56ea56806c6beef79 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-2f5ab832-5f2e-5a84-bd93-cf8bab960ee2-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 31 04:30:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:05.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:05.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:06 np0005603622 nova_compute[221301]: 2026-01-31 09:30:06.322 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:30:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:06 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:30:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:07.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:07 np0005603622 nova_compute[221301]: 2026-01-31 09:30:07.452 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:07.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:09.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:09.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:11.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:11 np0005603622 nova_compute[221301]: 2026-01-31 09:30:11.324 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:11.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:12 np0005603622 podman[323175]: 2026-01-31 09:30:12.157092899 +0000 UTC m=+0.083036665 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 31 04:30:12 np0005603622 podman[323176]: 2026-01-31 09:30:12.15713978 +0000 UTC m=+0.082847289 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 31 04:30:12 np0005603622 nova_compute[221301]: 2026-01-31 09:30:12.454 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:13.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:13.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:14 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:30:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:30:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:15.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:30:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:15.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:16 np0005603622 nova_compute[221301]: 2026-01-31 09:30:16.327 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:17.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:17 np0005603622 nova_compute[221301]: 2026-01-31 09:30:17.458 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:17.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:19.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:19.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:21.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:21 np0005603622 nova_compute[221301]: 2026-01-31 09:30:21.329 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:21.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:22 np0005603622 nova_compute[221301]: 2026-01-31 09:30:22.461 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:23.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:23.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:25.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:25.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:26 np0005603622 nova_compute[221301]: 2026-01-31 09:30:26.331 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:26 np0005603622 nova_compute[221301]: 2026-01-31 09:30:26.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:30:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:27.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:27 np0005603622 nova_compute[221301]: 2026-01-31 09:30:27.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:27.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:29.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:29.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:30:30.953 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 31 04:30:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:30:30.953 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 31 04:30:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:30:30.953 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 31 04:30:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:31.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:31 np0005603622 nova_compute[221301]: 2026-01-31 09:30:31.333 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:31.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:32 np0005603622 nova_compute[221301]: 2026-01-31 09:30:32.467 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:33.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:33 np0005603622 nova_compute[221301]: 2026-01-31 09:30:33.409 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:30:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:33.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:35.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:35 np0005603622 nova_compute[221301]: 2026-01-31 09:30:35.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:30:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:35.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:36 np0005603622 nova_compute[221301]: 2026-01-31 09:30:36.335 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:37.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:37 np0005603622 nova_compute[221301]: 2026-01-31 09:30:37.470 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:37.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:39.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:39 np0005603622 nova_compute[221301]: 2026-01-31 09:30:39.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:30:39 np0005603622 nova_compute[221301]: 2026-01-31 09:30:39.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 31 04:30:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:39.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:41.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:41 np0005603622 nova_compute[221301]: 2026-01-31 09:30:41.337 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:41.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:42 np0005603622 nova_compute[221301]: 2026-01-31 09:30:42.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:43 np0005603622 podman[323271]: 2026-01-31 09:30:43.146232041 +0000 UTC m=+0.072020408 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 31 04:30:43 np0005603622 podman[323270]: 2026-01-31 09:30:43.166961978 +0000 UTC m=+0.089515988 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, org.label-schema.build-date=20260127)
Jan 31 04:30:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:43.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:43.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:45.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:45.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #214. Immutable memtables: 0.
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.308211) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 137] Flushing memtable with next log file: 214
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846308254, "job": 137, "event": "flush_started", "num_memtables": 1, "num_entries": 1545, "num_deletes": 251, "total_data_size": 3643063, "memory_usage": 3674800, "flush_reason": "Manual Compaction"}
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 137] Level-0 flush table #215: started
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846324672, "cf_name": "default", "job": 137, "event": "table_file_creation", "file_number": 215, "file_size": 2392709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 102265, "largest_seqno": 103805, "table_properties": {"data_size": 2386157, "index_size": 3750, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13683, "raw_average_key_size": 20, "raw_value_size": 2373102, "raw_average_value_size": 3479, "num_data_blocks": 166, "num_entries": 682, "num_filter_entries": 682, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851704, "oldest_key_time": 1769851704, "file_creation_time": 1769851846, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 215, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 137] Flush lasted 16535 microseconds, and 5289 cpu microseconds.
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.324739) [db/flush_job.cc:967] [default] [JOB 137] Level-0 flush table #215: 2392709 bytes OK
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.324769) [db/memtable_list.cc:519] [default] Level-0 commit table #215 started
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.326985) [db/memtable_list.cc:722] [default] Level-0 commit table #215: memtable #1 done
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.326998) EVENT_LOG_v1 {"time_micros": 1769851846326993, "job": 137, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.327022) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 137] Try to delete WAL files size 3636029, prev total WAL file size 3636029, number of live WAL files 2.
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000211.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.327588) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039303336' seq:72057594037927935, type:22 .. '7061786F730039323838' seq:0, type:0; will stop at (end)
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 138] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 137 Base level 0, inputs: [215(2336KB)], [213(12MB)]
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846327624, "job": 138, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [215], "files_L6": [213], "score": -1, "input_data_size": 15139923, "oldest_snapshot_seqno": -1}
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.338 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 138] Generated table #216: 12171 keys, 13038619 bytes, temperature: kUnknown
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846423251, "cf_name": "default", "job": 138, "event": "table_file_creation", "file_number": 216, "file_size": 13038619, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12964112, "index_size": 42970, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 323369, "raw_average_key_size": 26, "raw_value_size": 12755803, "raw_average_value_size": 1048, "num_data_blocks": 1614, "num_entries": 12171, "num_filter_entries": 12171, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769851846, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 216, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.423630) [db/compaction/compaction_job.cc:1663] [default] [JOB 138] Compacted 1@0 + 1@6 files to L6 => 13038619 bytes
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.425700) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.0 rd, 136.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 12.2 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(11.8) write-amplify(5.4) OK, records in: 12688, records dropped: 517 output_compression: NoCompression
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.425727) EVENT_LOG_v1 {"time_micros": 1769851846425714, "job": 138, "event": "compaction_finished", "compaction_time_micros": 95814, "compaction_time_cpu_micros": 27280, "output_level": 6, "num_output_files": 1, "total_output_size": 13038619, "num_input_records": 12688, "num_output_records": 12171, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000215.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846426355, "job": 138, "event": "table_file_deletion", "file_number": 215}
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000213.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769851846428347, "job": 138, "event": "table_file_deletion", "file_number": 213}
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.327525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.428425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.428431) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.428433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.428437) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:30:46.428440) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.446 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.446 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.506 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.507 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.507 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.507 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.507 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:30:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/346971123' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:30:46 np0005603622 nova_compute[221301]: 2026-01-31 09:30:46.975 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.112 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.113 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4195MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.113 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.113 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.175 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.176 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.191 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:30:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:47.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.476 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:47.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:47 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:30:47 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/363606253' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.706 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.710 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.736 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.738 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:30:47 np0005603622 nova_compute[221301]: 2026-01-31 09:30:47.738 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:30:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:49.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:49.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:50 np0005603622 nova_compute[221301]: 2026-01-31 09:30:50.688 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:50 np0005603622 nova_compute[221301]: 2026-01-31 09:30:50.689 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:51.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:30:51 np0005603622 nova_compute[221301]: 2026-01-31 09:30:51.342 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:51 np0005603622 nova_compute[221301]: 2026-01-31 09:30:51.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:51.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:52 np0005603622 nova_compute[221301]: 2026-01-31 09:30:52.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:30:52 np0005603622 nova_compute[221301]: 2026-01-31 09:30:52.479 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:53.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:53.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:55.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:55.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:56 np0005603622 nova_compute[221301]: 2026-01-31 09:30:56.343 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:57.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:57 np0005603622 nova_compute[221301]: 2026-01-31 09:30:57.482 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:30:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:57.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:30:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:30:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:30:59.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:30:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:30:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:30:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:30:59.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:01.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:01 np0005603622 nova_compute[221301]: 2026-01-31 09:31:01.345 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:01.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:02 np0005603622 nova_compute[221301]: 2026-01-31 09:31:02.486 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 31 04:31:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:03.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 31 04:31:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:03.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:05.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:05.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:06 np0005603622 nova_compute[221301]: 2026-01-31 09:31:06.346 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:07.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:07 np0005603622 nova_compute[221301]: 2026-01-31 09:31:07.488 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:07.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:09.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:09.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:11.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:11 np0005603622 nova_compute[221301]: 2026-01-31 09:31:11.348 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:11.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:12 np0005603622 nova_compute[221301]: 2026-01-31 09:31:12.492 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:13.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:13 np0005603622 podman[323381]: 2026-01-31 09:31:13.68367122 +0000 UTC m=+0.050196271 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:31:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:13.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:13 np0005603622 podman[323380]: 2026-01-31 09:31:13.711926129 +0000 UTC m=+0.079856558 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:31:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:15.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:15.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:16 np0005603622 nova_compute[221301]: 2026-01-31 09:31:16.350 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:31:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:16 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:31:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:17.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:17 np0005603622 nova_compute[221301]: 2026-01-31 09:31:17.495 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:17.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:19.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:19.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:21 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:31:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:21.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:21 np0005603622 nova_compute[221301]: 2026-01-31 09:31:21.352 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:21.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:22 np0005603622 nova_compute[221301]: 2026-01-31 09:31:22.498 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:23.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:23.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:25.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:25.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:26 np0005603622 nova_compute[221301]: 2026-01-31 09:31:26.353 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:27.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:27 np0005603622 nova_compute[221301]: 2026-01-31 09:31:27.501 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:27.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:29.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:29.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:31:30.954 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:31:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:31:30.954 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:31:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:31:30.954 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:31:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:31.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:31 np0005603622 nova_compute[221301]: 2026-01-31 09:31:31.356 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:31.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:32 np0005603622 nova_compute[221301]: 2026-01-31 09:31:32.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:33.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:33.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:34 np0005603622 nova_compute[221301]: 2026-01-31 09:31:34.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:35.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:35.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:36 np0005603622 nova_compute[221301]: 2026-01-31 09:31:36.359 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:37.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:37 np0005603622 nova_compute[221301]: 2026-01-31 09:31:37.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:37 np0005603622 nova_compute[221301]: 2026-01-31 09:31:37.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:37.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:39.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:39 np0005603622 nova_compute[221301]: 2026-01-31 09:31:39.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:39 np0005603622 nova_compute[221301]: 2026-01-31 09:31:39.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:31:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:39.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:41.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:41 np0005603622 nova_compute[221301]: 2026-01-31 09:31:41.362 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:41.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:42 np0005603622 nova_compute[221301]: 2026-01-31 09:31:42.512 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:43.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:43.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:44 np0005603622 podman[323585]: 2026-01-31 09:31:44.151269416 +0000 UTC m=+0.065680197 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 31 04:31:44 np0005603622 podman[323584]: 2026-01-31 09:31:44.20050032 +0000 UTC m=+0.118391325 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127)
Jan 31 04:31:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:45.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:45.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:46 np0005603622 nova_compute[221301]: 2026-01-31 09:31:46.368 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:47.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:47 np0005603622 nova_compute[221301]: 2026-01-31 09:31:47.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:47 np0005603622 nova_compute[221301]: 2026-01-31 09:31:47.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:31:47 np0005603622 nova_compute[221301]: 2026-01-31 09:31:47.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:31:47 np0005603622 nova_compute[221301]: 2026-01-31 09:31:47.421 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:31:47 np0005603622 nova_compute[221301]: 2026-01-31 09:31:47.515 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:47.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:48 np0005603622 nova_compute[221301]: 2026-01-31 09:31:48.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:48 np0005603622 nova_compute[221301]: 2026-01-31 09:31:48.432 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:31:48 np0005603622 nova_compute[221301]: 2026-01-31 09:31:48.433 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:31:48 np0005603622 nova_compute[221301]: 2026-01-31 09:31:48.433 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:31:48 np0005603622 nova_compute[221301]: 2026-01-31 09:31:48.433 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:31:48 np0005603622 nova_compute[221301]: 2026-01-31 09:31:48.434 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:31:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:31:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1653743701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.051 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.229 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.231 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4205MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.231 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.231 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.321 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.322 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:31:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:49.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.469 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:31:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:31:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:49.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:31:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:31:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/506467788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.945 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.954 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.982 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.984 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:31:49 np0005603622 nova_compute[221301]: 2026-01-31 09:31:49.985 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:31:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:51.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:51 np0005603622 nova_compute[221301]: 2026-01-31 09:31:51.370 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:51.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:51 np0005603622 nova_compute[221301]: 2026-01-31 09:31:51.986 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:51 np0005603622 nova_compute[221301]: 2026-01-31 09:31:51.987 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:52 np0005603622 nova_compute[221301]: 2026-01-31 09:31:52.518 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:53.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:53 np0005603622 nova_compute[221301]: 2026-01-31 09:31:53.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:53 np0005603622 nova_compute[221301]: 2026-01-31 09:31:53.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:53.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:55.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:31:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:55.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:31:56 np0005603622 nova_compute[221301]: 2026-01-31 09:31:56.372 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:57.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:57 np0005603622 nova_compute[221301]: 2026-01-31 09:31:57.523 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:31:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:57.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:31:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:31:59.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:31:59 np0005603622 nova_compute[221301]: 2026-01-31 09:31:59.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:31:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:31:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:31:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:31:59.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:01.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:01 np0005603622 nova_compute[221301]: 2026-01-31 09:32:01.375 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:01.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:32:02 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 70K writes, 279K keys, 70K commit groups, 1.0 writes per commit group, ingest: 0.27 GB, 0.04 MB/s#012Cumulative WAL: 70K writes, 26K syncs, 2.71 writes per sync, written: 0.27 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 599 writes, 911 keys, 599 commit groups, 1.0 writes per commit group, ingest: 0.29 MB, 0.00 MB/s#012Interval WAL: 599 writes, 299 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:32:02 np0005603622 nova_compute[221301]: 2026-01-31 09:32:02.525 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:03.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:03.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:05.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:05.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:06 np0005603622 nova_compute[221301]: 2026-01-31 09:32:06.377 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:07.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:07 np0005603622 nova_compute[221301]: 2026-01-31 09:32:07.528 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:07.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:09.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:09.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:32:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:11.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:32:11 np0005603622 nova_compute[221301]: 2026-01-31 09:32:11.378 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:11.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:12 np0005603622 nova_compute[221301]: 2026-01-31 09:32:12.531 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:13.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:13.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:15 np0005603622 podman[323677]: 2026-01-31 09:32:15.118642511 +0000 UTC m=+0.048603719 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 31 04:32:15 np0005603622 podman[323676]: 2026-01-31 09:32:15.142706768 +0000 UTC m=+0.072546662 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 31 04:32:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:15.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:15.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:16 np0005603622 nova_compute[221301]: 2026-01-31 09:32:16.380 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:17.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:17 np0005603622 nova_compute[221301]: 2026-01-31 09:32:17.533 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:17.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:19.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:19.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:21 np0005603622 nova_compute[221301]: 2026-01-31 09:32:21.381 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:21.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:21.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:32:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:32:22 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:32:22 np0005603622 nova_compute[221301]: 2026-01-31 09:32:22.537 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:23.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:23.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:25.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:25.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:26 np0005603622 nova_compute[221301]: 2026-01-31 09:32:26.384 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:26 np0005603622 nova_compute[221301]: 2026-01-31 09:32:26.409 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:32:27 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.0 total, 600.0 interval#012Cumulative writes: 21K writes, 104K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.21 GB, 0.03 MB/s#012Cumulative WAL: 21K writes, 21K syncs, 1.00 writes per sync, written: 0.21 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1468 writes, 6936 keys, 1468 commit groups, 1.0 writes per commit group, ingest: 15.34 MB, 0.03 MB/s#012Interval WAL: 1467 writes, 1467 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     36.6      3.58              0.42        69    0.052       0      0       0.0       0.0#012  L6      1/0   12.43 MB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   5.6     84.5     72.7     10.04              2.33        68    0.148    578K    36K       0.0       0.0#012 Sum      1/0   12.43 MB   0.0      0.8     0.1      0.7       0.8      0.1       0.0   6.6     62.3     63.2     13.62              2.74       137    0.099    578K    36K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6     62.5     63.0      1.17              0.23        10    0.117     63K   2553       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   0.0     84.5     72.7     10.04              2.33        68    0.148    578K    36K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     36.6      3.58              0.42        68    0.053       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7800.0 total, 600.0 interval#012Flush(GB): cumulative 0.128, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.84 GB write, 0.11 MB/s write, 0.83 GB read, 0.11 MB/s read, 13.6 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cec8fd91f0#2 capacity: 304.00 MB usage: 93.28 MB table_size: 0 occupancy: 18446744073709551615 collections: 14 last_copies: 0 last_secs: 0.001154 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(5727,89.21 MB,29.3443%) FilterBlock(137,1.56 MB,0.514015%) IndexBlock(137,2.51 MB,0.825144%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 31 04:32:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:27.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:27 np0005603622 nova_compute[221301]: 2026-01-31 09:32:27.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:27 np0005603622 nova_compute[221301]: 2026-01-31 09:32:27.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 31 04:32:27 np0005603622 nova_compute[221301]: 2026-01-31 09:32:27.540 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:27.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:32:28 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:32:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:29.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:29.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:32:30.955 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:32:30.956 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:32:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:32:30.956 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:32:31 np0005603622 nova_compute[221301]: 2026-01-31 09:32:31.386 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:31.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:31.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:32 np0005603622 nova_compute[221301]: 2026-01-31 09:32:32.546 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:33.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:33.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:35.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:35 np0005603622 nova_compute[221301]: 2026-01-31 09:32:35.410 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:36 np0005603622 nova_compute[221301]: 2026-01-31 09:32:36.388 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:36 np0005603622 nova_compute[221301]: 2026-01-31 09:32:36.959 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:37.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:37 np0005603622 nova_compute[221301]: 2026-01-31 09:32:37.414 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:37 np0005603622 nova_compute[221301]: 2026-01-31 09:32:37.549 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:37.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:39.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:39.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:40 np0005603622 nova_compute[221301]: 2026-01-31 09:32:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:40 np0005603622 nova_compute[221301]: 2026-01-31 09:32:40.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:32:40 np0005603622 nova_compute[221301]: 2026-01-31 09:32:40.398 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:40 np0005603622 nova_compute[221301]: 2026-01-31 09:32:40.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 31 04:32:40 np0005603622 nova_compute[221301]: 2026-01-31 09:32:40.510 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 31 04:32:41 np0005603622 nova_compute[221301]: 2026-01-31 09:32:41.391 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:41.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:41.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:42 np0005603622 nova_compute[221301]: 2026-01-31 09:32:42.553 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:43.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:43.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:45.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:45.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:46 np0005603622 podman[323901]: 2026-01-31 09:32:46.156187305 +0000 UTC m=+0.065316948 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:32:46 np0005603622 podman[323900]: 2026-01-31 09:32:46.167028706 +0000 UTC m=+0.086065215 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
managed_by=edpm_ansible, container_name=ovn_controller)
Jan 31 04:32:46 np0005603622 nova_compute[221301]: 2026-01-31 09:32:46.392 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:47.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:47 np0005603622 nova_compute[221301]: 2026-01-31 09:32:47.510 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:47 np0005603622 nova_compute[221301]: 2026-01-31 09:32:47.511 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:32:47 np0005603622 nova_compute[221301]: 2026-01-31 09:32:47.511 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:32:47 np0005603622 nova_compute[221301]: 2026-01-31 09:32:47.525 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:32:47 np0005603622 nova_compute[221301]: 2026-01-31 09:32:47.555 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:47.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:48 np0005603622 nova_compute[221301]: 2026-01-31 09:32:48.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:48 np0005603622 nova_compute[221301]: 2026-01-31 09:32:48.421 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:32:48 np0005603622 nova_compute[221301]: 2026-01-31 09:32:48.421 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:32:48 np0005603622 nova_compute[221301]: 2026-01-31 09:32:48.421 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:32:48 np0005603622 nova_compute[221301]: 2026-01-31 09:32:48.422 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:32:48 np0005603622 nova_compute[221301]: 2026-01-31 09:32:48.422 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:32:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:32:48 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1371436130' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:32:48 np0005603622 nova_compute[221301]: 2026-01-31 09:32:48.867 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.011 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.013 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4219MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.013 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.014 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.076 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.076 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.196 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing inventories for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.318 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating ProviderTree inventory for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.318 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Updating inventory in ProviderTree for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.334 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing aggregate associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.356 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Refreshing trait associations for resource provider f7fd90d1-7583-42ff-b709-f5fc55f6e273, traits: HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.373 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:32:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:49.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:32:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1063644442' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.830 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:32:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:49.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.838 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.857 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.860 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:32:49 np0005603622 nova_compute[221301]: 2026-01-31 09:32:49.860 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:32:51 np0005603622 nova_compute[221301]: 2026-01-31 09:32:51.394 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:51.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:51.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:51 np0005603622 nova_compute[221301]: 2026-01-31 09:32:51.861 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:52 np0005603622 nova_compute[221301]: 2026-01-31 09:32:52.557 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:53 np0005603622 nova_compute[221301]: 2026-01-31 09:32:53.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:53 np0005603622 nova_compute[221301]: 2026-01-31 09:32:53.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:53 np0005603622 nova_compute[221301]: 2026-01-31 09:32:53.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:32:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:53.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:53.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:55.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:55.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:56 np0005603622 nova_compute[221301]: 2026-01-31 09:32:56.397 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:57.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:57 np0005603622 nova_compute[221301]: 2026-01-31 09:32:57.561 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:32:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:57.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:32:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:32:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:32:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:32:59.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:32:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:32:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:32:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:32:59.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:01 np0005603622 nova_compute[221301]: 2026-01-31 09:33:01.401 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:01.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:01.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:02 np0005603622 nova_compute[221301]: 2026-01-31 09:33:02.564 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:03.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:03.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:05.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:33:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:05.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:33:06 np0005603622 nova_compute[221301]: 2026-01-31 09:33:06.404 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:07.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:07 np0005603622 nova_compute[221301]: 2026-01-31 09:33:07.567 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:07.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:09.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:09.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:11 np0005603622 nova_compute[221301]: 2026-01-31 09:33:11.404 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:11.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:11.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:12 np0005603622 nova_compute[221301]: 2026-01-31 09:33:12.569 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:13.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:13.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:15.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:15.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:16 np0005603622 nova_compute[221301]: 2026-01-31 09:33:16.407 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:17 np0005603622 podman[323996]: 2026-01-31 09:33:17.126238602 +0000 UTC m=+0.046448640 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 31 04:33:17 np0005603622 podman[323995]: 2026-01-31 09:33:17.155502969 +0000 UTC m=+0.080447214 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 31 04:33:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:17.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:17 np0005603622 nova_compute[221301]: 2026-01-31 09:33:17.571 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:17.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:19.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:19.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:21 np0005603622 nova_compute[221301]: 2026-01-31 09:33:21.409 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:21.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:21.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:22 np0005603622 nova_compute[221301]: 2026-01-31 09:33:22.574 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:23.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:23.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:25.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:25.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:26 np0005603622 nova_compute[221301]: 2026-01-31 09:33:26.411 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:27.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:27 np0005603622 nova_compute[221301]: 2026-01-31 09:33:27.578 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:27.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:29 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:29.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:29.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:33:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:30 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:33:30.955 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:33:30.955 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:33:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:33:30.956 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:33:31 np0005603622 nova_compute[221301]: 2026-01-31 09:33:31.412 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:31.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:31.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:32 np0005603622 nova_compute[221301]: 2026-01-31 09:33:32.581 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:33.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:33.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:35.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:35.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:33:36 np0005603622 nova_compute[221301]: 2026-01-31 09:33:36.414 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:37 np0005603622 nova_compute[221301]: 2026-01-31 09:33:37.393 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:37 np0005603622 nova_compute[221301]: 2026-01-31 09:33:37.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:37.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:37 np0005603622 nova_compute[221301]: 2026-01-31 09:33:37.585 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:37.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:39.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:39.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:40 np0005603622 nova_compute[221301]: 2026-01-31 09:33:40.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:40 np0005603622 nova_compute[221301]: 2026-01-31 09:33:40.395 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:33:41 np0005603622 nova_compute[221301]: 2026-01-31 09:33:41.415 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:41.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:41.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:42 np0005603622 nova_compute[221301]: 2026-01-31 09:33:42.588 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:43.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:45.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:33:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:45.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:33:46 np0005603622 nova_compute[221301]: 2026-01-31 09:33:46.418 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:47.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:47 np0005603622 nova_compute[221301]: 2026-01-31 09:33:47.591 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:47.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:48 np0005603622 podman[324222]: 2026-01-31 09:33:48.148495506 +0000 UTC m=+0.065293218 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Jan 31 04:33:48 np0005603622 podman[324221]: 2026-01-31 09:33:48.216217597 +0000 UTC m=+0.132413982 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 31 04:33:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.398 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.453 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.453 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.491 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.492 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.492 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.493 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.493 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:33:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:49.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:33:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1728031325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:33:49 np0005603622 nova_compute[221301]: 2026-01-31 09:33:49.927 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:33:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:49.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.097 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.098 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4192MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.099 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.099 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.184 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.184 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.201 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:33:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:33:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3976869346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.644 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.649 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.697 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.698 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:33:50 np0005603622 nova_compute[221301]: 2026-01-31 09:33:50.699 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:33:51 np0005603622 nova_compute[221301]: 2026-01-31 09:33:51.421 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:51.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:51.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:52 np0005603622 nova_compute[221301]: 2026-01-31 09:33:52.595 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:53.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:53 np0005603622 nova_compute[221301]: 2026-01-31 09:33:53.641 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:53 np0005603622 nova_compute[221301]: 2026-01-31 09:33:53.643 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:53 np0005603622 nova_compute[221301]: 2026-01-31 09:33:53.643 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:33:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:53.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:33:55 np0005603622 nova_compute[221301]: 2026-01-31 09:33:55.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:33:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:55.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:55.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:56 np0005603622 nova_compute[221301]: 2026-01-31 09:33:56.424 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:57.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:57 np0005603622 nova_compute[221301]: 2026-01-31 09:33:57.598 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:33:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:57.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:33:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:33:59.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:33:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:33:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:33:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:33:59.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:01 np0005603622 nova_compute[221301]: 2026-01-31 09:34:01.426 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:01.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:01.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:02 np0005603622 nova_compute[221301]: 2026-01-31 09:34:02.600 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:03.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:03.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:05.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:06 np0005603622 nova_compute[221301]: 2026-01-31 09:34:06.429 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:07.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:07 np0005603622 nova_compute[221301]: 2026-01-31 09:34:07.603 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:07.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:09.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:09.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:11 np0005603622 nova_compute[221301]: 2026-01-31 09:34:11.432 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:11.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:11.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:12 np0005603622 nova_compute[221301]: 2026-01-31 09:34:12.606 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:13.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:13.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:15.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:15.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:16 np0005603622 nova_compute[221301]: 2026-01-31 09:34:16.434 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:17.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:17 np0005603622 nova_compute[221301]: 2026-01-31 09:34:17.610 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:17.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:19 np0005603622 podman[324311]: 2026-01-31 09:34:19.150734024 +0000 UTC m=+0.071484554 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 31 04:34:19 np0005603622 podman[324312]: 2026-01-31 09:34:19.182137669 +0000 UTC m=+0.089952871 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:34:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:19.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:19.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:21 np0005603622 nova_compute[221301]: 2026-01-31 09:34:21.436 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:21.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:21.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:22 np0005603622 nova_compute[221301]: 2026-01-31 09:34:22.612 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:23.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:23.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:25.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:25.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:26 np0005603622 nova_compute[221301]: 2026-01-31 09:34:26.439 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:27.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:27 np0005603622 nova_compute[221301]: 2026-01-31 09:34:27.615 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:27.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:29.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:29.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:34:30.957 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:34:30.957 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:34:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:34:30.958 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:34:31 np0005603622 nova_compute[221301]: 2026-01-31 09:34:31.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:31 np0005603622 nova_compute[221301]: 2026-01-31 09:34:31.440 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:31.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:31.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #217. Immutable memtables: 0.
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.325135) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 139] Flushing memtable with next log file: 217
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072325213, "job": 139, "event": "flush_started", "num_memtables": 1, "num_entries": 2356, "num_deletes": 251, "total_data_size": 5908252, "memory_usage": 5981728, "flush_reason": "Manual Compaction"}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 139] Level-0 flush table #218: started
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072356017, "cf_name": "default", "job": 139, "event": "table_file_creation", "file_number": 218, "file_size": 3865367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 103810, "largest_seqno": 106161, "table_properties": {"data_size": 3855846, "index_size": 6078, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19097, "raw_average_key_size": 20, "raw_value_size": 3836977, "raw_average_value_size": 4073, "num_data_blocks": 266, "num_entries": 942, "num_filter_entries": 942, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769851847, "oldest_key_time": 1769851847, "file_creation_time": 1769852072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 218, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 139] Flush lasted 30983 microseconds, and 10798 cpu microseconds.
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.356110) [db/flush_job.cc:967] [default] [JOB 139] Level-0 flush table #218: 3865367 bytes OK
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.356159) [db/memtable_list.cc:519] [default] Level-0 commit table #218 started
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.358562) [db/memtable_list.cc:722] [default] Level-0 commit table #218: memtable #1 done
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.358593) EVENT_LOG_v1 {"time_micros": 1769852072358583, "job": 139, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.358627) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 139] Try to delete WAL files size 5898073, prev total WAL file size 5898073, number of live WAL files 2.
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000214.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.360314) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039323837' seq:72057594037927935, type:22 .. '7061786F730039353339' seq:0, type:0; will stop at (end)
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 140] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 139 Base level 0, inputs: [218(3774KB)], [216(12MB)]
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072360390, "job": 140, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [218], "files_L6": [216], "score": -1, "input_data_size": 16903986, "oldest_snapshot_seqno": -1}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 140] Generated table #219: 12596 keys, 14843471 bytes, temperature: kUnknown
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072456773, "cf_name": "default", "job": 140, "event": "table_file_creation", "file_number": 219, "file_size": 14843471, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14764558, "index_size": 46288, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31557, "raw_key_size": 332896, "raw_average_key_size": 26, "raw_value_size": 14547219, "raw_average_value_size": 1154, "num_data_blocks": 1752, "num_entries": 12596, "num_filter_entries": 12596, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769852072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 219, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.458128) [db/compaction/compaction_job.cc:1663] [default] [JOB 140] Compacted 1@0 + 1@6 files to L6 => 14843471 bytes
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.460164) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.1 rd, 153.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 12.4 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(8.2) write-amplify(3.8) OK, records in: 13113, records dropped: 517 output_compression: NoCompression
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.460197) EVENT_LOG_v1 {"time_micros": 1769852072460181, "job": 140, "event": "compaction_finished", "compaction_time_micros": 96566, "compaction_time_cpu_micros": 32892, "output_level": 6, "num_output_files": 1, "total_output_size": 14843471, "num_input_records": 13113, "num_output_records": 12596, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000218.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072461079, "job": 140, "event": "table_file_deletion", "file_number": 218}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000216.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852072463178, "job": 140, "event": "table_file_deletion", "file_number": 216}
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.360179) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.463405) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.463414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.463417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.463418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:34:32.463420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:34:32 np0005603622 nova_compute[221301]: 2026-01-31 09:34:32.618 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:33.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:33.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:35.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:35.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:34:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:34:36 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:34:36 np0005603622 nova_compute[221301]: 2026-01-31 09:34:36.444 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:37.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:37 np0005603622 nova_compute[221301]: 2026-01-31 09:34:37.621 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:38.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:38 np0005603622 nova_compute[221301]: 2026-01-31 09:34:38.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:38 np0005603622 nova_compute[221301]: 2026-01-31 09:34:38.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:39.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:40.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:34:41 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:34:41 np0005603622 nova_compute[221301]: 2026-01-31 09:34:41.446 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:41.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:42.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:42 np0005603622 nova_compute[221301]: 2026-01-31 09:34:42.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:42 np0005603622 nova_compute[221301]: 2026-01-31 09:34:42.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:34:42 np0005603622 nova_compute[221301]: 2026-01-31 09:34:42.624 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:43.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:44.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:45.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:34:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:46.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:34:46 np0005603622 nova_compute[221301]: 2026-01-31 09:34:46.448 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:47.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:47 np0005603622 nova_compute[221301]: 2026-01-31 09:34:47.626 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:34:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:48.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:34:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:49 np0005603622 nova_compute[221301]: 2026-01-31 09:34:49.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:49 np0005603622 nova_compute[221301]: 2026-01-31 09:34:49.419 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:34:49 np0005603622 nova_compute[221301]: 2026-01-31 09:34:49.419 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:34:49 np0005603622 nova_compute[221301]: 2026-01-31 09:34:49.419 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:34:49 np0005603622 nova_compute[221301]: 2026-01-31 09:34:49.420 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:34:49 np0005603622 nova_compute[221301]: 2026-01-31 09:34:49.420 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:34:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:49.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:49 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:34:49 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/210150799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:34:49 np0005603622 nova_compute[221301]: 2026-01-31 09:34:49.894 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:34:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:50.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.044 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.045 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4209MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.046 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.046 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:34:50 np0005603622 podman[324562]: 2026-01-31 09:34:50.133159731 +0000 UTC m=+0.056365927 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.146 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.147 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.175 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:34:50 np0005603622 podman[324561]: 2026-01-31 09:34:50.179896707 +0000 UTC m=+0.102052145 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 31 04:34:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:34:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2461455368' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.602 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.609 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.630 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.632 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:34:50 np0005603622 nova_compute[221301]: 2026-01-31 09:34:50.633 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:34:51 np0005603622 nova_compute[221301]: 2026-01-31 09:34:51.450 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:51.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:51 np0005603622 nova_compute[221301]: 2026-01-31 09:34:51.634 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:51 np0005603622 nova_compute[221301]: 2026-01-31 09:34:51.635 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:34:51 np0005603622 nova_compute[221301]: 2026-01-31 09:34:51.635 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:34:51 np0005603622 nova_compute[221301]: 2026-01-31 09:34:51.663 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:34:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:52.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:52 np0005603622 nova_compute[221301]: 2026-01-31 09:34:52.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:52 np0005603622 nova_compute[221301]: 2026-01-31 09:34:52.629 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:53.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:54.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:55 np0005603622 nova_compute[221301]: 2026-01-31 09:34:55.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:55 np0005603622 nova_compute[221301]: 2026-01-31 09:34:55.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:55.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:34:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:56.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:34:56 np0005603622 nova_compute[221301]: 2026-01-31 09:34:56.457 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:57 np0005603622 nova_compute[221301]: 2026-01-31 09:34:57.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:34:57 np0005603622 nova_compute[221301]: 2026-01-31 09:34:57.634 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:34:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:57.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:34:58.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:34:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:34:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:34:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:34:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:34:59.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:00.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:01 np0005603622 nova_compute[221301]: 2026-01-31 09:35:01.460 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:01.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:02.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:02 np0005603622 nova_compute[221301]: 2026-01-31 09:35:02.639 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:03.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:04.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:05.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:06.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:06 np0005603622 nova_compute[221301]: 2026-01-31 09:35:06.462 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:07 np0005603622 nova_compute[221301]: 2026-01-31 09:35:07.642 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:07.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:08.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:09.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:10.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:11 np0005603622 nova_compute[221301]: 2026-01-31 09:35:11.464 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:11.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:12.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:12 np0005603622 nova_compute[221301]: 2026-01-31 09:35:12.644 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:13.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:14.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 31 04:35:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1513609383' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 31 04:35:14 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 31 04:35:14 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1513609383' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 31 04:35:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:15.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:16.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:16 np0005603622 nova_compute[221301]: 2026-01-31 09:35:16.467 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:17 np0005603622 nova_compute[221301]: 2026-01-31 09:35:17.648 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:17.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:18.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:35:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:19.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:35:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:20.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:21 np0005603622 podman[324632]: 2026-01-31 09:35:21.127150005 +0000 UTC m=+0.049528053 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 31 04:35:21 np0005603622 podman[324631]: 2026-01-31 09:35:21.188068974 +0000 UTC m=+0.110420791 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, tcib_managed=true)
Jan 31 04:35:21 np0005603622 nova_compute[221301]: 2026-01-31 09:35:21.470 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:21.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:22.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:22 np0005603622 nova_compute[221301]: 2026-01-31 09:35:22.652 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:23.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:24.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:25.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:26.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:26 np0005603622 nova_compute[221301]: 2026-01-31 09:35:26.473 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:27 np0005603622 nova_compute[221301]: 2026-01-31 09:35:27.658 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:27.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:28.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:29.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:30.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:35:30.958 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:35:30.959 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:35:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:35:30.959 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:35:31 np0005603622 nova_compute[221301]: 2026-01-31 09:35:31.476 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:31.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:32.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:32 np0005603622 nova_compute[221301]: 2026-01-31 09:35:32.663 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:33.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:34.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:35.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:36.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:36 np0005603622 nova_compute[221301]: 2026-01-31 09:35:36.478 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:37 np0005603622 nova_compute[221301]: 2026-01-31 09:35:37.667 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:37.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:38.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:39 np0005603622 nova_compute[221301]: 2026-01-31 09:35:39.391 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:35:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:39.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:35:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:40.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:40 np0005603622 nova_compute[221301]: 2026-01-31 09:35:40.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:41 np0005603622 nova_compute[221301]: 2026-01-31 09:35:41.481 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:41.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:42.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 31 04:35:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:35:42 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 31 04:35:42 np0005603622 nova_compute[221301]: 2026-01-31 09:35:42.670 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:43.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:44.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:44 np0005603622 nova_compute[221301]: 2026-01-31 09:35:44.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:44 np0005603622 nova_compute[221301]: 2026-01-31 09:35:44.397 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:35:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:45.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:46.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:46 np0005603622 nova_compute[221301]: 2026-01-31 09:35:46.483 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:47 np0005603622 nova_compute[221301]: 2026-01-31 09:35:47.674 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:47 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:47 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:35:47 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:47.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:35:48 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:48 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:48 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:48.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:35:48 np0005603622 ceph-mon[81571]: from='mgr.14134 192.168.122.100:0/2753939032' entity='mgr.compute-0.ddmhwk' 
Jan 31 04:35:48 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:49 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:49 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:49 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:50 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:50 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:50 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:50.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:50 np0005603622 nova_compute[221301]: 2026-01-31 09:35:50.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:50 np0005603622 nova_compute[221301]: 2026-01-31 09:35:50.534 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:35:50 np0005603622 nova_compute[221301]: 2026-01-31 09:35:50.534 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:35:50 np0005603622 nova_compute[221301]: 2026-01-31 09:35:50.534 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:35:50 np0005603622 nova_compute[221301]: 2026-01-31 09:35:50.534 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 31 04:35:50 np0005603622 nova_compute[221301]: 2026-01-31 09:35:50.535 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:35:50 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:35:50 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/579233382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.013 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.171 221324 WARNING nova.virt.libvirt.driver [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.172 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4193MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.172 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.173 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.392 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.393 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.418 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.486 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:51 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:51 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:51 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:51.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:51 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 31 04:35:51 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2446714764' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.864 221324 DEBUG oslo_concurrency.processutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.872 221324 DEBUG nova.compute.provider_tree [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed in ProviderTree for provider: f7fd90d1-7583-42ff-b709-f5fc55f6e273 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.969 221324 DEBUG nova.scheduler.client.report [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Inventory has not changed for provider f7fd90d1-7583-42ff-b709-f5fc55f6e273 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.973 221324 DEBUG nova.compute.resource_tracker [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 31 04:35:51 np0005603622 nova_compute[221301]: 2026-01-31 09:35:51.973 221324 DEBUG oslo_concurrency.lockutils [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:35:52 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:52 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:52 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:52.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:52 np0005603622 podman[324903]: 2026-01-31 09:35:52.160253229 +0000 UTC m=+0.060846098 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Jan 31 04:35:52 np0005603622 podman[324902]: 2026-01-31 09:35:52.199499615 +0000 UTC m=+0.100038602 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 31 04:35:52 np0005603622 nova_compute[221301]: 2026-01-31 09:35:52.677 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:52 np0005603622 nova_compute[221301]: 2026-01-31 09:35:52.974 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:52 np0005603622 nova_compute[221301]: 2026-01-31 09:35:52.974 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 31 04:35:52 np0005603622 nova_compute[221301]: 2026-01-31 09:35:52.974 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 31 04:35:52 np0005603622 nova_compute[221301]: 2026-01-31 09:35:52.995 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 31 04:35:53 np0005603622 nova_compute[221301]: 2026-01-31 09:35:53.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:53 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:53 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:53 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:53.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:53 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:54 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:54 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:54 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:54.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:55 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:55 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:55 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:55.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:56 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:56 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:56 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:56.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:56 np0005603622 nova_compute[221301]: 2026-01-31 09:35:56.397 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:56 np0005603622 nova_compute[221301]: 2026-01-31 09:35:56.488 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:57 np0005603622 nova_compute[221301]: 2026-01-31 09:35:57.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:57 np0005603622 nova_compute[221301]: 2026-01-31 09:35:57.396 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:35:57 np0005603622 nova_compute[221301]: 2026-01-31 09:35:57.680 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:35:57 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:57 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:57 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:57.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:35:58 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:58 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:35:58 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:35:58.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:35:58 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:35:59 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:35:59 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:35:59 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:35:59.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:00 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:00 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:36:00 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:00.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:36:01 np0005603622 nova_compute[221301]: 2026-01-31 09:36:01.490 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:01 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:01 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:01 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:01.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:02 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:02 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:02 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:02.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:02 np0005603622 nova_compute[221301]: 2026-01-31 09:36:02.684 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #220. Immutable memtables: 0.
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.552486) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 141] Flushing memtable with next log file: 220
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163552581, "job": 141, "event": "flush_started", "num_memtables": 1, "num_entries": 1123, "num_deletes": 256, "total_data_size": 2378085, "memory_usage": 2414944, "flush_reason": "Manual Compaction"}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 141] Level-0 flush table #221: started
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163561219, "cf_name": "default", "job": 141, "event": "table_file_creation", "file_number": 221, "file_size": 1557690, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 106166, "largest_seqno": 107284, "table_properties": {"data_size": 1552709, "index_size": 2504, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10776, "raw_average_key_size": 19, "raw_value_size": 1542693, "raw_average_value_size": 2799, "num_data_blocks": 109, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769852073, "oldest_key_time": 1769852073, "file_creation_time": 1769852163, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 221, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 141] Flush lasted 8885 microseconds, and 3961 cpu microseconds.
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.561380) [db/flush_job.cc:967] [default] [JOB 141] Level-0 flush table #221: 1557690 bytes OK
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.561405) [db/memtable_list.cc:519] [default] Level-0 commit table #221 started
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.563460) [db/memtable_list.cc:722] [default] Level-0 commit table #221: memtable #1 done
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.563486) EVENT_LOG_v1 {"time_micros": 1769852163563477, "job": 141, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.563513) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 141] Try to delete WAL files size 2372656, prev total WAL file size 2372656, number of live WAL files 2.
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000217.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.564210) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323638' seq:72057594037927935, type:22 .. '6C6F676D0034353230' seq:0, type:0; will stop at (end)
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 142] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 141 Base level 0, inputs: [221(1521KB)], [219(14MB)]
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163564265, "job": 142, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [221], "files_L6": [219], "score": -1, "input_data_size": 16401161, "oldest_snapshot_seqno": -1}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 142] Generated table #222: 12622 keys, 16203392 bytes, temperature: kUnknown
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163687116, "cf_name": "default", "job": 142, "event": "table_file_creation", "file_number": 222, "file_size": 16203392, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16122752, "index_size": 47951, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31621, "raw_key_size": 334493, "raw_average_key_size": 26, "raw_value_size": 15903274, "raw_average_value_size": 1259, "num_data_blocks": 1818, "num_entries": 12622, "num_filter_entries": 12622, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769852163, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 222, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.687464) [db/compaction/compaction_job.cc:1663] [default] [JOB 142] Compacted 1@0 + 1@6 files to L6 => 16203392 bytes
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.688805) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.4 rd, 131.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 14.2 +0.0 blob) out(15.5 +0.0 blob), read-write-amplify(20.9) write-amplify(10.4) OK, records in: 13147, records dropped: 525 output_compression: NoCompression
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.688825) EVENT_LOG_v1 {"time_micros": 1769852163688815, "job": 142, "event": "compaction_finished", "compaction_time_micros": 122937, "compaction_time_cpu_micros": 43261, "output_level": 6, "num_output_files": 1, "total_output_size": 16203392, "num_input_records": 13147, "num_output_records": 12622, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000221.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163689122, "job": 142, "event": "table_file_deletion", "file_number": 221}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000219.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852163691167, "job": 142, "event": "table_file_deletion", "file_number": 219}
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.564134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.691326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.691333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.691335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.691337) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:03.691338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:03 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:03 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:03 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:03 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:03.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:04 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:04 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:04 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:04.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:05 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:05 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:05 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:05.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:06 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:06 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:36:06 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:06.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:36:06 np0005603622 nova_compute[221301]: 2026-01-31 09:36:06.492 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:07 np0005603622 nova_compute[221301]: 2026-01-31 09:36:07.686 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:07 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:07 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:36:07 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:07.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:36:08 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:08 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:08 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:08.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:08 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:09 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:09 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:09 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:09.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:10 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:10 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:10 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:10.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:11 np0005603622 nova_compute[221301]: 2026-01-31 09:36:11.494 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:11 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:11 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:11 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:11.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:12 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:12 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:12 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:12.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:12 np0005603622 nova_compute[221301]: 2026-01-31 09:36:12.689 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:13 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:13 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:13 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:13 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:13.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:14 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:14 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:14 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:14.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:15 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:15 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:15 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:15.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:16 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:16 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:16 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:16.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:16 np0005603622 nova_compute[221301]: 2026-01-31 09:36:16.498 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:17 np0005603622 nova_compute[221301]: 2026-01-31 09:36:17.692 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:17 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:17 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:17 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:17.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:18 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:18 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:18 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:18.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:18 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:19 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:19 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:19 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:19.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:20 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:20 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:20 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:20.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:21 np0005603622 nova_compute[221301]: 2026-01-31 09:36:21.501 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:21 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:21 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:21 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:21.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:22 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:22 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:22 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:22.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:22 np0005603622 nova_compute[221301]: 2026-01-31 09:36:22.696 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:23 np0005603622 systemd-logind[817]: New session 63 of user zuul.
Jan 31 04:36:23 np0005603622 systemd[1]: Started Session 63 of User zuul.
Jan 31 04:36:23 np0005603622 podman[324950]: 2026-01-31 09:36:23.097275152 +0000 UTC m=+0.059032578 container health_status ccdab60238fc825d0a7d14707861dd5b762aa628cce99a03421f611d085090f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Jan 31 04:36:23 np0005603622 podman[324949]: 2026-01-31 09:36:23.103034777 +0000 UTC m=+0.073310423 container health_status 8be5106159b730bd73578ad97dea846dd82e8f8a38a78009e82a3c8f221a36e4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'c9c07335481e70451acb503caf3b3b3a05811a07f9fde1e24aebece19089a266-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921-b7d766b9c783f5fc20ce35eff0bec7f1d5238d4ee9bbf149387f1be0cee0d921'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller)
Jan 31 04:36:23 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:23 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:23 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:23 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:23.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:24 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:24 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:24 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:24.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:25 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:25 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:25 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:25.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:26 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:26 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:26 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:26.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:26 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 31 04:36:26 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2383351236' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 31 04:36:26 np0005603622 nova_compute[221301]: 2026-01-31 09:36:26.503 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:27 np0005603622 nova_compute[221301]: 2026-01-31 09:36:27.699 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:27 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:27 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:27 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:27.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:28 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:28 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:36:28 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:28.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:36:28 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:29 np0005603622 ovs-vsctl[325276]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 31 04:36:29 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:29 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:29 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:29.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:29 np0005603622 virtqemud[220925]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 31 04:36:29 np0005603622 virtqemud[220925]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 31 04:36:29 np0005603622 virtqemud[220925]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 31 04:36:30 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:30 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:30 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:30.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:30 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: cache status {prefix=cache status} (starting...)
Jan 31 04:36:30 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:30 np0005603622 lvm[325584]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 31 04:36:30 np0005603622 lvm[325584]: VG ceph_vg0 finished
Jan 31 04:36:30 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: client ls {prefix=client ls} (starting...)
Jan 31 04:36:30 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:36:30.960 139793 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 31 04:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:36:30.962 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 31 04:36:30 np0005603622 ovn_metadata_agent[139788]: 2026-01-31 09:36:30.962 139793 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: damage ls {prefix=damage ls} (starting...)
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 31 04:36:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3088801758' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump loads {prefix=dump loads} (starting...)
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:31 np0005603622 nova_compute[221301]: 2026-01-31 09:36:31.392 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:31 np0005603622 nova_compute[221301]: 2026-01-31 09:36:31.505 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:31 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 31 04:36:31 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/372616640' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:31 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:31 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:31 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:31.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 31 04:36:31 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:32 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 31 04:36:32 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 31 04:36:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/659945712' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 31 04:36:32 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:32 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:32 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:32.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:32 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 31 04:36:32 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:32 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: ops {prefix=ops} (starting...)
Jan 31 04:36:32 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 31 04:36:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1786550516' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 31 04:36:32 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 31 04:36:32 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1977441679' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 31 04:36:32 np0005603622 nova_compute[221301]: 2026-01-31 09:36:32.700 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3513337120' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:36:33 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: session ls {prefix=session ls} (starting...)
Jan 31 04:36:33 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs Can't run that command on an inactive MDS!
Jan 31 04:36:33 np0005603622 ceph-mds[84731]: mds.cephfs.compute-1.bkrghs asok_command: status {prefix=status} (starting...)
Jan 31 04:36:33 np0005603622 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/286076223' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:33 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:33 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:33 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:33.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2560085907' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:36:33 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2920063645' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:36:34 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:34 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:34 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:34.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 31 04:36:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/833461946' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 31 04:36:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 31 04:36:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3275024300' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 31 04:36:34 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 31 04:36:34 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2144918124' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1019927082' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2391431251' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2296346323' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 31 04:36:35 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:35 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:35 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:35.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 31 04:36:35 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2783406748' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 31 04:36:36 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:36 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:36 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:36.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 31 04:36:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1756318102' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c9d6000/0x0/0x1bfc00000, data 0x534cf26/0x5308000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5116563 data_alloc: 234881024 data_used: 30785536
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461914112 unmapped: 74866688 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb294daf00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb29274960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb27675c00 session 0x55bb2850c000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb292f21e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29b7b400 session 0x55bb294dbe00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2d87c400 session 0x55bb274730e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2d87c400 session 0x55bb294f9e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb27675c00 session 0x55bb27473c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a160000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5172177 data_alloc: 234881024 data_used: 30785536
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c3bb000/0x0/0x1bfc00000, data 0x5966f88/0x5923000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462036992 unmapped: 74743808 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c3bb000/0x0/0x1bfc00000, data 0x5966f88/0x5923000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 74735616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5172177 data_alloc: 234881024 data_used: 30785536
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c3bb000/0x0/0x1bfc00000, data 0x5966f88/0x5923000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 74735616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 74735616 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.317159653s of 18.586101532s, submitted: 71
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb2a098b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29e5a000 session 0x55bb2850de00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462192640 unmapped: 74588160 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780d400 session 0x55bb29616d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb27a16b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29b50800 session 0x55bb294dd2c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29694c00 session 0x55bb29350000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462200832 unmapped: 74579968 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462602240 unmapped: 74178560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5214363 data_alloc: 234881024 data_used: 36044800
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29e5a000 session 0x55bb2747c3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a161e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c396000/0x0/0x1bfc00000, data 0x598afab/0x5948000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462602240 unmapped: 74178560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c371000/0x0/0x1bfc00000, data 0x59aefba/0x596d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 462602240 unmapped: 74178560 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb29b50800 session 0x55bb29536b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463028224 unmapped: 73752576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2d87c400 session 0x55bb29735e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 heartbeat osd_stat(store_statfs(0x19c596000/0x0/0x1bfc00000, data 0x5789fba/0x5748000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463028224 unmapped: 73752576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 ms_handle_reset con 0x55bb2e506800 session 0x55bb27a22780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463028224 unmapped: 73752576 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5214697 data_alloc: 251658240 data_used: 38395904
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 handle_osd_map epochs [403,404], i have 403, src has [1,404]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 403 handle_osd_map epochs [404,404], i have 404, src has [1,404]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2cf43c00 session 0x55bb2a160780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb29b52800 session 0x55bb292f3e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2b7b7800 session 0x55bb29577680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2780c800 session 0x55bb26cb9e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 heartbeat osd_stat(store_statfs(0x19c5b1000/0x0/0x1bfc00000, data 0x5515bf5/0x572d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463036416 unmapped: 73744384 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.468164444s of 11.910409927s, submitted: 76
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 463290368 unmapped: 73490432 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5247418 data_alloc: 251658240 data_used: 38752256
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb29b50800 session 0x55bb28063c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 ms_handle_reset con 0x55bb2d87c400 session 0x55bb2968e780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461103104 unmapped: 75677696 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 461103104 unmapped: 75677696 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 heartbeat osd_stat(store_statfs(0x19d960000/0x0/0x1bfc00000, data 0x4165bf5/0x437d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 69074944 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 404 handle_osd_map epochs [404,405], i have 404, src has [1,405]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 467877888 unmapped: 68902912 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 405 heartbeat osd_stat(store_statfs(0x19cf66000/0x0/0x1bfc00000, data 0x4dbebf5/0x4d78000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 405 handle_osd_map epochs [405,406], i have 405, src has [1,406]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2780c800 session 0x55bb2850a000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082544 data_alloc: 234881024 data_used: 26394624
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19cf30000/0x0/0x1bfc00000, data 0x4df13c3/0x4dac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19cf30000/0x0/0x1bfc00000, data 0x4df13c3/0x4dac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19cf30000/0x0/0x1bfc00000, data 0x4df13c3/0x4dac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469016576 unmapped: 67764224 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.698586464s of 10.684947014s, submitted: 257
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 468967424 unmapped: 67813376 heap: 536780800 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5243277 data_alloc: 234881024 data_used: 26406912
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29b50800 session 0x55bb2751b860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29b52800 session 0x55bb2a067860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2b7b7800 session 0x55bb2a1610e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2e506800 session 0x55bb296165a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a160d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469311488 unmapped: 75341824 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19bd75000/0x0/0x1bfc00000, data 0x5fad3ec/0x5f69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb2780d400 session 0x55bb295b5680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29694c00 session 0x55bb28063a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 heartbeat osd_stat(store_statfs(0x19bd75000/0x0/0x1bfc00000, data 0x5fad425/0x5f69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469311488 unmapped: 75341824 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 ms_handle_reset con 0x55bb29b50800 session 0x55bb295ba960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469139456 unmapped: 75513856 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 406 handle_osd_map epochs [406,407], i have 406, src has [1,407]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 heartbeat osd_stat(store_statfs(0x19bd9a000/0x0/0x1bfc00000, data 0x5f89416/0x5f44000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469155840 unmapped: 75497472 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb29b52800 session 0x55bb2850c5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb2780c800 session 0x55bb294dc000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5209919 data_alloc: 234881024 data_used: 26370048
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb29694c00 session 0x55bb29735860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb29b50800 session 0x55bb295b50e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb2b7b7800 session 0x55bb295b4d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 ms_handle_reset con 0x55bb27903c00 session 0x55bb292d9860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 heartbeat osd_stat(store_statfs(0x19bd71000/0x0/0x1bfc00000, data 0x5faefc7/0x5f6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 75431936 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 handle_osd_map epochs [407,408], i have 407, src has [1,408]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 407 handle_osd_map epochs [408,408], i have 408, src has [1,408]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 408 ms_handle_reset con 0x55bb29694c00 session 0x55bb29e683c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 470220800 unmapped: 74432512 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 408 ms_handle_reset con 0x55bb29b50800 session 0x55bb274734a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.688539028s of 10.181116104s, submitted: 137
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 408 handle_osd_map epochs [408,409], i have 408, src has [1,409]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 408 handle_osd_map epochs [409,409], i have 409, src has [1,409]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb2cefc400 session 0x55bb2850a000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472137728 unmapped: 72515584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5147665 data_alloc: 251658240 data_used: 38809600
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29694400 session 0x55bb2850dc20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29b53800 session 0x55bb27a22780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472145920 unmapped: 72507392 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb27903c00 session 0x55bb2a23e5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 heartbeat osd_stat(store_statfs(0x19d302000/0x0/0x1bfc00000, data 0x47c294b/0x49dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,3])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 67059712 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29694400 session 0x55bb2a161e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 ms_handle_reset con 0x55bb29694c00 session 0x55bb27a16b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473128960 unmapped: 71524352 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473128960 unmapped: 71524352 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5163434 data_alloc: 251658240 data_used: 38809600
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 heartbeat osd_stat(store_statfs(0x19ceb8000/0x0/0x1bfc00000, data 0x4c0c94b/0x4e26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473161728 unmapped: 71491584 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19ceb8000/0x0/0x1bfc00000, data 0x4c0c94b/0x4e26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473169920 unmapped: 71483392 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.688801765s of 10.107757568s, submitted: 117
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 68362240 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5212668 data_alloc: 251658240 data_used: 38866944
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476856320 unmapped: 67796992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c837000/0x0/0x1bfc00000, data 0x52864c2/0x54a1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478584832 unmapped: 66068480 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b50800 session 0x55bb2a098b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27903c00 session 0x55bb294f9e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694400 session 0x55bb274730e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694c00 session 0x55bb294dbe00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb292f21e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c683000/0x0/0x1bfc00000, data 0x54314c2/0x564c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5257436 data_alloc: 251658240 data_used: 39890944
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478601216 unmapped: 66052096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c671000/0x0/0x1bfc00000, data 0x54524c2/0x566d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5252592 data_alloc: 251658240 data_used: 39927808
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cefc400 session 0x55bb29274960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478470144 unmapped: 66183168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478478336 unmapped: 66174976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.228658676s of 13.433518410s, submitted: 62
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694400 session 0x55bb2751af00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c671000/0x0/0x1bfc00000, data 0x54524c2/0x566d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5297052 data_alloc: 251658240 data_used: 45432832
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 66166784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780c800 session 0x55bb2a161a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780d400 session 0x55bb29614b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19c63f000/0x0/0x1bfc00000, data 0x54844c2/0x569f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2b55bc00 session 0x55bb2850ba40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d8f0000/0x0/0x1bfc00000, data 0x3b72460/0x3d8c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4972366 data_alloc: 234881024 data_used: 27074560
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477274112 unmapped: 67379200 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d8f0000/0x0/0x1bfc00000, data 0x3b72460/0x3d8c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478314496 unmapped: 66338816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.538283348s of 10.021756172s, submitted: 58
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dbc1000/0x0/0x1bfc00000, data 0x3efb460/0x4115000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478748672 unmapped: 65904640 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dbc1000/0x0/0x1bfc00000, data 0x3efb460/0x4115000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479059968 unmapped: 65593344 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5028850 data_alloc: 234881024 data_used: 27836416
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 64315392 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480772096 unmapped: 63881216 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480772096 unmapped: 63881216 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d7a9000/0x0/0x1bfc00000, data 0x431b460/0x4535000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480772096 unmapped: 63881216 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5037736 data_alloc: 234881024 data_used: 28106752
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d769000/0x0/0x1bfc00000, data 0x435b460/0x4575000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5038588 data_alloc: 234881024 data_used: 28106752
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.746368408s of 12.203164101s, submitted: 87
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480641024 unmapped: 64012288 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cefc400 session 0x55bb2968e960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27903c00 session 0x55bb27a141e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cefc400 session 0x55bb29351e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d74b000/0x0/0x1bfc00000, data 0x4379460/0x4593000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5039636 data_alloc: 234881024 data_used: 28098560
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480649216 unmapped: 64004096 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480657408 unmapped: 63995904 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480657408 unmapped: 63995904 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480657408 unmapped: 63995904 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19d742000/0x0/0x1bfc00000, data 0x4382460/0x459c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29694c00 session 0x55bb29351680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb280623c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780c800 session 0x55bb27a223c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4981722 data_alloc: 234881024 data_used: 26042368
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480681984 unmapped: 63971328 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4981722 data_alloc: 234881024 data_used: 26042368
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480690176 unmapped: 63963136 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb36b66800 session 0x55bb282ca3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4981722 data_alloc: 234881024 data_used: 26042368
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb28035400 session 0x55bb29e69680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b52400 session 0x55bb29576f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dad7000/0x0/0x1bfc00000, data 0x3e2e460/0x4048000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480698368 unmapped: 63954944 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 22.523767471s of 23.023344040s, submitted: 47
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb2850ba40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b7b400 session 0x55bb2a23e960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27675c00 session 0x55bb27472000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480632832 unmapped: 64020480 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2b55bc00 session 0x55bb2850c000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2cd67000 session 0x55bb29537680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27675c00 session 0x55bb294f94a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb27a1be00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b7b400 session 0x55bb282cb860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780d400 session 0x55bb29536780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477405184 unmapped: 67248128 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4903504 data_alloc: 234881024 data_used: 20979712
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1cb000/0x0/0x1bfc00000, data 0x38f73db/0x3b0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1cb000/0x0/0x1bfc00000, data 0x38f73db/0x3b0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4904784 data_alloc: 234881024 data_used: 21102592
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1cb000/0x0/0x1bfc00000, data 0x38f73db/0x3b0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2b55bc00 session 0x55bb29e690e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb27675c00 session 0x55bb295ba3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477282304 unmapped: 67371008 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a067a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.964398384s of 10.504033089s, submitted: 94
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 ms_handle_reset con 0x55bb29b53800 session 0x55bb296af2c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477290496 unmapped: 67362816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477290496 unmapped: 67362816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4911099 data_alloc: 234881024 data_used: 21143552
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1a9000/0x0/0x1bfc00000, data 0x391b40e/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1a9000/0x0/0x1bfc00000, data 0x391b40e/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4966203 data_alloc: 234881024 data_used: 27598848
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19e1a9000/0x0/0x1bfc00000, data 0x391b40e/0x3b35000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477790208 unmapped: 66863104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4966683 data_alloc: 234881024 data_used: 27611136
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.378284454s of 11.749645233s, submitted: 24
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479789056 unmapped: 64864256 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5005681 data_alloc: 234881024 data_used: 28753920
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dcf7000/0x0/0x1bfc00000, data 0x3dcd40e/0x3fe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 heartbeat osd_stat(store_statfs(0x19dcf7000/0x0/0x1bfc00000, data 0x3dcd40e/0x3fe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 410 handle_osd_map epochs [410,411], i have 410, src has [1,411]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb26cb9e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19dcf7000/0x0/0x1bfc00000, data 0x3dcd40e/0x3fe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 nova_compute[221301]: 2026-01-31 09:36:36.506 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29cfd400 session 0x55bb29350f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29304b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb295bb680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb294db0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb27a1a1e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb27a170e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5010257 data_alloc: 234881024 data_used: 28770304
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.470820427s of 10.827239037s, submitted: 55
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb2850c3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb27a170e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb27a1a1e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb294db0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb29304b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19dcf3000/0x0/0x1bfc00000, data 0x3dcf067/0x3fea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d345000/0x0/0x1bfc00000, data 0x477e067/0x4999000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082493 data_alloc: 234881024 data_used: 28770304
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d345000/0x0/0x1bfc00000, data 0x477e067/0x4999000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb26cb9e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2a067a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb295ba3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb29e690e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb29536780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb282cb860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479911936 unmapped: 64741376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480157696 unmapped: 64495616 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480518144 unmapped: 64135168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5175453 data_alloc: 234881024 data_used: 35057664
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d045000/0x0/0x1bfc00000, data 0x4a7e067/0x4c99000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480518144 unmapped: 64135168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480518144 unmapped: 64135168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb29350b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.939766884s of 11.147185326s, submitted: 13
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb29537c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb294dbc20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480534528 unmapped: 64118784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480534528 unmapped: 64118784 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb292e6000 session 0x55bb295bb2c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb296ba5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480821248 unmapped: 63832064 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5199454 data_alloc: 251658240 data_used: 38203392
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d045000/0x0/0x1bfc00000, data 0x4a7e067/0x4c99000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480845824 unmapped: 63807488 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5237254 data_alloc: 251658240 data_used: 38260736
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19cb96000/0x0/0x1bfc00000, data 0x4f2d067/0x5148000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.031694412s of 10.257151604s, submitted: 56
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482091008 unmapped: 62562304 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5237728 data_alloc: 251658240 data_used: 38326272
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 61358080 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19c1e4000/0x0/0x1bfc00000, data 0x58df067/0x5afa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1df1f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 61308928 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5333817 data_alloc: 251658240 data_used: 40177664
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 61300736 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda7000/0x0/0x1bfc00000, data 0x590c067/0x5b27000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.384566307s of 10.649941444s, submitted: 69
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5330865 data_alloc: 251658240 data_used: 40177664
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483450880 unmapped: 61202432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483459072 unmapped: 61194240 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484507648 unmapped: 60145664 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda6000/0x0/0x1bfc00000, data 0x590d067/0x5b28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda6000/0x0/0x1bfc00000, data 0x590d067/0x5b28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484507648 unmapped: 60145664 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb294f94a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb296174a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb292f2d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19bda6000/0x0/0x1bfc00000, data 0x590d067/0x5b28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483516416 unmapped: 61136896 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5155803 data_alloc: 234881024 data_used: 30330880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483516416 unmapped: 61136896 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19cc04000/0x0/0x1bfc00000, data 0x4aaf067/0x4cca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [0,0,2])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2e506400 session 0x55bb27a22b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2850d0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb2968e3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480935936 unmapped: 63717376 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb2780d400 session 0x55bb2968fe00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b53800 session 0x55bb29305a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480952320 unmapped: 63700992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19c7a1000/0x0/0x1bfc00000, data 0x5116067/0x512d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480952320 unmapped: 63700992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.340438843s of 11.199736595s, submitted: 45
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480952320 unmapped: 63700992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb29b7b400 session 0x55bb27a22780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb292e7400 session 0x55bb282caf00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5218242 data_alloc: 234881024 data_used: 30330880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 69935104 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29577680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 69918720 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 69918720 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d21a000/0x0/0x1bfc00000, data 0x469e034/0x46b3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 69910528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 69910528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5087643 data_alloc: 234881024 data_used: 23363584
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 69910528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474808320 unmapped: 69844992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d21a000/0x0/0x1bfc00000, data 0x469e034/0x46b3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb2850a3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474808320 unmapped: 69844992 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb2a066b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474816512 unmapped: 69836800 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.243156910s of 10.041037560s, submitted: 49
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb297a0c00 session 0x55bb293252c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474824704 unmapped: 69828608 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d1ab000/0x0/0x1bfc00000, data 0x470d057/0x4723000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5128304 data_alloc: 234881024 data_used: 27824128
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb26ca2800 session 0x55bb295770e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb27675c00 session 0x55bb296ba000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb279cbc00 session 0x55bb2a067c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474365952 unmapped: 70287360 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 ms_handle_reset con 0x55bb292e7400 session 0x55bb295b45a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 heartbeat osd_stat(store_statfs(0x19d2d0000/0x0/0x1bfc00000, data 0x45e8057/0x45fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 411 handle_osd_map epochs [411,412], i have 411, src has [1,412]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb29b55000 session 0x55bb293241e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474374144 unmapped: 70279168 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5107048 data_alloc: 234881024 data_used: 27828224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 heartbeat osd_stat(store_statfs(0x19d2db000/0x0/0x1bfc00000, data 0x43d7d04/0x45f3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb2cefc400 session 0x55bb292f21e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb29694400 session 0x55bb274730e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb26ca2800 session 0x55bb295bb0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474382336 unmapped: 70270976 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.464142799s of 10.002035141s, submitted: 91
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 heartbeat osd_stat(store_statfs(0x19cfea000/0x0/0x1bfc00000, data 0x46c8d04/0x48e4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e32f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477290496 unmapped: 67362816 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5175422 data_alloc: 234881024 data_used: 28540928
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477298688 unmapped: 67354624 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477306880 unmapped: 67346432 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb27675c00 session 0x55bb29e68d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 ms_handle_reset con 0x55bb279cbc00 session 0x55bb274732c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 412 handle_osd_map epochs [412,413], i have 412, src has [1,413]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476790784 unmapped: 67862528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19e2e0000/0x0/0x1bfc00000, data 0x4411833/0x462d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476790784 unmapped: 67862528 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19e2e0000/0x0/0x1bfc00000, data 0x4411833/0x462d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476798976 unmapped: 67854336 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5051174 data_alloc: 234881024 data_used: 21647360
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476798976 unmapped: 67854336 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb294dc960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29275860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4917660 data_alloc: 218103808 data_used: 16318464
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19f002000/0x0/0x1bfc00000, data 0x36f0810/0x390b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb2a098780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.090511322s of 11.899164200s, submitted: 102
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b2a8800 session 0x55bb282774a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2a066d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb29275a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb27a23a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb27a161e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471883776 unmapped: 72769536 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb27675c00 session 0x55bb294f9860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb2751a5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb292f2f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb292f2960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb295ec960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb29577680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471900160 unmapped: 72753152 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4884964 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4884964 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb2a1605a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471982080 unmapped: 72671232 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940164 data_alloc: 234881024 data_used: 20566016
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940164 data_alloc: 234881024 data_used: 20566016
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19ec41000/0x0/0x1bfc00000, data 0x3367820/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1d2ef9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 72998912 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 21.335626602s of 21.673030853s, submitted: 56
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 472883200 unmapped: 71770112 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473751552 unmapped: 70901760 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473628672 unmapped: 71024640 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5052360 data_alloc: 234881024 data_used: 20586496
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475676672 unmapped: 68976640 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d326000/0x0/0x1bfc00000, data 0x4224820/0x4440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474939392 unmapped: 69713920 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474947584 unmapped: 69705728 heap: 544653312 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2747c3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb2a1610e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb296165a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb292e7400 session 0x55bb282ca5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb26cb9a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2747eb40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb295770e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474193920 unmapped: 74661888 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb2850c5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb27903000 session 0x55bb295b52c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474243072 unmapped: 74612736 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5174273 data_alloc: 234881024 data_used: 22171648
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474243072 unmapped: 74612736 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c4b6000/0x0/0x1bfc00000, data 0x5091891/0x52af000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474243072 unmapped: 74612736 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474251264 unmapped: 74604544 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb29274780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474251264 unmapped: 74604544 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2747d0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.416330338s of 12.273886681s, submitted: 157
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474251264 unmapped: 74604544 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb2850c1e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5174289 data_alloc: 234881024 data_used: 22171648
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb29351a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474185728 unmapped: 74670080 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c4bd000/0x0/0x1bfc00000, data 0x50918c4/0x52b1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476332032 unmapped: 72523776 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478511104 unmapped: 70344704 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478511104 unmapped: 70344704 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb26ca2800 session 0x55bb295372c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb282cb860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478519296 unmapped: 70336512 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026371 data_alloc: 234881024 data_used: 27156480
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb29304b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19dcea000/0x0/0x1bfc00000, data 0x38658b4/0x3a84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026371 data_alloc: 234881024 data_used: 27156480
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19dcea000/0x0/0x1bfc00000, data 0x38658b4/0x3a84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19dcea000/0x0/0x1bfc00000, data 0x38658b4/0x3a84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478527488 unmapped: 70328320 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.068428993s of 12.367558479s, submitted: 35
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481763328 unmapped: 67092480 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d89e000/0x0/0x1bfc00000, data 0x3cb18b4/0x3ed0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e48f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477626368 unmapped: 71229440 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 71221248 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5087449 data_alloc: 234881024 data_used: 27910144
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c46f000/0x0/0x1bfc00000, data 0x3f328b4/0x4151000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092981 data_alloc: 234881024 data_used: 28061696
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479846400 unmapped: 69009408 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479715328 unmapped: 69140480 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c47a000/0x0/0x1bfc00000, data 0x3f358b4/0x4154000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479723520 unmapped: 69132288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c47a000/0x0/0x1bfc00000, data 0x3f358b4/0x4154000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479723520 unmapped: 69132288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479723520 unmapped: 69132288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.518205643s of 12.845505714s, submitted: 98
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb293045a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb36b66000 session 0x55bb29351680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5084297 data_alloc: 234881024 data_used: 28065792
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5833/0x2ce1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29b53800 session 0x55bb2a23f4a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475398144 unmapped: 73457664 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475406336 unmapped: 73449472 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475414528 unmapped: 73441280 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4830431 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d8eb000/0x0/0x1bfc00000, data 0x2ac5810/0x2ce0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475422720 unmapped: 73433088 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.286748886s of 24.817827225s, submitted: 52
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb2747c3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475627520 unmapped: 73228288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb292f2960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb295b45a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859977 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb36b66000 session 0x55bb29325680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2cefc400 session 0x55bb27a22b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475627520 unmapped: 73228288 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475635712 unmapped: 73220096 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475635712 unmapped: 73220096 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475717632 unmapped: 73138176 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d582000/0x0/0x1bfc00000, data 0x2e31810/0x304c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475652096 unmapped: 73203712 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4860642 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478961664 unmapped: 69894144 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e68960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb282ca5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb29735680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb36b66000 session 0x55bb294df860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2d87dc00 session 0x55bb293050e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1fc000/0x0/0x1bfc00000, data 0x31b7810/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4894676 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.479883194s of 12.613527298s, submitted: 30
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475832320 unmapped: 73023488 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1fc000/0x0/0x1bfc00000, data 0x31b7810/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a098b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb29694400 session 0x55bb293512c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1fc000/0x0/0x1bfc00000, data 0x31b7810/0x33d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475996160 unmapped: 72859648 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4899001 data_alloc: 218103808 data_used: 12861440
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 475996160 unmapped: 72859648 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940761 data_alloc: 218103808 data_used: 18632704
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4940761 data_alloc: 218103808 data_used: 18632704
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19d1d8000/0x0/0x1bfc00000, data 0x31db810/0x33f6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477036544 unmapped: 71819264 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.086318970s of 14.066018105s, submitted: 8
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479961088 unmapped: 68894720 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481017856 unmapped: 67837952 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 481083392 unmapped: 67772416 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5056003 data_alloc: 234881024 data_used: 19955712
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c49e000/0x0/0x1bfc00000, data 0x3f14810/0x412f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c49e000/0x0/0x1bfc00000, data 0x3f14810/0x412f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480624640 unmapped: 68231168 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5056179 data_alloc: 234881024 data_used: 19955712
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c480000/0x0/0x1bfc00000, data 0x3f33810/0x414e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 heartbeat osd_stat(store_statfs(0x19c480000/0x0/0x1bfc00000, data 0x3f33810/0x414e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.410203934s of 13.434028625s, submitted: 118
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5056143 data_alloc: 234881024 data_used: 19963904
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480600064 unmapped: 68255744 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2850a5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480608256 unmapped: 68247552 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 413 handle_osd_map epochs [413,414], i have 413, src has [1,414]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb27883400 session 0x55bb29537680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb2b8da000 session 0x55bb2850cd20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb2a142400 session 0x55bb296ba000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486924288 unmapped: 61931520 heap: 548855808 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb2780d400 session 0x55bb296150e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485277696 unmapped: 70844416 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 414 handle_osd_map epochs [414,415], i have 414, src has [1,415]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 414 ms_handle_reset con 0x55bb27883400 session 0x55bb27a22f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 415 heartbeat osd_stat(store_statfs(0x19b45e000/0x0/0x1bfc00000, data 0x4f524db/0x5170000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486375424 unmapped: 69746688 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5223021 data_alloc: 234881024 data_used: 26726400
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 415 handle_osd_map epochs [416,416], i have 416, src has [1,416]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb29694400 session 0x55bb295bb0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 heartbeat osd_stat(store_statfs(0x19b45a000/0x0/0x1bfc00000, data 0x4f54188/0x5173000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486383616 unmapped: 69738496 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb29614b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486391808 unmapped: 69730304 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb293043c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2780d400 session 0x55bb280623c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483475456 unmapped: 72646656 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb27883400 session 0x55bb27473680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1a780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2a142400 session 0x55bb2a067860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a160f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 heartbeat osd_stat(store_statfs(0x19b453000/0x0/0x1bfc00000, data 0x4f59e5f/0x517b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 ms_handle_reset con 0x55bb27883400 session 0x55bb27a1ad20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5284242 data_alloc: 234881024 data_used: 26730496
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 heartbeat osd_stat(store_statfs(0x19ace2000/0x0/0x1bfc00000, data 0x56c9ec1/0x58ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483606528 unmapped: 72515584 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.696437836s of 12.454614639s, submitted: 122
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19ace2000/0x0/0x1bfc00000, data 0x56c9ec1/0x58ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483614720 unmapped: 72507392 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb2747f680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483614720 unmapped: 72507392 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b8da000 session 0x55bb295bbe00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2850c3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb294f9860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acde000/0x0/0x1bfc00000, data 0x56cba00/0x58ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850ab40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb27883400 session 0x55bb293252c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb29614b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483631104 unmapped: 72491008 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5287136 data_alloc: 234881024 data_used: 26738688
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b8da000 session 0x55bb295bb0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483631104 unmapped: 72491008 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a22f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdf000/0x0/0x1bfc00000, data 0x56cba00/0x58ef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb27883400 session 0x55bb296ba000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483639296 unmapped: 72482816 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483762176 unmapped: 72359936 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb32c02800 session 0x55bb2850b860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2a861400 session 0x55bb2a067a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 71737344 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb27902000 session 0x55bb297354a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb28062d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 71737344 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5342782 data_alloc: 234881024 data_used: 33411072
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 71737344 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 70574080 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5406978 data_alloc: 251658240 data_used: 42504192
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.194473267s of 13.459496498s, submitted: 34
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486039552 unmapped: 70082560 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486047744 unmapped: 70074368 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19acdc000/0x0/0x1bfc00000, data 0x56cba43/0x58f2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486047744 unmapped: 70074368 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489259008 unmapped: 66863104 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 66715648 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5483936 data_alloc: 251658240 data_used: 43495424
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489594880 unmapped: 66527232 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 66519040 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489611264 unmapped: 66510848 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19a47a000/0x0/0x1bfc00000, data 0x5f25a43/0x614c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19a47a000/0x0/0x1bfc00000, data 0x5f25a43/0x614c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5507051 data_alloc: 251658240 data_used: 45670400
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.791836739s of 10.737925529s, submitted: 109
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19a47a000/0x0/0x1bfc00000, data 0x5f25a43/0x614c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 66191360 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb293050e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb295ecf00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489938944 unmapped: 66183168 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489963520 unmapped: 66158592 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb32c02800 session 0x55bb27a23a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5321874 data_alloc: 251658240 data_used: 38035456
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19b44d000/0x0/0x1bfc00000, data 0x4f5b9ae/0x517f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19b44d000/0x0/0x1bfc00000, data 0x4f5b9ae/0x517f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 66142208 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2b55dc00 session 0x55bb2a067c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb36b66000 session 0x55bb27a22960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489988096 unmapped: 66134016 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19b44d000/0x0/0x1bfc00000, data 0x4f5b9ae/0x517f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5193660 data_alloc: 234881024 data_used: 33263616
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19c02a000/0x0/0x1bfc00000, data 0x43809ae/0x45a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5193660 data_alloc: 234881024 data_used: 33263616
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19c02a000/0x0/0x1bfc00000, data 0x43809ae/0x45a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 heartbeat osd_stat(store_statfs(0x19c02a000/0x0/0x1bfc00000, data 0x43809ae/0x45a4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5194940 data_alloc: 234881024 data_used: 33316864
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 66117632 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 67K writes, 269K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.04 MB/s#012Cumulative WAL: 67K writes, 24K syncs, 2.73 writes per sync, written: 0.26 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6455 writes, 25K keys, 6455 commit groups, 1.0 writes per commit group, ingest: 27.90 MB, 0.05 MB/s#012Interval WAL: 6456 writes, 2525 syncs, 2.56 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 ms_handle_reset con 0x55bb29694400 session 0x55bb295b4f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.139095306s of 20.152223587s, submitted: 103
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 417 handle_osd_map epochs [418,418], i have 418, src has [1,418]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb282761e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb2a143000 session 0x55bb2850dc20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb32c02800 session 0x55bb274723c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490045440 unmapped: 66076672 heap: 556122112 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 418 ms_handle_reset con 0x55bb2780d400 session 0x55bb294de960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497688576 unmapped: 71041024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 419 ms_handle_reset con 0x55bb29694400 session 0x55bb29e68f00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497737728 unmapped: 70991872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497819648 unmapped: 70909952 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5467556 data_alloc: 251658240 data_used: 38674432
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 420 heartbeat osd_stat(store_statfs(0x19a05e000/0x0/0x1bfc00000, data 0x63492b4/0x656f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 497827840 unmapped: 70901760 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494247936 unmapped: 74481664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb27a221e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494247936 unmapped: 74481664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb27883400 session 0x55bb27a141e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb2a861400 session 0x55bb296165a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494272512 unmapped: 74457088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 421 ms_handle_reset con 0x55bb2780d400 session 0x55bb2850c000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494272512 unmapped: 74457088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5236952 data_alloc: 251658240 data_used: 38674432
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 421 heartbeat osd_stat(store_statfs(0x19c01f000/0x0/0x1bfc00000, data 0x4387be2/0x45af000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494272512 unmapped: 74457088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.853560448s of 10.686148643s, submitted: 111
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 495321088 unmapped: 73408512 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 495321088 unmapped: 73408512 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494297088 unmapped: 74432512 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb282ca780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb32c02800 session 0x55bb2a066b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb36b66000 session 0x55bb294daf00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2780d400 session 0x55bb296bbc20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb29694400 session 0x55bb2a067a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2a861400 session 0x55bb294db4a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482951168 unmapped: 85778432 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5103466 data_alloc: 234881024 data_used: 21983232
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb29e5b000 session 0x55bb294f94a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb36b67000 session 0x55bb295ec960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482951168 unmapped: 85778432 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 heartbeat osd_stat(store_statfs(0x19c53c000/0x0/0x1bfc00000, data 0x3e6a316/0x4091000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 ms_handle_reset con 0x55bb2780d400 session 0x55bb295ba3c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5007400 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471785472 unmapped: 96944128 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 424 heartbeat osd_stat(store_statfs(0x19cde3000/0x0/0x1bfc00000, data 0x35c0e71/0x37e9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.565454483s of 10.000356674s, submitted: 123
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 97075200 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 97075200 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb294f9a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 97075200 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471490560 unmapped: 97239040 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5013770 data_alloc: 218103808 data_used: 13463552
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cde1000/0x0/0x1bfc00000, data 0x35c29b0/0x37ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5089290 data_alloc: 234881024 data_used: 24158208
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cde1000/0x0/0x1bfc00000, data 0x35c29b0/0x37ec000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1f62f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 471949312 unmapped: 96780288 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5089290 data_alloc: 234881024 data_used: 24158208
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.364485741s of 13.372555733s, submitted: 14
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 473571328 unmapped: 95158272 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c49c000/0x0/0x1bfc00000, data 0x3af29b0/0x3d1c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 474390528 unmapped: 94339072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c40e000/0x0/0x1bfc00000, data 0x3b789b0/0x3da2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5152986 data_alloc: 234881024 data_used: 25112576
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476151808 unmapped: 92577792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c40e000/0x0/0x1bfc00000, data 0x3b789b0/0x3da2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5145886 data_alloc: 234881024 data_used: 25112576
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476282880 unmapped: 92446720 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c3fb000/0x0/0x1bfc00000, data 0x3b999b0/0x3dc3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5146206 data_alloc: 234881024 data_used: 25120768
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.126967430s of 15.535551071s, submitted: 75
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2850c780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb2a23fe00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2ccf1000 session 0x55bb297343c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476291072 unmapped: 92438528 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a067c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1af00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb27473c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb295eda40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb26ca2800 session 0x55bb29576000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e69860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdc1000/0x0/0x1bfc00000, data 0x41d29c0/0x43fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5201202 data_alloc: 234881024 data_used: 25120768
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbe000/0x0/0x1bfc00000, data 0x41d59c0/0x4400000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb292f21e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2751b680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb29536780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5202004 data_alloc: 234881024 data_used: 25120768
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29cfc000 session 0x55bb294dc1e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 476602368 unmapped: 92127232 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 477601792 unmapped: 91127808 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5241699 data_alloc: 234881024 data_used: 29421568
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478486528 unmapped: 90243072 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.085439682s of 17.923780441s, submitted: 24
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdbd000/0x0/0x1bfc00000, data 0x41d59e3/0x4401000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5242859 data_alloc: 234881024 data_used: 29425664
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bdb7000/0x0/0x1bfc00000, data 0x41db9e3/0x4407000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 478494720 unmapped: 90234880 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 479543296 unmapped: 89186304 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 480960512 unmapped: 87769088 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5289261 data_alloc: 234881024 data_used: 29421568
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19b77c000/0x0/0x1bfc00000, data 0x48169e3/0x4a42000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482779136 unmapped: 85950464 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482893824 unmapped: 85835776 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 482820096 unmapped: 85909504 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 5.414937973s of 10.002117157s, submitted: 100
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5304011 data_alloc: 234881024 data_used: 29417472
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19b5be000/0x0/0x1bfc00000, data 0x49cc9e3/0x4bf8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19b5c3000/0x0/0x1bfc00000, data 0x49cf9e3/0x4bfb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb27a17860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb292743c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5144857 data_alloc: 234881024 data_used: 22913024
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c3e2000/0x0/0x1bfc00000, data 0x3bb19d3/0x3ddc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb27a230e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 6.416720390s of 11.380039215s, submitted: 49
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5144084 data_alloc: 234881024 data_used: 22908928
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483139584 unmapped: 85590016 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29274b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2a861400 session 0x55bb294df2c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483155968 unmapped: 85573632 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb294dc5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483205120 unmapped: 85524480 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,2])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483262464 unmapped: 85467136 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483262464 unmapped: 85467136 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934149 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483278848 unmapped: 85450752 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483287040 unmapped: 85442560 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483295232 unmapped: 85434368 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483303424 unmapped: 85426176 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483311616 unmapped: 85417984 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483311616 unmapped: 85417984 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483319808 unmapped: 85409792 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483328000 unmapped: 85401600 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483328000 unmapped: 85401600 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483328000 unmapped: 85401600 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483336192 unmapped: 85393408 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483344384 unmapped: 85385216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483352576 unmapped: 85377024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483360768 unmapped: 85368832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483368960 unmapped: 85360640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483377152 unmapped: 85352448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483377152 unmapped: 85352448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483385344 unmapped: 85344256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483393536 unmapped: 85336064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483401728 unmapped: 85327872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483409920 unmapped: 85319680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4934077 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483418112 unmapped: 85311488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4ba000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb27a23c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb296ba5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb282770e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32c02800 session 0x55bb29274b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 105.291740417s of 109.832557678s, submitted: 273
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483426304 unmapped: 85303296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb282ca000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb2a161860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb294f92c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb296ae960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2d87d000 session 0x55bb282cb860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5024620 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483926016 unmapped: 84803584 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483926016 unmapped: 84803584 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483934208 unmapped: 84795392 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 483999744 unmapped: 84729856 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5024620 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a1601e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca33000/0x0/0x1bfc00000, data 0x3560a12/0x378b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb2968fa40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484007936 unmapped: 84721664 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2968fe00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb2a23ef00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484368384 unmapped: 84361216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484212736 unmapped: 84516864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104070 data_alloc: 234881024 data_used: 23773184
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104070 data_alloc: 234881024 data_used: 23773184
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 484384768 unmapped: 84344832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19ca09000/0x0/0x1bfc00000, data 0x358aa12/0x37b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.882793427s of 20.041955948s, submitted: 38
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486875136 unmapped: 81854464 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5205218 data_alloc: 234881024 data_used: 25612288
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491757568 unmapped: 76972032 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bfa5000/0x0/0x1bfc00000, data 0x3fe8a12/0x4213000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5202962 data_alloc: 234881024 data_used: 25657344
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491536384 unmapped: 77193216 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bfa5000/0x0/0x1bfc00000, data 0x3fe8a12/0x4213000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5201846 data_alloc: 234881024 data_used: 25669632
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491544576 unmapped: 77185024 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19bfa8000/0x0/0x1bfc00000, data 0x3feba12/0x4216000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491552768 unmapped: 77176832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.885244370s of 13.422520638s, submitted: 137
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb3dc9cc00 session 0x55bb27a16d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32709000 session 0x55bb2850c000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491552768 unmapped: 77176832 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb295763c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486440960 unmapped: 82288640 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486449152 unmapped: 82280448 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486457344 unmapped: 82272256 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4945970 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486465536 unmapped: 82264064 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 41.659423828s of 42.354778290s, submitted: 29
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb295baf00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d0b9000/0x0/0x1bfc00000, data 0x2ada9b0/0x2d04000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4947574 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19d4b9000/0x0/0x1bfc00000, data 0x2adaa12/0x2d05000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486473728 unmapped: 82255872 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2751b4a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2b7b7000 session 0x55bb280630e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb2a23e5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb280621e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb295ed4a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb32709000 session 0x55bb294df2c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb298fcc00 session 0x55bb294dc5a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb2780d400 session 0x55bb294df860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1be00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5010242 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486481920 unmapped: 82247680 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486490112 unmapped: 82239488 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5010242 data_alloc: 218103808 data_used: 12922880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486498304 unmapped: 82231296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486506496 unmapped: 82223104 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486506496 unmapped: 82223104 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29617c20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486506496 unmapped: 82223104 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486301696 unmapped: 82427904 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5050882 data_alloc: 218103808 data_used: 18731008
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5050882 data_alloc: 218103808 data_used: 18731008
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486260736 unmapped: 82468864 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19cdca000/0x0/0x1bfc00000, data 0x31c9a12/0x33f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.477214813s of 24.551872253s, submitted: 19
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 486932480 unmapped: 81797120 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5122440 data_alloc: 218103808 data_used: 18755584
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490283008 unmapped: 78446592 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490610688 unmapped: 78118912 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c38c000/0x0/0x1bfc00000, data 0x3c01a12/0x3e2c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490610688 unmapped: 78118912 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2dd000/0x0/0x1bfc00000, data 0x3caea12/0x3ed9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5151356 data_alloc: 234881024 data_used: 19578880
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2dd000/0x0/0x1bfc00000, data 0x3caea12/0x3ed9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490790912 unmapped: 77938688 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5147488 data_alloc: 234881024 data_used: 19587072
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2c1000/0x0/0x1bfc00000, data 0x3cd2a12/0x3efd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490725376 unmapped: 78004224 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.734907150s of 13.229278564s, submitted: 108
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490708992 unmapped: 78020608 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 ms_handle_reset con 0x55bb29cfc400 session 0x55bb2a066960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490708992 unmapped: 78020608 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 heartbeat osd_stat(store_statfs(0x19c2aa000/0x0/0x1bfc00000, data 0x3ce8a12/0x3f13000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5154359 data_alloc: 234881024 data_used: 19595264
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490717184 unmapped: 78012416 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 425 handle_osd_map epochs [426,426], i have 426, src has [1,426]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 426 ms_handle_reset con 0x55bb30ef5800 session 0x55bb292f21e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490594304 unmapped: 78135296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490594304 unmapped: 78135296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 426 ms_handle_reset con 0x55bb2780d400 session 0x55bb293241e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490594304 unmapped: 78135296 heap: 568729600 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 426 heartbeat osd_stat(store_statfs(0x19c2a6000/0x0/0x1bfc00000, data 0x3cea6cd/0x3f17000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb29694400 session 0x55bb2a161a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb29e5b000 session 0x55bb27a23a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb29cfc400 session 0x55bb2747fc20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 509476864 unmapped: 63234048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 427 ms_handle_reset con 0x55bb2978a400 session 0x55bb27a22960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5438017 data_alloc: 234881024 data_used: 31862784
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498819072 unmapped: 73891840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 427 heartbeat osd_stat(store_statfs(0x19a8c1000/0x0/0x1bfc00000, data 0x5895388/0x58fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 428 ms_handle_reset con 0x55bb2780d400 session 0x55bb294de1e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498843648 unmapped: 73867264 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29694400 session 0x55bb2850a780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498860032 unmapped: 73850880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 heartbeat osd_stat(store_statfs(0x19a8b9000/0x0/0x1bfc00000, data 0x56d1caa/0x5903000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.446578026s of 10.657642365s, submitted: 129
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29cfc400 session 0x55bb27472000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29734b40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb3dc9e000 session 0x55bb293054a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 493346816 unmapped: 79364096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb29694400 session 0x55bb2850ad20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 ms_handle_reset con 0x55bb2780d400 session 0x55bb29305680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 493346816 unmapped: 79364096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5424483 data_alloc: 234881024 data_used: 31879168
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 493346816 unmapped: 79364096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494403584 unmapped: 78307328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494403584 unmapped: 78307328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494403584 unmapped: 78307328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb29617a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2751ab40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb32709400 session 0x55bb29537680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb29617860
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb2a0670e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29694400 session 0x55bb27a1ab40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb292f21e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb27873800 session 0x55bb27a1be00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb294df2c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5428305 data_alloc: 234881024 data_used: 31887360
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494411776 unmapped: 78299136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494419968 unmapped: 78290944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494419968 unmapped: 78290944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5428305 data_alloc: 234881024 data_used: 31887360
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29694400 session 0x55bb2751b4a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb295baf00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d385b/0x5908000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb295763c0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.790133476s of 13.918015480s, submitted: 28
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5a400 session 0x55bb2a23ef00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494428160 unmapped: 78282752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5501254 data_alloc: 251658240 data_used: 42024960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 494452736 unmapped: 78258176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b5000/0x0/0x1bfc00000, data 0x56d387e/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496705536 unmapped: 76005376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5534806 data_alloc: 251658240 data_used: 46772224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a8b4000/0x0/0x1bfc00000, data 0x56d387e/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5535606 data_alloc: 251658240 data_used: 46792704
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 496713728 unmapped: 75997184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.097463608s of 13.780971527s, submitted: 6
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 498991104 unmapped: 73719808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499466240 unmapped: 73244672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499466240 unmapped: 73244672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499466240 unmapped: 73244672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5590520 data_alloc: 251658240 data_used: 51654656
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499474432 unmapped: 73236480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499474432 unmapped: 73236480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5591224 data_alloc: 251658240 data_used: 51654656
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499572736 unmapped: 73138176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.348139763s of 11.720171928s, submitted: 7
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a638000/0x0/0x1bfc00000, data 0x595087e/0x5b86000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5590872 data_alloc: 251658240 data_used: 51654656
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 499654656 unmapped: 73056256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504668160 unmapped: 68042752 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504676352 unmapped: 68034560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5614032 data_alloc: 251658240 data_used: 53387264
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5614032 data_alloc: 251658240 data_used: 53387264
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.533994675s of 15.573607445s, submitted: 7
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 502423552 unmapped: 70287360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a40c000/0x0/0x1bfc00000, data 0x5b7c87e/0x5db2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5638706 data_alloc: 251658240 data_used: 54968320
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 68247552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29cfc400 session 0x55bb293050e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504463360 unmapped: 68247552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a20a000/0x0/0x1bfc00000, data 0x5d7da7e/0x5fb4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb2968fa40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29694400 session 0x55bb29304780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5595832 data_alloc: 251658240 data_used: 54308864
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb29e5b000 session 0x55bb29e694a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a6b3000/0x0/0x1bfc00000, data 0x58d4a5b/0x5b0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a6b3000/0x0/0x1bfc00000, data 0x58d4a5b/0x5b0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5595136 data_alloc: 251658240 data_used: 54304768
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504471552 unmapped: 68239360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.704790115s of 13.284405708s, submitted: 34
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb32709800 session 0x55bb29577a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 heartbeat osd_stat(store_statfs(0x19a6b4000/0x0/0x1bfc00000, data 0x58d4a5b/0x5b0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 ms_handle_reset con 0x55bb2780d400 session 0x55bb29e68d20
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504487936 unmapped: 68222976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 handle_osd_map epochs [430,431], i have 430, src has [1,431]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 430 handle_osd_map epochs [431,431], i have 431, src has [1,431]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5583026 data_alloc: 251658240 data_used: 54300672
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 431 heartbeat osd_stat(store_statfs(0x19a8b4000/0x0/0x1bfc00000, data 0x56d5496/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 431 heartbeat osd_stat(store_statfs(0x19a8b4000/0x0/0x1bfc00000, data 0x56d5496/0x5909000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5581986 data_alloc: 251658240 data_used: 54300672
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 504496128 unmapped: 68214784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 4.344113350s of 10.144760132s, submitted: 94
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 431 ms_handle_reset con 0x55bb29694400 session 0x55bb280625a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 431 ms_handle_reset con 0x55bb29cfc400 session 0x55bb294dc960
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 431 heartbeat osd_stat(store_statfs(0x19c294000/0x0/0x1bfc00000, data 0x3cf6434/0x3f29000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 431 handle_osd_map epochs [432,432], i have 432, src has [1,432]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 432 ms_handle_reset con 0x55bb29e5b000 session 0x55bb2a23e780
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5244005 data_alloc: 234881024 data_used: 31895552
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 433 heartbeat osd_stat(store_statfs(0x19c28e000/0x0/0x1bfc00000, data 0x3cf9bf6/0x3f2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 500727808 unmapped: 71983104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 433 ms_handle_reset con 0x55bb32709000 session 0x55bb2751b0e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 433 ms_handle_reset con 0x55bb298fc000 session 0x55bb274730e0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 433 heartbeat osd_stat(store_statfs(0x19c28e000/0x0/0x1bfc00000, data 0x3cf9bf6/0x3f2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 433 ms_handle_reset con 0x55bb2780d400 session 0x55bb295ed680
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5012605 data_alloc: 218103808 data_used: 12972032
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 433 handle_osd_map epochs [433,434], i have 433, src has [1,434]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.993594170s of 11.322154045s, submitted: 67
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 ms_handle_reset con 0x55bb29694400 session 0x55bb29577a40
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea735/0x2d20000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491233280 unmapped: 81477632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491233280 unmapped: 81477632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491241472 unmapped: 81469440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491241472 unmapped: 81469440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491249664 unmapped: 81461248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491249664 unmapped: 81461248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491249664 unmapped: 81461248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491257856 unmapped: 81453056 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491266048 unmapped: 81444864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491274240 unmapped: 81436672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491282432 unmapped: 81428480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491290624 unmapped: 81420288 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491298816 unmapped: 81412096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491298816 unmapped: 81412096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491307008 unmapped: 81403904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491307008 unmapped: 81403904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491307008 unmapped: 81403904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491315200 unmapped: 81395712 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491315200 unmapped: 81395712 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491315200 unmapped: 81395712 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491323392 unmapped: 81387520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491331584 unmapped: 81379328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491339776 unmapped: 81371136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config diff' '{prefix=config diff}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config show' '{prefix=config show}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter dump' '{prefix=counter dump}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter schema' '{prefix=counter schema}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490782720 unmapped: 81928192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490848256 unmapped: 81862656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'log dump' '{prefix=log dump}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'perf dump' '{prefix=perf dump}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'perf schema' '{prefix=perf schema}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490741760 unmapped: 81969152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490799104 unmapped: 81911808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490799104 unmapped: 81911808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490799104 unmapped: 81911808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490799104 unmapped: 81911808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490799104 unmapped: 81911808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490799104 unmapped: 81911808 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490807296 unmapped: 81903616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490807296 unmapped: 81903616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490807296 unmapped: 81903616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490807296 unmapped: 81903616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490807296 unmapped: 81903616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490807296 unmapped: 81903616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490807296 unmapped: 81903616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490823680 unmapped: 81887232 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490823680 unmapped: 81887232 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490823680 unmapped: 81887232 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490823680 unmapped: 81887232 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490823680 unmapped: 81887232 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490823680 unmapped: 81887232 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490823680 unmapped: 81887232 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490831872 unmapped: 81879040 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490840064 unmapped: 81870848 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490848256 unmapped: 81862656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490848256 unmapped: 81862656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490848256 unmapped: 81862656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490856448 unmapped: 81854464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490856448 unmapped: 81854464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490856448 unmapped: 81854464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490856448 unmapped: 81854464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490864640 unmapped: 81846272 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490872832 unmapped: 81838080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490881024 unmapped: 81829888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490881024 unmapped: 81829888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490881024 unmapped: 81829888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490881024 unmapped: 81829888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490881024 unmapped: 81829888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490881024 unmapped: 81829888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490889216 unmapped: 81821696 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490889216 unmapped: 81821696 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490897408 unmapped: 81813504 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490905600 unmapped: 81805312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490905600 unmapped: 81805312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490905600 unmapped: 81805312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490905600 unmapped: 81805312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490905600 unmapped: 81805312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490905600 unmapped: 81805312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490913792 unmapped: 81797120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490913792 unmapped: 81797120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490913792 unmapped: 81797120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490913792 unmapped: 81797120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490930176 unmapped: 81780736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490930176 unmapped: 81780736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490930176 unmapped: 81780736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490930176 unmapped: 81780736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490930176 unmapped: 81780736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490938368 unmapped: 81772544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490954752 unmapped: 81756160 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490954752 unmapped: 81756160 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490954752 unmapped: 81756160 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490962944 unmapped: 81747968 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490962944 unmapped: 81747968 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490962944 unmapped: 81747968 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490962944 unmapped: 81747968 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490962944 unmapped: 81747968 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490971136 unmapped: 81739776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490971136 unmapped: 81739776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490971136 unmapped: 81739776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490971136 unmapped: 81739776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490971136 unmapped: 81739776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490987520 unmapped: 81723392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490987520 unmapped: 81723392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490987520 unmapped: 81723392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491003904 unmapped: 81707008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491012096 unmapped: 81698816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491012096 unmapped: 81698816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491012096 unmapped: 81698816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491012096 unmapped: 81698816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111]
                                              ** DB Stats **
                                              Uptime(secs): 7200.1 total, 600.0 interval
                                              Cumulative writes: 70K writes, 278K keys, 70K commit groups, 1.0 writes per commit group, ingest: 0.27 GB, 0.04 MB/s
                                              Cumulative WAL: 70K writes, 25K syncs, 2.72 writes per sync, written: 0.27 GB, 0.04 MB/s
                                              Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                              Interval writes: 2445 writes, 9480 keys, 2445 commit groups, 1.0 writes per commit group, ingest: 9.67 MB, 0.02 MB/s
                                              Interval WAL: 2445 writes, 959 syncs, 2.55 writes per sync, written: 0.01 GB, 0.02 MB/s
                                              Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491012096 unmapped: 81698816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491012096 unmapped: 81698816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491012096 unmapped: 81698816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491020288 unmapped: 81690624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491020288 unmapped: 81690624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491020288 unmapped: 81690624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: mgrc ms_handle_reset ms_handle_reset con 0x55bb2cf42800
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3835187053
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3835187053,v1:192.168.122.100:6801/3835187053]
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: mgrc handle_mgr_configure stats_period=5
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491028480 unmapped: 81682432 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 ms_handle_reset con 0x55bb2780c800 session 0x55bb2850d4a0
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491036672 unmapped: 81674240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491036672 unmapped: 81674240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 ms_handle_reset con 0x55bb27903c00 session 0x55bb292d9e00
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 ms_handle_reset con 0x55bb29694c00 session 0x55bb2a160000
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491036672 unmapped: 81674240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491036672 unmapped: 81674240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491044864 unmapped: 81666048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491061248 unmapped: 81649664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491061248 unmapped: 81649664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491061248 unmapped: 81649664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491061248 unmapped: 81649664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491061248 unmapped: 81649664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491061248 unmapped: 81649664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491069440 unmapped: 81641472 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491069440 unmapped: 81641472 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491077632 unmapped: 81633280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491077632 unmapped: 81633280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491077632 unmapped: 81633280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491094016 unmapped: 81616896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491102208 unmapped: 81608704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491102208 unmapped: 81608704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491102208 unmapped: 81608704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491102208 unmapped: 81608704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491118592 unmapped: 81592320 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491126784 unmapped: 81584128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491126784 unmapped: 81584128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491126784 unmapped: 81584128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491126784 unmapped: 81584128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491126784 unmapped: 81584128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491126784 unmapped: 81584128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491126784 unmapped: 81584128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491134976 unmapped: 81575936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491134976 unmapped: 81575936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491134976 unmapped: 81575936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491134976 unmapped: 81575936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491134976 unmapped: 81575936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491134976 unmapped: 81575936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491134976 unmapped: 81575936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491143168 unmapped: 81567744 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491151360 unmapped: 81559552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491151360 unmapped: 81559552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491151360 unmapped: 81559552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491159552 unmapped: 81551360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491159552 unmapped: 81551360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491159552 unmapped: 81551360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491159552 unmapped: 81551360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491159552 unmapped: 81551360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491159552 unmapped: 81551360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491167744 unmapped: 81543168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491167744 unmapped: 81543168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491167744 unmapped: 81543168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491167744 unmapped: 81543168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491167744 unmapped: 81543168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491184128 unmapped: 81526784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491192320 unmapped: 81518592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491192320 unmapped: 81518592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491192320 unmapped: 81518592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491192320 unmapped: 81518592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491192320 unmapped: 81518592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491192320 unmapped: 81518592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491192320 unmapped: 81518592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015364 data_alloc: 218103808 data_used: 12980224
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491208704 unmapped: 81502208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49e000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 272.052581787s of 272.729125977s, submitted: 14
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491225088 unmapped: 81485824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d49f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fa3f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491216896 unmapped: 81494016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 491233280 unmapped: 81477632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488857600 unmapped: 83853312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488923136 unmapped: 83787776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488955904 unmapped: 83755008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488964096 unmapped: 83746816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488964096 unmapped: 83746816 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488972288 unmapped: 83738624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488972288 unmapped: 83738624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488972288 unmapped: 83738624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488972288 unmapped: 83738624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488972288 unmapped: 83738624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488980480 unmapped: 83730432 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488980480 unmapped: 83730432 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488980480 unmapped: 83730432 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488988672 unmapped: 83722240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488996864 unmapped: 83714048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488996864 unmapped: 83714048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488996864 unmapped: 83714048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488996864 unmapped: 83714048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 488996864 unmapped: 83714048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489005056 unmapped: 83705856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489005056 unmapped: 83705856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489005056 unmapped: 83705856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489005056 unmapped: 83705856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489005056 unmapped: 83705856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489013248 unmapped: 83697664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489013248 unmapped: 83697664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489013248 unmapped: 83697664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489021440 unmapped: 83689472 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489021440 unmapped: 83689472 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489021440 unmapped: 83689472 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489029632 unmapped: 83681280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489029632 unmapped: 83681280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489029632 unmapped: 83681280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489029632 unmapped: 83681280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489029632 unmapped: 83681280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489029632 unmapped: 83681280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489029632 unmapped: 83681280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489037824 unmapped: 83673088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489037824 unmapped: 83673088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489037824 unmapped: 83673088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489037824 unmapped: 83673088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489037824 unmapped: 83673088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489037824 unmapped: 83673088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489046016 unmapped: 83664896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489046016 unmapped: 83664896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489046016 unmapped: 83664896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489046016 unmapped: 83664896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489054208 unmapped: 83656704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489054208 unmapped: 83656704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489054208 unmapped: 83656704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489054208 unmapped: 83656704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489054208 unmapped: 83656704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489062400 unmapped: 83648512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489062400 unmapped: 83648512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489062400 unmapped: 83648512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489062400 unmapped: 83648512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489062400 unmapped: 83648512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489062400 unmapped: 83648512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489062400 unmapped: 83648512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489070592 unmapped: 83640320 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489070592 unmapped: 83640320 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489070592 unmapped: 83640320 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489078784 unmapped: 83632128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489078784 unmapped: 83632128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489078784 unmapped: 83632128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489086976 unmapped: 83623936 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489095168 unmapped: 83615744 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489103360 unmapped: 83607552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489103360 unmapped: 83607552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489103360 unmapped: 83607552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489103360 unmapped: 83607552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489103360 unmapped: 83607552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489103360 unmapped: 83607552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489103360 unmapped: 83607552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489111552 unmapped: 83599360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489111552 unmapped: 83599360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489119744 unmapped: 83591168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489127936 unmapped: 83582976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489127936 unmapped: 83582976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489127936 unmapped: 83582976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489127936 unmapped: 83582976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489127936 unmapped: 83582976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489127936 unmapped: 83582976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489127936 unmapped: 83582976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489136128 unmapped: 83574784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489144320 unmapped: 83566592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489144320 unmapped: 83566592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489144320 unmapped: 83566592 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489152512 unmapped: 83558400 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489152512 unmapped: 83558400 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489152512 unmapped: 83558400 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489152512 unmapped: 83558400 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489152512 unmapped: 83558400 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489160704 unmapped: 83550208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489160704 unmapped: 83550208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489160704 unmapped: 83550208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489177088 unmapped: 83533824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489177088 unmapped: 83533824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489177088 unmapped: 83533824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489177088 unmapped: 83533824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489177088 unmapped: 83533824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489177088 unmapped: 83533824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489177088 unmapped: 83533824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489185280 unmapped: 83525632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489193472 unmapped: 83517440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489193472 unmapped: 83517440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489193472 unmapped: 83517440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489193472 unmapped: 83517440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489201664 unmapped: 83509248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489201664 unmapped: 83509248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489201664 unmapped: 83509248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489201664 unmapped: 83509248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489201664 unmapped: 83509248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489201664 unmapped: 83509248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489218048 unmapped: 83492864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489218048 unmapped: 83492864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489218048 unmapped: 83492864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489226240 unmapped: 83484672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489226240 unmapped: 83484672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489226240 unmapped: 83484672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489226240 unmapped: 83484672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489226240 unmapped: 83484672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489226240 unmapped: 83484672 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489234432 unmapped: 83476480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 83460096 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489259008 unmapped: 83451904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489259008 unmapped: 83451904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489259008 unmapped: 83451904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489259008 unmapped: 83451904 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489275392 unmapped: 83435520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489275392 unmapped: 83435520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489275392 unmapped: 83435520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489275392 unmapped: 83435520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489275392 unmapped: 83435520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489283584 unmapped: 83427328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489283584 unmapped: 83427328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489283584 unmapped: 83427328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489283584 unmapped: 83427328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489283584 unmapped: 83427328 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489291776 unmapped: 83419136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489291776 unmapped: 83419136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489291776 unmapped: 83419136 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489299968 unmapped: 83410944 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489316352 unmapped: 83394560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489316352 unmapped: 83394560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489316352 unmapped: 83394560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489316352 unmapped: 83394560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489316352 unmapped: 83394560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489316352 unmapped: 83394560 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489324544 unmapped: 83386368 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 83378176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 83378176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 83378176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 83378176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 83378176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 83378176 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489340928 unmapped: 83369984 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489340928 unmapped: 83369984 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489349120 unmapped: 83361792 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489349120 unmapped: 83361792 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489357312 unmapped: 83353600 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489357312 unmapped: 83353600 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489357312 unmapped: 83353600 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489357312 unmapped: 83353600 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489357312 unmapped: 83353600 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489357312 unmapped: 83353600 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489365504 unmapped: 83345408 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489381888 unmapped: 83329024 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489390080 unmapped: 83320832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489390080 unmapped: 83320832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489398272 unmapped: 83312640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489398272 unmapped: 83312640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489398272 unmapped: 83312640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489398272 unmapped: 83312640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489398272 unmapped: 83312640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 83304448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 83304448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 83304448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 83304448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 83304448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489406464 unmapped: 83304448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489414656 unmapped: 83296256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489414656 unmapped: 83296256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489422848 unmapped: 83288064 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489422848 unmapped: 83288064 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489422848 unmapped: 83288064 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489431040 unmapped: 83279872 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489431040 unmapped: 83279872 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489431040 unmapped: 83279872 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489431040 unmapped: 83279872 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489431040 unmapped: 83279872 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489439232 unmapped: 83271680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489439232 unmapped: 83271680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489439232 unmapped: 83271680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489447424 unmapped: 83263488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489447424 unmapped: 83263488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489455616 unmapped: 83255296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489455616 unmapped: 83255296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489455616 unmapped: 83255296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489455616 unmapped: 83255296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489472000 unmapped: 83238912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489472000 unmapped: 83238912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489472000 unmapped: 83238912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489472000 unmapped: 83238912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489472000 unmapped: 83238912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489472000 unmapped: 83238912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489480192 unmapped: 83230720 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489488384 unmapped: 83222528 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489496576 unmapped: 83214336 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489512960 unmapped: 83197952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489512960 unmapped: 83197952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489512960 unmapped: 83197952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489512960 unmapped: 83197952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489512960 unmapped: 83197952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489512960 unmapped: 83197952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489512960 unmapped: 83197952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489521152 unmapped: 83189760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489537536 unmapped: 83173376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489537536 unmapped: 83173376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489537536 unmapped: 83173376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489545728 unmapped: 83165184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489545728 unmapped: 83165184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489545728 unmapped: 83165184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489545728 unmapped: 83165184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489545728 unmapped: 83165184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489562112 unmapped: 83148800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489562112 unmapped: 83148800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489562112 unmapped: 83148800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489562112 unmapped: 83148800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489562112 unmapped: 83148800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489570304 unmapped: 83140608 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489570304 unmapped: 83140608 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489578496 unmapped: 83132416 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489578496 unmapped: 83132416 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489578496 unmapped: 83132416 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489586688 unmapped: 83124224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489586688 unmapped: 83124224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489586688 unmapped: 83124224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489586688 unmapped: 83124224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489586688 unmapped: 83124224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489594880 unmapped: 83116032 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 83107840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 83107840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 83107840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 83107840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 83107840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489603072 unmapped: 83107840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489611264 unmapped: 83099648 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489611264 unmapped: 83099648 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489619456 unmapped: 83091456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489619456 unmapped: 83091456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489619456 unmapped: 83091456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489619456 unmapped: 83091456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489619456 unmapped: 83091456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489627648 unmapped: 83083264 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489627648 unmapped: 83083264 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489627648 unmapped: 83083264 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489635840 unmapped: 83075072 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489635840 unmapped: 83075072 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489635840 unmapped: 83075072 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489644032 unmapped: 83066880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489644032 unmapped: 83066880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489644032 unmapped: 83066880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489644032 unmapped: 83066880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489644032 unmapped: 83066880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489652224 unmapped: 83058688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489652224 unmapped: 83058688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489652224 unmapped: 83058688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489660416 unmapped: 83050496 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489660416 unmapped: 83050496 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489660416 unmapped: 83050496 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489660416 unmapped: 83050496 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489660416 unmapped: 83050496 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489676800 unmapped: 83034112 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489676800 unmapped: 83034112 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489676800 unmapped: 83034112 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489676800 unmapped: 83034112 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489676800 unmapped: 83034112 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489676800 unmapped: 83034112 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489684992 unmapped: 83025920 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489684992 unmapped: 83025920 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489693184 unmapped: 83017728 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489693184 unmapped: 83017728 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489693184 unmapped: 83017728 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489701376 unmapped: 83009536 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489701376 unmapped: 83009536 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489701376 unmapped: 83009536 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489701376 unmapped: 83009536 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489701376 unmapped: 83009536 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489717760 unmapped: 82993152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489717760 unmapped: 82993152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489717760 unmapped: 82993152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489717760 unmapped: 82993152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489717760 unmapped: 82993152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489717760 unmapped: 82993152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489717760 unmapped: 82993152 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489725952 unmapped: 82984960 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489725952 unmapped: 82984960 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489725952 unmapped: 82984960 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489725952 unmapped: 82984960 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489734144 unmapped: 82976768 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489734144 unmapped: 82976768 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489734144 unmapped: 82976768 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489734144 unmapped: 82976768 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489734144 unmapped: 82976768 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489742336 unmapped: 82968576 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489742336 unmapped: 82968576 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489750528 unmapped: 82960384 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489750528 unmapped: 82960384 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489750528 unmapped: 82960384 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489750528 unmapped: 82960384 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489750528 unmapped: 82960384 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489750528 unmapped: 82960384 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489758720 unmapped: 82952192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489758720 unmapped: 82952192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489758720 unmapped: 82952192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489758720 unmapped: 82952192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489758720 unmapped: 82952192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489758720 unmapped: 82952192 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489766912 unmapped: 82944000 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489783296 unmapped: 82927616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489783296 unmapped: 82927616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489783296 unmapped: 82927616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489783296 unmapped: 82927616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489783296 unmapped: 82927616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489783296 unmapped: 82927616 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489791488 unmapped: 82919424 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489791488 unmapped: 82919424 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489791488 unmapped: 82919424 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489791488 unmapped: 82919424 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489791488 unmapped: 82919424 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489791488 unmapped: 82919424 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489807872 unmapped: 82903040 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489807872 unmapped: 82903040 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489807872 unmapped: 82903040 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489807872 unmapped: 82903040 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489807872 unmapped: 82903040 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489824256 unmapped: 82886656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489824256 unmapped: 82886656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489824256 unmapped: 82886656 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489832448 unmapped: 82878464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489832448 unmapped: 82878464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489832448 unmapped: 82878464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489832448 unmapped: 82878464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489832448 unmapped: 82878464 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489840640 unmapped: 82870272 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489848832 unmapped: 82862080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489848832 unmapped: 82862080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489848832 unmapped: 82862080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489848832 unmapped: 82862080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489848832 unmapped: 82862080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489848832 unmapped: 82862080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489848832 unmapped: 82862080 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489857024 unmapped: 82853888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489857024 unmapped: 82853888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489857024 unmapped: 82853888 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489865216 unmapped: 82845696 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489865216 unmapped: 82845696 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489865216 unmapped: 82845696 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489873408 unmapped: 82837504 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489873408 unmapped: 82837504 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489873408 unmapped: 82837504 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489881600 unmapped: 82829312 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489889792 unmapped: 82821120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489889792 unmapped: 82821120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489889792 unmapped: 82821120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489889792 unmapped: 82821120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489889792 unmapped: 82821120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489889792 unmapped: 82821120 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489897984 unmapped: 82812928 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489897984 unmapped: 82812928 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489906176 unmapped: 82804736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489906176 unmapped: 82804736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489906176 unmapped: 82804736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489906176 unmapped: 82804736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489906176 unmapped: 82804736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489906176 unmapped: 82804736 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489914368 unmapped: 82796544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489914368 unmapped: 82796544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489914368 unmapped: 82796544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489914368 unmapped: 82796544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489914368 unmapped: 82796544 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489922560 unmapped: 82788352 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489922560 unmapped: 82788352 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489922560 unmapped: 82788352 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489930752 unmapped: 82780160 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489947136 unmapped: 82763776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489947136 unmapped: 82763776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489947136 unmapped: 82763776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489947136 unmapped: 82763776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489947136 unmapped: 82763776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489947136 unmapped: 82763776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489947136 unmapped: 82763776 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489955328 unmapped: 82755584 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489955328 unmapped: 82755584 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489963520 unmapped: 82747392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489963520 unmapped: 82747392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489963520 unmapped: 82747392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489963520 unmapped: 82747392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489963520 unmapped: 82747392 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489971712 unmapped: 82739200 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489971712 unmapped: 82739200 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489971712 unmapped: 82739200 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 82731008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 82731008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 82731008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 82731008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489979904 unmapped: 82731008 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489996288 unmapped: 82714624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489996288 unmapped: 82714624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489996288 unmapped: 82714624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489996288 unmapped: 82714624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489996288 unmapped: 82714624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 489996288 unmapped: 82714624 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 82706432 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 82706432 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490012672 unmapped: 82698240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490012672 unmapped: 82698240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490012672 unmapped: 82698240 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490020864 unmapped: 82690048 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490029056 unmapped: 82681856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490029056 unmapped: 82681856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490029056 unmapped: 82681856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490029056 unmapped: 82681856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490029056 unmapped: 82681856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490029056 unmapped: 82681856 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490037248 unmapped: 82673664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490037248 unmapped: 82673664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490037248 unmapped: 82673664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7800.1 total, 600.0 interval
Cumulative writes: 70K writes, 279K keys, 70K commit groups, 1.0 writes per commit group, ingest: 0.27 GB, 0.04 MB/s
Cumulative WAL: 70K writes, 26K syncs, 2.71 writes per sync, written: 0.27 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 599 writes, 911 keys, 599 commit groups, 1.0 writes per commit group, ingest: 0.29 MB, 0.00 MB/s
Interval WAL: 599 writes, 299 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490037248 unmapped: 82673664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490037248 unmapped: 82673664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490037248 unmapped: 82673664 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490045440 unmapped: 82665472 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490053632 unmapped: 82657280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490053632 unmapped: 82657280 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490061824 unmapped: 82649088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490061824 unmapped: 82649088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490061824 unmapped: 82649088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490061824 unmapped: 82649088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490061824 unmapped: 82649088 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490070016 unmapped: 82640896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490070016 unmapped: 82640896 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490078208 unmapped: 82632704 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490086400 unmapped: 82624512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490086400 unmapped: 82624512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490086400 unmapped: 82624512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490086400 unmapped: 82624512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490086400 unmapped: 82624512 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490094592 unmapped: 82616320 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490094592 unmapped: 82616320 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490094592 unmapped: 82616320 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490102784 unmapped: 82608128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490102784 unmapped: 82608128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490102784 unmapped: 82608128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490102784 unmapped: 82608128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490102784 unmapped: 82608128 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490119168 unmapped: 82591744 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490119168 unmapped: 82591744 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490119168 unmapped: 82591744 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490127360 unmapped: 82583552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490127360 unmapped: 82583552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490127360 unmapped: 82583552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490127360 unmapped: 82583552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490127360 unmapped: 82583552 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490135552 unmapped: 82575360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490135552 unmapped: 82575360 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490143744 unmapped: 82567168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490143744 unmapped: 82567168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490143744 unmapped: 82567168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490143744 unmapped: 82567168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490143744 unmapped: 82567168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490143744 unmapped: 82567168 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490151936 unmapped: 82558976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490151936 unmapped: 82558976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490151936 unmapped: 82558976 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490160128 unmapped: 82550784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490160128 unmapped: 82550784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490160128 unmapped: 82550784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490160128 unmapped: 82550784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490160128 unmapped: 82550784 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490184704 unmapped: 82526208 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490192896 unmapped: 82518016 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490201088 unmapped: 82509824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490201088 unmapped: 82509824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490201088 unmapped: 82509824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490201088 unmapped: 82509824 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490209280 unmapped: 82501632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490209280 unmapped: 82501632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490209280 unmapped: 82501632 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490217472 unmapped: 82493440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490217472 unmapped: 82493440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490217472 unmapped: 82493440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490217472 unmapped: 82493440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490217472 unmapped: 82493440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490217472 unmapped: 82493440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490217472 unmapped: 82493440 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490225664 unmapped: 82485248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490225664 unmapped: 82485248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490225664 unmapped: 82485248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490225664 unmapped: 82485248 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490242048 unmapped: 82468864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490242048 unmapped: 82468864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490242048 unmapped: 82468864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490242048 unmapped: 82468864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490242048 unmapped: 82468864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490242048 unmapped: 82468864 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490258432 unmapped: 82452480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490258432 unmapped: 82452480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490258432 unmapped: 82452480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490258432 unmapped: 82452480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490258432 unmapped: 82452480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 603.160888672s of 604.237854004s, submitted: 300
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490258432 unmapped: 82452480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490258432 unmapped: 82452480 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 490299392 unmapped: 82411520 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485261312 unmapped: 87449600 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485294080 unmapped: 87416832 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485302272 unmapped: 87408640 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485310464 unmapped: 87400448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485310464 unmapped: 87400448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485310464 unmapped: 87400448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485310464 unmapped: 87400448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485310464 unmapped: 87400448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485310464 unmapped: 87400448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485310464 unmapped: 87400448 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485318656 unmapped: 87392256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485318656 unmapped: 87392256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485318656 unmapped: 87392256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485318656 unmapped: 87392256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485318656 unmapped: 87392256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485318656 unmapped: 87392256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485318656 unmapped: 87392256 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485326848 unmapped: 87384064 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485326848 unmapped: 87384064 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485335040 unmapped: 87375872 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485343232 unmapped: 87367680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485343232 unmapped: 87367680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485343232 unmapped: 87367680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485343232 unmapped: 87367680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485343232 unmapped: 87367680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485343232 unmapped: 87367680 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485351424 unmapped: 87359488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485351424 unmapped: 87359488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485351424 unmapped: 87359488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485351424 unmapped: 87359488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485351424 unmapped: 87359488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485351424 unmapped: 87359488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485351424 unmapped: 87359488 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485359616 unmapped: 87351296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485359616 unmapped: 87351296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 31 04:36:36 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/748214068' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485359616 unmapped: 87351296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485359616 unmapped: 87351296 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485367808 unmapped: 87343104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485367808 unmapped: 87343104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485367808 unmapped: 87343104 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485376000 unmapped: 87334912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485376000 unmapped: 87334912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485376000 unmapped: 87334912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485376000 unmapped: 87334912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485376000 unmapped: 87334912 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485384192 unmapped: 87326720 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485384192 unmapped: 87326720 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485384192 unmapped: 87326720 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485384192 unmapped: 87326720 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485384192 unmapped: 87326720 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485384192 unmapped: 87326720 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485400576 unmapped: 87310336 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485400576 unmapped: 87310336 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485400576 unmapped: 87310336 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485408768 unmapped: 87302144 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485408768 unmapped: 87302144 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485408768 unmapped: 87302144 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485408768 unmapped: 87302144 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485416960 unmapped: 87293952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485416960 unmapped: 87293952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485416960 unmapped: 87293952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485416960 unmapped: 87293952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485416960 unmapped: 87293952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485416960 unmapped: 87293952 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485425152 unmapped: 87285760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485425152 unmapped: 87285760 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485433344 unmapped: 87277568 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485433344 unmapped: 87277568 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485433344 unmapped: 87277568 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485433344 unmapped: 87277568 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485433344 unmapped: 87277568 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485433344 unmapped: 87277568 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485441536 unmapped: 87269376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485441536 unmapped: 87269376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485441536 unmapped: 87269376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485441536 unmapped: 87269376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485441536 unmapped: 87269376 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485449728 unmapped: 87261184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485449728 unmapped: 87261184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485449728 unmapped: 87261184 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485457920 unmapped: 87252992 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485457920 unmapped: 87252992 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485466112 unmapped: 87244800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485466112 unmapped: 87244800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485466112 unmapped: 87244800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485466112 unmapped: 87244800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485466112 unmapped: 87244800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485466112 unmapped: 87244800 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485474304 unmapped: 87236608 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485474304 unmapped: 87236608 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485474304 unmapped: 87236608 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485482496 unmapped: 87228416 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485482496 unmapped: 87228416 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485482496 unmapped: 87228416 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485482496 unmapped: 87228416 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485490688 unmapped: 87220224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485490688 unmapped: 87220224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485490688 unmapped: 87220224 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485498880 unmapped: 87212032 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485498880 unmapped: 87212032 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485498880 unmapped: 87212032 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485507072 unmapped: 87203840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485507072 unmapped: 87203840 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485515264 unmapped: 87195648 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485515264 unmapped: 87195648 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485515264 unmapped: 87195648 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485523456 unmapped: 87187456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485523456 unmapped: 87187456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485523456 unmapped: 87187456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485523456 unmapped: 87187456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485523456 unmapped: 87187456 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485531648 unmapped: 87179264 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485531648 unmapped: 87179264 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485539840 unmapped: 87171072 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485539840 unmapped: 87171072 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485548032 unmapped: 87162880 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 87154688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 87154688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 87154688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 87154688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485556224 unmapped: 87154688 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485564416 unmapped: 87146496 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485564416 unmapped: 87146496 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485572608 unmapped: 87138304 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485572608 unmapped: 87138304 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485572608 unmapped: 87138304 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: bluestore.MempoolThread(0x55bb25fb3b60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5015508 data_alloc: 218103808 data_used: 12996608
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485572608 unmapped: 87138304 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config diff' '{prefix=config diff}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485646336 unmapped: 87064576 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config show' '{prefix=config show}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter dump' '{prefix=counter dump}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter schema' '{prefix=counter schema}'
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485769216 unmapped: 86941696 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: osd.1 434 heartbeat osd_stat(store_statfs(0x19d08f000/0x0/0x1bfc00000, data 0x2aea6d3/0x2d1f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1fe4f9c6), peers [0,2] op hist [])
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: prioritycache tune_memory target: 4294967296 mapped: 485171200 unmapped: 87539712 heap: 572710912 old mem: 2845415833 new mem: 2845415833
Jan 31 04:36:36 np0005603622 ceph-osd[79107]: do_command 'log dump' '{prefix=log dump}'
Jan 31 04:36:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 31 04:36:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4102701743' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 31 04:36:37 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 31 04:36:37 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3107859373' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 31 04:36:37 np0005603622 nova_compute[221301]: 2026-01-31 09:36:37.703 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:37 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:37 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:37 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:37.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 31 04:36:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2982246083' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 31 04:36:38 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:38 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:38 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:38.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:38 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 31 04:36:38 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2684643141' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3438091263' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3828757763' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 31 04:36:39 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:39 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:39 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:39.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3436386758' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 31 04:36:39 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3604428023' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 31 04:36:40 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:40 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:40 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:40.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3871227092' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/532877366' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 31 04:36:40 np0005603622 nova_compute[221301]: 2026-01-31 09:36:40.459 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1243671867' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 31 04:36:40 np0005603622 systemd[1]: Starting Hostname Service...
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2918882917' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 31 04:36:40 np0005603622 systemd[1]: Started Hostname Service.
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 31 04:36:40 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1996375111' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3467409445' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3968968582' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 31 04:36:41 np0005603622 nova_compute[221301]: 2026-01-31 09:36:41.509 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4285607117' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3968577894' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 31 04:36:41 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:41 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:41 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:41.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 31 04:36:41 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/821260756' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 31 04:36:42 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:42 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:42 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:42.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:42 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 31 04:36:42 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/271845010' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 31 04:36:42 np0005603622 nova_compute[221301]: 2026-01-31 09:36:42.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:42 np0005603622 nova_compute[221301]: 2026-01-31 09:36:42.707 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3456484059' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #223. Immutable memtables: 0.
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.571816) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:856] [default] [JOB 143] Flushing memtable with next log file: 223
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203571939, "job": 143, "event": "flush_started", "num_memtables": 1, "num_entries": 814, "num_deletes": 250, "total_data_size": 1275399, "memory_usage": 1297472, "flush_reason": "Manual Compaction"}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:885] [default] [JOB 143] Level-0 flush table #224: started
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203579140, "cf_name": "default", "job": 143, "event": "table_file_creation", "file_number": 224, "file_size": 626441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 107290, "largest_seqno": 108098, "table_properties": {"data_size": 622473, "index_size": 1491, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 12178, "raw_average_key_size": 22, "raw_value_size": 613722, "raw_average_value_size": 1136, "num_data_blocks": 63, "num_entries": 540, "num_filter_entries": 540, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769852164, "oldest_key_time": 1769852164, "file_creation_time": 1769852203, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 224, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 143] Flush lasted 7354 microseconds, and 1966 cpu microseconds.
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.579184) [db/flush_job.cc:967] [default] [JOB 143] Level-0 flush table #224: 626441 bytes OK
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.579204) [db/memtable_list.cc:519] [default] Level-0 commit table #224 started
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.580940) [db/memtable_list.cc:722] [default] Level-0 commit table #224: memtable #1 done
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.580969) EVENT_LOG_v1 {"time_micros": 1769852203580959, "job": 143, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.580996) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 143] Try to delete WAL files size 1270699, prev total WAL file size 1270699, number of live WAL files 2.
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000220.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.581507) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373630' seq:72057594037927935, type:22 .. '6D6772737461740034303131' seq:0, type:0; will stop at (end)
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 144] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 143 Base level 0, inputs: [224(611KB)], [222(15MB)]
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203581542, "job": 144, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [224], "files_L6": [222], "score": -1, "input_data_size": 16829833, "oldest_snapshot_seqno": -1}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 144] Generated table #225: 12668 keys, 13220538 bytes, temperature: kUnknown
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203655594, "cf_name": "default", "job": 144, "event": "table_file_creation", "file_number": 225, "file_size": 13220538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13143928, "index_size": 43857, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31685, "raw_key_size": 336731, "raw_average_key_size": 26, "raw_value_size": 12928036, "raw_average_value_size": 1020, "num_data_blocks": 1645, "num_entries": 12668, "num_filter_entries": 12668, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769844147, "oldest_key_time": 0, "file_creation_time": 1769852203, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cbf6aaa-ac01-4fca-be5c-88ee6f516697", "db_session_id": "06P5A570C25H9SCKA4A2", "orig_file_number": 225, "seqno_to_time_mapping": "N/A"}}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.655905) [db/compaction/compaction_job.cc:1663] [default] [JOB 144] Compacted 1@0 + 1@6 files to L6 => 13220538 bytes
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.657373) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 227.0 rd, 178.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 15.5 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(48.0) write-amplify(21.1) OK, records in: 13162, records dropped: 494 output_compression: NoCompression
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.657395) EVENT_LOG_v1 {"time_micros": 1769852203657385, "job": 144, "event": "compaction_finished", "compaction_time_micros": 74147, "compaction_time_cpu_micros": 23517, "output_level": 6, "num_output_files": 1, "total_output_size": 13220538, "num_input_records": 13162, "num_output_records": 12668, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000224.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203657719, "job": 144, "event": "table_file_deletion", "file_number": 224}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000222.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769852203659511, "job": 144, "event": "table_file_deletion", "file_number": 222}
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.581418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.659549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.659554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.659556) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.659557) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: rocksdb: (Original Log Time 2026/01/31-09:36:43.659558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon).osd e434 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 31 04:36:43 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:43 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 31 04:36:43 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:43.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 31 04:36:43 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1467299085' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:36:44 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:44 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:44 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:44.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3318038354' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 31 04:36:44 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3686495358' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:36:45 np0005603622 nova_compute[221301]: 2026-01-31 09:36:45.395 221324 DEBUG oslo_service.periodic_task [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 31 04:36:45 np0005603622 nova_compute[221301]: 2026-01-31 09:36:45.396 221324 DEBUG nova.compute.manager [None req-a47bc505-5423-44dc-97ce-d3f4b9c66315 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:36:45 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:45 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 31 04:36:45 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.102 - anonymous [31/Jan/2026:09:36:45.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2820288027' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 31 04:36:45 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 31 04:36:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 31 04:36:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 31 04:36:46 np0005603622 radosgw[84338]: ====== starting new request req=0x7f517db166f0 =====
Jan 31 04:36:46 np0005603622 radosgw[84338]: ====== req done req=0x7f517db166f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 31 04:36:46 np0005603622 radosgw[84338]: beast: 0x7f517db166f0: 192.168.122.100 - anonymous [31/Jan/2026:09:36:46.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 31 04:36:46 np0005603622 nova_compute[221301]: 2026-01-31 09:36:46.511 221324 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 31 04:36:46 np0005603622 ceph-mon[81571]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Jan 31 04:36:46 np0005603622 ceph-mon[81571]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1570655767' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
